Data Operations Engineer – Research & Scale

The Data Operations Engineer’s primary responsibility is to support all data pipelines and maintain their monitoring infrastructure, data stores, and ETL tasks. You will work on improving the reliability, scalability, and efficiency of data pipelines developed by our data engineering and analytics teams. You have experience working on cloud platforms and hands-on experience with big data analytics. You demonstrate technical, team, and solution leadership by communicating clearly through actionable, data-driven insights.

Responsibilities:

  • Ensure the stability and monitoring of our data pipelines
  • Build end-to-end instrumentation and alerting to detect and flag anomalies in the system or in the data
  • Configure and manage the platform
  • Build a foolproof, secure data access layer with auditing in place
  • Ensure day-to-day execution of data pipelines
  • Explore and integrate new big data technologies and software engineering tools into the current infrastructure
  • Research opportunities for data acquisition and new uses for existing data
  • Collaborate with Data and IT team members to enhance the data platform
  • Monitor data collection, storage, and retrieval processes
  • Develop a good understanding of the business and its various domains
  • Contribute to the open source community

Requirements:

  • Background: degree in Computer Science or a related field
  • Experience: 3 to 5 years
  • DevOps experience with a big data stack and working experience on analytics projects
  • Working experience on Google Cloud Platform is a plus
  • Tools & Technologies: Grafana, Elasticsearch, Fluentd, Kibana, Kafka, Redis, Hadoop, Hive, NoSQL and SQL databases
  • Experience gathering and analyzing system requirements
  • Hands-on experience with SQL and at least one programming language: Python, Java, Scala, or Go
  • In-depth understanding of database structure principles, data warehousing, data mining concepts, and segmentation techniques
  • Experience with cloud computing platforms (AWS, GCP, etc.) and UNIX environments
  • Experience in designing, implementing, and monitoring big data analytics solutions
  • Fast learner with a natural curiosity about big data


If this sounds like the kind of opportunity you’ve been looking for, we’ll need your resume, of course, but more importantly, include a short note giving us a sense of why you think you are absolutely the right person for this job. Write to us to get in touch!

Location: Noida

Apply Now