Sr Big Data Engineer (GCP) - Airflow and Oozie

Remote, India
Data

Description

About the Role:
We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing and scaling both stream and batch processing systems, extensive experience with Oozie, Airflow, and the Apache Hadoop ecosystem, and a solid understanding of public cloud technologies, especially GCP. This is a remote role, requiring excellent communication skills and the ability to solve complex problems independently and creatively.

What you will be doing:
  • Build reusable, reliable code for stream and batch processing systems at scale, working with technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark.
  • Develop scalable and robust code for batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
  • Develop, manage, and optimize data workflows with Oozie and Airflow within the Apache Hadoop ecosystem (a minimal Airflow sketch follows this list).
  • Leverage GCP for scalable big data processing and storage solutions.
  • Implement automation and DevOps best practices for CI/CD, Infrastructure as Code (IaC), containerization, etc.
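
For a sense of the orchestration work described above, here is a minimal sketch of a daily batch workflow as an Airflow DAG, assuming Airflow 2.x. The DAG id daily_batch_etl, the task names, and the spark-submit command are illustrative placeholders, not details from this posting.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {
        "owner": "data-eng",          # hypothetical owner
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
    }

    # A daily batch workflow: extract -> Spark transform -> load.
    with DAG(
        dag_id="daily_batch_etl",     # illustrative DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        extract = BashOperator(
            task_id="extract",
            bash_command="echo 'pull raw data from source systems'",
        )
        transform = BashOperator(
            task_id="transform",
            # Placeholder for a Spark batch step; on GCP this might instead use
            # the Dataproc submit-job operator from the Google provider package.
            bash_command="spark-submit transform.py",
        )
        load = BashOperator(
            task_id="load",
            bash_command="echo 'load aggregates into the warehouse'",
        )

        extract >> transform >> load  # linear task dependencies
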
Requirements:
  • Experience with GCP managed services and an understanding of cloud-based batch processing systems are critical.
  • Proficiency in Oozie, Airflow, MapReduce, and Java.
  • Strong programming skills in Java (specifically Spark), Python, Pig, and SQL.
  • Expertise in public cloud services, particularly GCP.
  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
  • Familiarity with Bigtable and Redis.
  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
  • Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions.
  • Proven ability to work effectively in a remote setting, with strong written and verbal communication skills, collaborating with team members and stakeholders to ensure a clear understanding of technical requirements and project goals.
  • Proven experience engineering batch processing systems at scale (a minimal PySpark sketch follows this list).
  • Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous.
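
As a rough illustration of batch processing at this level, here is a minimal PySpark sketch that aggregates one day's events from cloud storage, assuming a Spark environment with the GCS connector configured. The bucket gs://example-bucket, the user_id column, and the app name are hypothetical, not taken from the posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical batch job: count events per user for one daily partition.
    spark = SparkSession.builder.appName("daily_event_counts").getOrCreate()

    # Read one day's partition of raw events (Parquet assumed for illustration;
    # gs:// paths require the GCS connector on the cluster).
    events = spark.read.parquet("gs://example-bucket/events/dt=2024-01-01/")

    # Aggregate: events per user. Real jobs would hold the business logic here.
    counts = events.groupBy("user_id").agg(F.count("*").alias("event_count"))

    # Overwrite the day's output partition in cloud storage.
    counts.write.mode("overwrite").parquet(
        "gs://example-bucket/daily_counts/dt=2024-01-01/"
    )

    spark.stop()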


Must Have:
  • Google Associate Cloud Engineer Certification or another Google Cloud Professional-level certification
  • 10+ years of experience in customer-facing software/technology or consulting
  • 5+ years of experience with “on-premises to cloud” migrations or IT transformations
  • 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
  • Technical degree in Computer Science, Software Engineering, or a related field

    Rackspace
