Sr Big Data Engineer Airflow and Oozie (GCP)

Location: Canada
Company: Rackspace

Job Description

About the Role:

We are seeking a highly skilled and experienced Senior Big Data Engineer to join our dynamic team. The ideal candidate will have a strong background in developing batch processing systems, with extensive experience in Oozie, the Apache Hadoop ecosystem, Airflow, and a solid understanding of public cloud technologies, especially GCP. This role involves working in a remote environment, requiring excellent communication skills and the ability to solve complex problems independently and creatively.
Work Location: Canada (Remote)

Key Responsibilities:

  • Develop scalable and robust code for batch processing systems, working with technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
  • Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem.
  • Leverage GCP for scalable big data processing and storage solutions.
  • Implement automation/DevOps best practices for CI/CD, IaC, etc.
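
The workflow-orchestration duties above center on dependency-ordered batch pipelines. As a conceptual sketch only (not Rackspace's actual stack; the task names are hypothetical), the directed-acyclic-graph model that both Oozie and Airflow build on can be illustrated in plain Python:

```python
# Conceptual sketch of the DAG model behind Oozie and Airflow.
# Task names are hypothetical examples, not a real pipeline.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
batch_pipeline = {
    "extract_raw":    set(),
    "clean_data":     {"extract_raw"},
    "spark_join":     {"clean_data"},
    "load_to_hbase":  {"spark_join"},
    "publish_report": {"spark_join"},
}

def execution_order(dag):
    """Return one valid run order that respects every dependency."""
    return list(TopologicalSorter(dag).static_order())
```

Schedulers like Airflow add retries, scheduling intervals, and operators on top, but the core contract is the same: no task runs before its upstream dependencies complete.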

Qualifications:

  • Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
  • Experience with managed cloud services and understanding of cloud-based batch processing systems are critical.
  • Proficiency in Oozie, Airflow, MapReduce, and Java.
  • Strong programming skills with Java (specifically Spark), Python, Pig, and SQL.
  • Expertise in public cloud services, particularly in GCP.
  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
  • Familiarity with Bigtable and Redis.
  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes.
  • Proven experience in engineering batch processing systems at scale.

Must Have:

  • 5+ years of experience in customer-facing software/technology or consulting.
  • 5+ years of experience with “on-premises to cloud” migrations or IT transformations.
  • 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure.

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology

Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Advice from our career coach

As a Senior Big Data Engineer at Rackspace Technology, the successful candidate should have a deep understanding of batch processing systems, the Apache Hadoop ecosystem, and Airflow, along with proficiency in Google Cloud Platform (GCP). Here are some tips to stand out as an applicant:

  • Highlight your experience developing batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
  • Showcase your expertise in managing and optimizing data workflows with Oozie and Airflow.
  • Demonstrate your proficiency in GCP and your experience with cloud-based batch processing systems.
  • Emphasize your knowledge of automation/DevOps best practices for CI/CD and IaC.
  • Provide examples of how you have applied Infrastructure and Applied DevOps principles in your work, including tools like Terraform.

Apply for this job

Please let Rackspace know you found this job with RemoteJobs.org. This helps us grow!
