Senior Data Engineer

Location: United States
Company: Brado

Job Description

About us:

Brado is a digital marketing agency reinventing the way healthcare brands engage with people. Driven by insight, we offer precision engagement solutions that produce superior returns for our healthcare clients and better experiences for their healthcare customers. 

Our Values:

At Brado, we value the individual. We believe work and life can be synergistic and should not be at odds. The joy and renewal you get from each source must fuel the other. We have cultivated, and will continue to cultivate, a team that celebrates our diversity of thoughts, beliefs, backgrounds, and lifestyles. We are driven by our passion to do great work with great clients that are truly changing lives.

The Role:

The Senior Data Engineer co-owns the data strategy and architects the right data platform to serve business needs. They lead the development of the data pipelines and data products necessary to enable analytics teams to accomplish their goals. They contribute to the vision for developing our modern data infrastructure in Databricks. They work closely with fellow engineers, data scientists, and reporting and measurement specialists to establish best practices for creating systems and data products that the business will use.

Ideal candidates for this role will live in the St. Louis, MO, or Dallas/Ft. Worth, TX, areas. While our day-to-day work is done remotely, our teams gather in person for intentional work.

Key Areas of Responsibility 

  • Drive automation efforts across the data analytics team using Infrastructure as Code (IaC) with Terraform and Microsoft Bicep, configuration management, and Continuous Integration/Continuous Delivery (CI/CD) tools such as Jenkins.
  • Work with internal infrastructure teams on monitoring, security, and configuration of the Azure environment and applications as they relate to data infrastructure and Databricks.
  • Identify data needs for our clients, our marketing team, and our data science team; understand specific requirements for metrics and analysis; and build efficient, scalable data pipelines to deliver data-driven products.
  • Design, develop, and maintain marketing databases, datasets, pipelines, and warehouses to enable advanced segmentation, targeting, automation, and reporting.
  • Facilitate data integration and transformation requirements for moving data between applications, ensuring interoperability of applications with database, data warehouse, and data mart environments.
  • Assist with the design and management of our technology stack used for data storage and processing. 
  • Develop and implement quality controls and departmental standards to ensure quality standards, organizational expectations, and regulatory requirements are met.
  • Contribute to development and education plans for data engineering capabilities, systems, standards, and processes.
  • Anticipate future demands of initiatives related to people, technology, budget, and business within your department, and design and implement solutions to meet these needs.
  • Communicate results and business impacts of insight initiatives to stakeholders within and outside of the company. 

Requirements

  • 5 years of experience with modern data engineering projects and practices (designing, building, and deploying scalable data pipelines), including 3+ years of experience deploying cloud-native solutions.
  • 2+ years of experience using Databricks, lakehouse architecture, and Delta Lake
  • Strong programming skills in Python, Java, or Scala, and their respective standard data processing libraries
  • 3 years of experience building data pipelines for AI/ML models using PySpark or Python
  • Experience building data pipelines with modern tools such as Fivetran, dbt, etc.
  • At least 2 years of experience with Azure, SQL, Python, Docker/Kubernetes, CI/CD, Git
  • Experience with Spark, Kafka, etc. 
  • Experienced in integrating data from core platforms like Marketing Automation, CRM, and Analytics into a centralized warehouse.
  • Strong rigor in software development best practices, including high-quality code development and automated testing
  • Familiarity with Azure services such as Azure Functions, Azure Data Lake Storage, Azure Cosmos DB, Azure Databricks, Azure Data Factory, etc.
  • Master's degree or equivalent experience in Computer Science, Engineering, Statistics, Informatics, Information Systems, or another quantitative field

Benefits

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Paid Time Off (Vacation, Sick & Public Holidays)
  • Family Leave (Maternity, Paternity)
  • Short Term & Long Term Disability
  • Training & Development
  • Work From Home
