Data Engineer II

Location
United States
Base Salary
100k-125k USD
Company
Bestow

Job Description

ABOUT BESTOW

Bestow is a leading insurance technology platform serving some of the world's largest and most innovative life insurers. We are on a mission to increase financial stability for everyone. Bestow is a team of mission-driven, results-oriented individuals. We offer all employees a remote (contiguous 48 states only) or hybrid workplace, meaningful benefits, substantial growth opportunities, and equity.
Bestow participates in the E-Verify Program.

ABOUT THE TEAM

The Bestow Data Engineering team plays a significant role within the organization, working across the entire company to provide scalable data solutions within the platform and to support integrations with external partners. The team works closely with internal analytics team members to improve data architecture and serve data science predictions. Data engineers also work closely with stakeholders and members of product and engineering to design and launch new systems for extracting, transforming, and storing data. You’ll be called upon to improve Bestow’s data reliability, efficiency, and quality, and will be expected to scale your solutions to the cloud environment of a SaaS company, iterate quickly, and make pragmatic choices about which tools and technologies to adopt.

ABOUT THE ROLE

  • Build robust solutions for transferring data between first- and third-party applications and our data warehouse
  • Champion data quality and availability, driving issues to resolution through close collaboration with team members
  • Make decisions as a team: what you build will be maintained and improved by others, so design choices must be defensible
  • Develop hardened, repeatable (CI/CD) data models and pipelines to enable reporting, modeling, and machine learning
  • Improve data availability to our enterprise clients through a mix of traditional push delivery, cloud, and event-driven (e.g., API, gRPC) data-sharing methods
  • Ensure data quality through automated monitoring and alerting, and occasionally serve in an on-call rotation
  • Leverage Google Cloud (GCP) tools (e.g., Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM) and services (e.g., Astronomer-managed Apache Airflow) to bring data workloads to production; a minimal example follows this list
  • Collaborate with product, engineering, data teams, and stakeholders to deliver informed solutions to platform and client needs
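
To make the role concrete, below is a minimal sketch of the kind of batch pipeline described above: an Airflow DAG that loads a partner file drop from Cloud Storage into BigQuery. This is an illustration only; it assumes Airflow 2.4+ with the apache-airflow-providers-google package installed, and the project, bucket, and table names are hypothetical placeholders rather than Bestow's actual infrastructure.

    # Hypothetical batch extract-load DAG: Cloud Storage -> BigQuery.
    # Assumes Airflow 2.4+ and the apache-airflow-providers-google package.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="partner_feed_to_warehouse",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # one run per daily partner drop
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Load the day's partner CSV drop into a raw warehouse table;
        # {{ ds }} is Airflow's templated logical date.
        load_feed = GCSToBigQueryOperator(
            task_id="load_partner_feed",
            bucket="example-partner-drops",  # hypothetical bucket
            source_objects=["feeds/{{ ds }}/orders.csv"],
            destination_project_dataset_table="example-project.raw.partner_orders",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",  # idempotent per-day re-runs
        )

Keeping each load idempotent (here via WRITE_TRUNCATE on a per-day object path) is what makes a pipeline like this safely repeatable under the retry and CI/CD behavior the role calls for.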

THIS ROLE REPORTS TO

  • VP, Data Analytics
  • Open to Dallas, TX, or Remote (US)

YOUR EXPERIENCE

  • 4+ years working in a data engineering role that supports incoming/outgoing feeds as well as analytics and data science teams
  • 2+ years of experience with Python or a similar language, writing efficient, testable, and readable code
  • Deep SQL experience with columnar databases such as Google BigQuery, Snowflake, or Amazon Redshift
  • Comfortable designing an end-to-end data pipeline in cloud frameworks (such as GCP, AWS, or Azure) with requirements from multiple stakeholders
  • Experience building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, and git
  • Able to manage infrastructure using IaC tools such as Terraform or Pulumi
  • Experience with common data orchestration tools such as Apache Airflow (or similar) to manage SLOs and processing dependencies
  • Experience building streaming/real-time ingestion pipelines (see the sketch after this list)
  • Experience creating alerts and monitoring pipelines that contribute to overall data governance
  • Experience with containerization and container orchestration technologies, including cloud architecture and implementation concerns (single- and multi-tenancy, orchestration, elastic scalability)
  • Familiarity with standard IT security practices such as identity and access management (IAM), data protection, encryption, and certificate and key management
  • Adaptability to learn new technologies and products as the job demands
  • Nice to have: experience with data contracts, data lakes, and API development
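
As a rough illustration of the streaming-ingestion item above, here is a minimal sketch that consumes events from a Pub/Sub subscription and appends them to BigQuery using the google-cloud-pubsub and google-cloud-bigquery client libraries. The project, subscription, and table IDs are hypothetical, and a production pipeline would add batching, schema enforcement, dead-lettering, and backlog alerting.

    # Hypothetical streaming ingestion: Pub/Sub -> BigQuery streaming inserts.
    import json

    from google.cloud import bigquery, pubsub_v1

    PROJECT_ID = "example-project"  # hypothetical project
    SUBSCRIPTION_ID = "orders-sub"  # hypothetical subscription
    TABLE_ID = "example-project.raw.orders_stream"

    bq_client = bigquery.Client(project=PROJECT_ID)
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
        """Parse one event and append it to BigQuery, acking only on success."""
        row = json.loads(message.data.decode("utf-8"))
        errors = bq_client.insert_rows_json(TABLE_ID, [row])
        if errors:
            message.nack()  # redeliver; a real pipeline would dead-letter instead
        else:
            message.ack()

    # Blocks until cancelled; a real service would run under a supervisor with
    # monitoring and alerting on subscription backlog and insert errors.
    future = subscriber.subscribe(subscription_path, callback=handle_message)
    with subscriber:
        future.result()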

TOTAL REWARDS

At Bestow, we’re proud to be awarded for our team members, innovative products, and culture. Our standard benefits include:

  • Competitive salary and equity based on role
  • Policies and managers that support work/life balance, like our flexible paid time off and parental leave programs
  • 100% paid-premium option for medical, dental, and vision insurance
  • Lifestyle stipend to support your physical, emotional, and financial wellbeing
  • Flexible remote and work-from-home options, as well as a beautiful, state-of-the-art office in Dallas’ Deep Ellum for those who prefer an office setting
  • Employee-led diversity, equity, and inclusion initiatives

Recent Employer Awards include:

  • Best Place for Working Parents, 2023
  • Great Place to Work Certified, 2022 + 2023 + 2024
  • Built In Best Places to Work, 2022 + 2023
  • Fortune’s Best Workplaces in Texas, 2022 + 2023
  • Fortune’s Best Workplaces in Financial Services and Insurance, 2022 + 2023
We value diversity at Bestow. The company will recruit, hire, and promote regardless of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, pregnancy or maternity, veteran status, or any other status protected by applicable law. We understand the importance of creating a safe and comfortable work environment and encourage individualism and authenticity in every team member.
Thanks for considering a job at Bestow.

Advice from our career coach

As someone who has worked in a data engineering role supporting incoming/outgoing feeds, analytics, and data science teams, I can offer valuable insights on how to stand out as an applicant for the Data Engineering team at Bestow:

  • Highlight your experience with Python or a similar language for writing efficient, testable, and readable code.
  • Showcase your deep SQL experience with columnar databases like Google BigQuery, Snowflake, or Amazon Redshift.
  • Emphasize your ability to design end-to-end data pipelines in cloud frameworks (GCP, AWS, Azure) with input from multiple stakeholders.
  • Demonstrate your experience in building CI/CD pipelines for data processing using tools such as Docker, CircleCI, dbt, and git.
  • Describe your familiarity with common data orchestration tools like Apache Airflow to manage SLOs and processing dependencies.
  • Highlight your experience in building streaming/real-time ingestion pipelines and creating alerts for monitoring data quality.
  • Showcase your knowledge of containerization and container orchestration technologies in cloud architecture.
  • Mention any experience with IT security practices, such as IAM, data protection, encryption, and key management.
  • Highlight your adaptability and willingness to learn new technologies and products as needed for the role.
  • If applicable, mention any experience with data contracts, data lakes, and API development as nice-to-have skills.

Apply for this job

Please let Bestow know you found this job with RemoteJobs.org. This helps us grow!

About the job

Nov 12, 2024

Full-time

100k-125k USD

United States
