Data Engineer / Architect

Company: Amivero
Location: United States

Job Description

Job Type: Full-time

Description

The Amivero Team

Amivero’s team of IT professionals delivers digital services that elevate the federal government, whether in national security or improved government services. Our human-centered, data-driven approach is focused on truly understanding the environment and the challenge, and reimagining with our customer how outcomes can be achieved.

Our team of technologists leverages modern, agile methods to design and develop equitable, accessible, and innovative data and software services that impact hundreds of millions of people.

As a member of the Amivero team, you will use your empathy for a customer’s situation, your passion for service, your energy for solutioning, and your bias toward action to bring modernization to mission-critical, public-facing government IT systems.

Special Requirements

  • US Citizenship required to obtain Public Trust
  • Active Department of Homeland Security Public Trust, EOD, OR Top Secret clearance preferred
  • Bachelor’s degree + 2-4 years of relevant experience
  • Python, Airflow, Terraform, RDS, Databricks experience preferred

The Gist...

Our Data Engineer/Architect will work as part of a team responsible for developing enterprise-grade data platforms, services, and pipelines. You will use your passion for data, background in problem solving, and customer engagement to advance the build-out of this enterprise environment.

What Your Day Might Include...

  • Design and implement efficient data platforms to handle structured and unstructured data.
  • Familiarity with DataOps concepts and automation tools such as Terraform, Apache Airflow, CloudBees, CloudFormation templates, and Git actions.
  • Experience setting up data pipelines with inline verification and validation using open-source libraries, preferably Python.
  • Determining effective, analytical data designs for cutting-edge business solutions.
  • Familiarity with BI and analytical reporting tools such as Tableau and Jasper Reports.
  • Familiarity with relational databases (Oracle/PostgreSQL) for deciphering relationships and sourcing data.
  • Evaluating and designing data models efficiently and making optimization recommendations.
  • Interacting and working closely with other IT units to improve the computing environment.
  • Experience with APIs (REST, SOAP, JDBC) and data formats (JSON, XML, CSV).
  • Staying up to date with the latest trends and technology practices in data, including Data Mesh.
  • Familiarity with AWS GovCloud, IAM security, RDS, Role-Based Access Control (RBAC), and Docker.
  • Lead and design the migration of data environments, ensuring optimal performance and reliability.
  • Evaluate and analyze ETL jobs, workflows, BI tools, and reports.
  • Respond to technical inquiries regarding customization, integration, enterprise architecture, and general features/functionality of data products.
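One of the duties above calls for data pipelines with inline verification and validation built on open-source Python. A minimal, stdlib-only sketch of that pattern follows; the sample data, function names, and JSON-lines "load" target are hypothetical illustrations, not part of this posting:

```python
# Hypothetical extract -> validate -> transform -> load sketch using only
# the Python standard library. Invalid rows are rejected inline rather
# than propagated downstream.
import csv
import io
import json

RAW_CSV = "id,amount\n1,10.5\n2,not-a-number\n3,7.25\n"

def extract(text):
    # Parse CSV text into a list of dicts keyed by the header row.
    return list(csv.DictReader(io.StringIO(text)))

def validate(row):
    # Inline validation: keep only rows whose amount parses as a number.
    try:
        row["amount"] = float(row["amount"])
        return True
    except ValueError:
        return False

def transform(rows):
    # Split rows into valid records and a rejected-row count for auditing.
    valid = [r for r in rows if validate(r)]
    return valid, len(rows) - len(valid)

def load(rows):
    # Stand-in for a real sink (e.g. RDS or Databricks): JSON lines.
    return [json.dumps(r) for r in rows]

rows = extract(RAW_CSV)
valid, rejected = transform(rows)
output = load(valid)
```

In a production setting each step would typically become an Airflow task, with the rejected-row count surfaced as a data-quality metric rather than discarded.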
Requirements

You'll Bring These Qualifications...

  • 2-4 years of experience working with MS SQL Server and SSIS to build ETL pipelines
  • 2-4 years of industry experience coding commercial software, and a passion for solving complex problems.
  • 2-4 years direct experience in Data Engineering with experience in tools such as:
    • Big data tools: Hadoop, Spark, Kafka, etc.
    • Relational SQL and NoSQL databases, including Postgres and Cassandra.
    • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
    • AWS cloud services: EC2, EMR, RDS, Redshift (or Azure equivalents)
    • Data streaming systems: Storm, Spark-Streaming, etc.
    • Search tools: Solr, Lucene, Elasticsearch
    • Object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Advanced working knowledge of SQL, including query authoring and optimization, and familiarity with a variety of relational databases.
  • Experience with message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Experience manipulating, processing, and extracting value from large, disconnected datasets.
  • Experience manipulating structured and unstructured data for analysis
  • Experience constructing complex queries to analyze results, whether directly against databases or in a data-processing development environment
  • Experience with data modeling tools and processes
  • Experience architecting data systems (transactional and warehouses)
  • Experience aggregating results and/or compiling information for reporting from multiple datasets
  • Experience working in an Agile environment
  • Experience supporting project teams of developers and data scientists who build web-based interfaces, dashboards, reports, and analytics/machine learning models

EOE/M/F/VET/DISABLED All qualified applicants will receive consideration without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state and local laws. Amivero complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.

Apply for this job

About the job

Posted: Jul 14, 2024
