Type of Requisition: Regular
Clearance Level Must Currently Possess: None
Clearance Level Must Be Able to Obtain: None
Suitability:
Public Trust/Other Required: None
Job Family: Data Science

Job Qualifications:
Skills: Agile Methodology, Amazon Web Services (AWS), Data Visualization, Software Platforms, Structured Query Language (SQL)
Certifications: AWS Cloud Practitioner - AWS
Experience: 2+ years of related experience
US Citizenship Required: No

Job Description:
Seize your opportunity to make a personal impact as a Data Engineer supporting the CMS Cloud team. GDIT is your place to make meaningful contributions to challenging projects and grow a rewarding career.
At GDIT, people are our differentiator. As a Data Engineer, your work will involve designing, developing, maintaining, and testing infrastructure for data generation, as well as optimizing data flow and collection for cross-functional teams.
Program Overview: GDIT is seeking professionals to support the CMS Cloud team in delivering brokerage services for infrastructure teams, including: cloud services from CSPs such as Microsoft Azure Government (MAG) and Amazon Web Services (AWS); Equinix data center leased space; hardware; software; OEM professional services; and asset and financial management services. Our product-driven team works directly with CMS and their Hybrid Cloud application teams to ensure efficient and cost-effective use of cloud services and products. Our team is responsible for the financial management tools CMS application developers use to optimize and manage their cloud applications. Successful candidates are eager to take the initiative on a large Cloud resell program where optimization and attention to detail are crucial.
HOW YOU WILL MAKE AN IMPACT:
- Contributes to cost tool development efforts by ensuring all financial data is collected, maintained, stored, and optimized to flow to various teams.
- Creates and maintains optimal data pipeline architecture.
- Analyzes IT environments to identify and assess critical capabilities and recommends solutions to problems of moderate scope and complexity.
- Optimizes data for performance, scalability, security, and cost-efficiency in cloud environments.
- Analyzes customer requirements and provides data according to the client's wide range of moderately complex specifications and needs.
- Maintains and recommends moderately complex updates to data, ensuring changes are implemented without disruption.
- Creates and implements contingency plans in preparation for potential cloud service outages.
- Collaborates with the cost tool engineering team to transform data and integrate algorithms and models into automated processes.
- Designs, develops, and maintains API structures.
- Writes clean, maintainable, and testable code.
- Develops and conducts thorough testing to ensure data solutions meet documented user requirements throughout the data lifecycle.
- Works with AWS Storage and Analytics services (S3, Glue, Athena, QuickSight, Lake Formation, etc.).
- Works with AWS Compute and Management services (Lambda, Fargate, CloudFormation, CloudWatch, etc.).
- Builds upon current knowledge of relevant programming languages to enhance overall functionality.
- Performs additional duties as assigned.
WHAT YOU’LL NEED TO SUCCEED:
- Education: B.S. in Computer Science or a closely related field, or an equivalent combination of education, professional training, and work experience
- Required Experience: 2+ years of related experience
- Required Technical Skills: Experience with AWS services including RDS, Redshift, DynamoDB, Lambda, Athena, S3, Glue, or Lake Formation; experience transforming data for use in data visualization tools such as Amazon QuickSight, Tableau, Power BI, etc.; experience writing complex SQL scripts using Trino, Presto, PostgreSQL, Oracle, AWS Redshift, or other relational databases; experience with task tracking and documentation tools such as Jira or Confluence
- Security Clearance Level: No clearance needed
- Required Skills and Abilities: Strong analytical and problem-solving skills; Excellent communication skills, with the ability to present information to stakeholders; Ability to work independently and as part of a cross-functional team
- Preferred Skills: AWS certification; Bachelor's degree in Computer Science; experience with AWS services including Redshift, DynamoDB, Athena, S3, Glue, or Lake Formation; working knowledge of software platforms and services such as Docker, Kubernetes, SQS, SNS, Kafka, NiFi, Airflow, or similar; experience with Linux, Python, and Git; familiarity with Agile methodology
- Location: 100% remote; DMV area preferred; program resources must be willing to travel occasionally to the MD/VA/DC area on an as-needed basis
- US Residency: Applicants must have lived in the United States for at least three (3) of the last five (5) years prior to submitting an application for a Federal ID Card
- Working Hours: Monday-Friday, 9am to 5:30pm Eastern Time. All resources on the program must be available to support occasional incidents/issues outside core business hours.
GDIT IS YOUR PLACE:
- 401K with company match
- Comprehensive health and wellness packages
- Internal mobility team dedicated to helping you own your career
- Professional growth opportunities including paid education and certifications
- Cutting-edge technology you can learn from
- Rest and recharge with paid vacation and holidays
Scheduled Weekly Hours: 40
Travel Required: Less than 10%
Telecommuting Options: Remote
Work Location: Any Location / Remote
Additional Work Locations: