About the Client:
The client specializes in digital transformation, offering solutions that empower businesses to embrace innovation and operational excellence. They focus on integrating digital platforms, designing personalized customer experiences, and streamlining data-driven decision-making. Professionals who work with this client help transform industries through technology, collaborating on projects that push the boundaries of digital engineering and data analytics. This role provides a platform for career development within a dynamic, future-focused industry.
Job Description:
The client is actively seeking a self-motivated, enthusiastic, and experienced Senior Software Engineer to develop solutions for their proprietary marketplace product. This position requires a professional who thrives in a technically demanding role, designing and developing with the latest technologies in web applications, data pipelines, big data, machine learning, and multi-cloud environments.
Key Responsibilities:
- Design and build robust ingestion pipelines using ELT and schema-on-read in Databricks Delta Lake.
- Develop transformations within ELT frameworks to modernize ingestion pipelines and scale data transformations.
- Provide technical expertise in the design and implementation of Ratings Data Ingestion pipelines, utilizing modern AWS cloud technologies such as S3 alongside Hive, Databricks, Spark, Scala, and Python.
- Maintain a high-performance data environment focused on speed, accuracy, consistency, and uptime.
- Collaborate with data engineering and data science teams to develop efficient ingestion pipelines.
- Implement data governance, quality checks, and data lineage throughout the data handling processes.
- Stay abreast of emerging trends in big data and cloud technologies, evaluating new technologies for potential adoption.
- Ensure compliance with enterprise standards and best practices.
Requirements:
- Minimum of 8 years of application development experience, with a strong background in system architecture, object-oriented design, and design patterns.
- Expertise in streaming pipelines, particularly with Kafka and Spark structured streaming.
- Proficiency in SDLC methodologies such as Agile and test-driven development.
- Experience in handling high-volume data and computationally intensive systems.
- Knowledge of Java garbage collection tuning and performance optimization.
- Proficiency with development tooling, including IDEs, web and application servers, Git, CI tools, unit-testing frameworks, and defect management tools.
- Domain knowledge of the financial industry and capital markets is highly desirable.
- Excellent verbal and written communication skills.
What the Client Offers:
- An unmatched experience in handling massive volumes of data and advanced analytics.
- The opportunity to turn innovative ideas into revenue-generating products.
- A dynamic work environment where one can mentor teams, innovate, and experiment.
- The chance to bring business ideas to life and present them to key stakeholders.
Apply Today: If you are driven by innovation and ready to be part of a transformative team, submit your resume and a cover letter outlining your qualifications and why you are the best fit for this role.