RemoteJobs.org

© 2026 RemoteJobs.org. All rights reserved.
    Data Architect #6632

    1950Labs · Full-time · Remote · Programming · Posted today

    About this role

    Client Description

    The client is a global organization in the tourism industry, offering river, ocean, and expedition cruises for passengers worldwide and operating a large fleet of vessels.

    The company is currently undergoing an extensive cloud data modernization and unification program. We support them across data architecture, BI, migration, and data platform development.

    A key focus area is the migration to Databricks Unity Catalog, including:

    • Migrating all data layers (landing, raw, prepared, reporting, services) from Hive Metastore to Unity Catalog

    • Migrating DLT (Delta Live Tables) and Python/SQL jobs into Databricks

    • Migrating pipelines in Azure Synapse/ADF

    • Rebuilding and adapting metadata frameworks

    • Standardizing access, lineage, governance, and overall Lakehouse structure

    The client has very high technical expectations and is looking for top-level specialists capable of leading complex architectural initiatives.
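The Hive Metastore to Unity Catalog step above can be sketched with Databricks SQL. The catalog, schema, table, and group names here (`main`, `prepared`, `bookings`, `data-analysts`) are illustrative assumptions, not the client's actual layout:

```sql
-- Upgrade an external table from the legacy Hive Metastore to Unity Catalog.
SYNC TABLE main.prepared.bookings FROM hive_metastore.prepared.bookings;

-- Managed tables are typically copied into Unity Catalog with a deep clone:
CREATE TABLE main.prepared.bookings_managed
DEEP CLONE hive_metastore.prepared.bookings_managed;

-- Grant governed access once the table lives in Unity Catalog.
GRANT SELECT ON TABLE main.prepared.bookings TO `data-analysts`;
```

In practice a migration like this is run per layer (landing, raw, prepared, reporting, services), with grants and lineage validated after each batch.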

    Technical Requirements

    • Advanced knowledge of Microsoft Azure (data infrastructure, networking, authorization, cloud design)

    • Experience with Azure Synapse (especially Synapse Serverless and pipelines)

    • Strong expertise in Databricks (DLT, workflows, workspace administration)

    • Ability to design Data Lakehouse architectures (Medallion Architecture, Metadata-Driven ETL)

    • Very good knowledge of Python and code optimization

    • Strong SQL skills, including query optimization and SQL Server experience

    • Experience with Apache Spark (data processing workflows)

    • Experience building ETL/ELT processes and data warehouses

    • Experience with CI/CD processes (Azure DevOps, Git, branching strategies)

    • Experience implementing logging, monitoring, and optimization of data processes

    • Familiarity with Power BI and analytics workflows

    • Strong communication skills and documentation ability

    • Experience as a Lead Engineer (technical leadership, decision-making, stakeholder collaboration)

    Scope of Responsibilities

    • Design and develop data architecture in Azure and Databricks environments

    • Participate in the Unity Catalog transformation (infrastructure, pipelines, frameworks, standards)

    • Migrate and modernize ETL/ELT processes (DLT, Python/SQL jobs, Synapse/ADF pipelines)

    • Design and implement Data Lakehouse solutions using Medallion architecture

    • Optimize data processing workflows (Python, SQL, Spark)

    • Build CI/CD processes and automation in Azure DevOps

    • Implement standards for logging, monitoring, and data quality

    • Lead the project from a technical perspective (Lead Engineer role)

    • Collaborate with business and technical stakeholders

    • Document solutions and mentor team members
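The metadata-driven, medallion-layered ETL mentioned above can be sketched in a few lines of plain Python (standing in for Spark/DLT). All table names, transforms, and sample rows are illustrative assumptions, not the client's actual framework:

```python
# Minimal sketch of metadata-driven ETL across medallion layers
# (bronze -> silver -> gold). Plain Python stands in for Spark/DLT.

RAW_BOOKINGS = [
    {"booking_id": "1", "cabin": "A12", "fare": "1999.00"},
    {"booking_id": "2", "cabin": "B03", "fare": "2499.50"},
    {"booking_id": "2", "cabin": "B03", "fare": "2499.50"},  # duplicate row
]

# Pipeline metadata: each entry declares a source, a target, and a transform,
# so adding a new table means adding configuration, not code.
PIPELINE = [
    {"source": "bronze.bookings", "target": "silver.bookings", "transform": "dedupe"},
    {"source": "silver.bookings", "target": "gold.revenue", "transform": "revenue"},
]

def dedupe(rows):
    """Silver-layer cleanup: keep the first row per booking_id."""
    seen, out = set(), []
    for row in rows:
        if row["booking_id"] not in seen:
            seen.add(row["booking_id"])
            out.append(row)
    return out

def revenue(rows):
    """Gold-layer aggregate: total fare across bookings."""
    total = sum(float(r["fare"]) for r in rows)
    return [{"total_fare": round(total, 2)}]

TRANSFORMS = {"dedupe": dedupe, "revenue": revenue}

def run_pipeline(tables, pipeline):
    """Execute each declared step, materializing targets in order."""
    for step in pipeline:
        rows = tables[step["source"]]
        tables[step["target"]] = TRANSFORMS[step["transform"]](rows)
    return tables

tables = run_pipeline({"bronze.bookings": RAW_BOOKINGS}, PIPELINE)
print(tables["gold.revenue"])  # → [{'total_fare': 4498.5}]
```

The design point is that the pipeline definition is data: the same driver loop handles every table, which is what makes frameworks like this practical to migrate and govern at scale.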

    About 1950Labs

