Engineering

Databricks Solution Architect (Remote)

Location: Remote
Work Type: Full Time
Experience: 10+ years

About Us

At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.


Senior / Expert Databricks Solutions Architect

About the role

We are seeking a highly experienced Solutions Architect to lead the design and delivery of end-to-end data and analytics solutions on the Databricks platform. You will translate complex business needs into scalable, secure, and cost-efficient data lakehouse architectures, collaborate with cross-functional teams, and guide customers from concept through implementation and adoption.

What you’ll do (Responsibilities)

• Engage with business stakeholders to understand goals, data sources, and analytic use cases; translate them into a holistic Databricks-based solution.

• Design scalable data lakehouse architectures on Databricks (Delta Lake, Databricks SQL, Unity Catalog, Delta Live Tables) that support data ingestion, cleansing, modeling, governance, security, and analytics.

• Lead technical architecture decisions and produce high-quality artifacts (reference architectures, solution blueprints, data models, data governance models, and integration plans).

• Architect data pipelines end-to-end (ingestion, transformation, storage, cataloging) with best practices for reliability, observability, and cost optimization.

• Enable data science and ML workflows on Databricks (MLflow, Feature Store, notebooks, AutoML) and design end-to-end MLOps strategies.

• Ensure data governance, security, and compliance (IAM, encryption, Unity Catalog, data masking, lineage, access controls).

• Collaborate closely with data engineers, data scientists, software engineers, and DevOps to deliver production-ready solutions; implement CI/CD for data and ML pipelines.

• Lead customer-facing activities: workshops, solution demos, proofs of concept, and responses to RFPs/RFIs; provide strategic guidance on platform adoption and ROI.

• Mentor and coach junior architects and engineers; develop training materials and run knowledge-sharing sessions.

• Monitor performance, optimize SQL and Spark workloads, manage cluster configurations, and drive cost/performance improvements.

What you’ll bring (Required qualifications)

• 8+ years of experience in solutions/enterprise architecture or senior data engineering roles; 3+ years of hands-on experience with the Databricks platform and Spark-based architectures.

• Deep expertise in Databricks components: Delta Lake, Unity Catalog, Databricks SQL, Delta Live Tables, notebooks, and orchestration patterns.

• Strong cloud experience (AWS, Azure, or GCP) with data storage and compute services (e.g., S3, Azure Blob Storage, ADLS, GCS, Redshift, BigQuery, Synapse, EMR, or cloud-native Databricks deployments).

• Proficiency in data integration and orchestration tools (e.g., Apache Airflow, dbt, Kafka, Spark Structured Streaming).

• Advanced SQL and programming skills (Python or Scala); ability to prototype and review data pipelines, models, and analytics solutions.

• Excellent communication and stakeholder management skills; ability to present complex technical concepts to both technical and non-technical audiences.

• Experience delivering large-scale data lakehouse migrations/transformations, performance tuning, and cost optimization.

• Databricks certification(s) or equivalent demonstrable expertise; willingness to obtain relevant certifications if not already held.

Preferred qualifications

• Experience with ML and MLOps on Databricks (MLflow, feature stores, model registry, CI/CD for ML).

• Domain expertise in industries such as financial services, healthcare, retail, or telecommunications.

• Familiarity with data governance, privacy regulations, and security frameworks (e.g., GDPR, HIPAA, SOC 2).

• Familiarity with real-time data processing and streaming architectures.

• Prior experience in pre-sales or solutioning for customers, including building compelling ROI stories and technical demos.

About you (soft skills and capabilities)

• Strategic thinker with a hands-on mindset; comfortable operating at both business and technical levels.

• Strong analytical, problem-solving, and decision-making capabilities.

• Collaborative team player who can lead without authority and influence stakeholders.

• Comfortable working in a fast-paced, client-facing environment with travel as needed.

How to apply

• Submit your resume/CV and a brief cover letter outlining your Databricks projects and impact.

• Include links to relevant work (e.g., public case studies, GitHub repositories, or portfolio demos) if available.

