Product Owner – Data & BI Solutions
Experience: 6–8 years | Function: Product & Solutioning | Location: Remote
Role Summary
Own the vision, roadmap, and delivery of enterprise-grade data platforms and BI experiences, from ingestion and master data management (MDM) to lakehouse analytics and executive dashboards. You’ll turn fragmented data into trusted, governed, and actionable products that improve decision quality and time-to-insight across the enterprise.
Key Responsibilities
1. Define Vision & Roadmap
• Set a problem-first roadmap for the lakehouse, semantic layer, and BI experiences, aligned to priority business journeys and projects.
• Prioritize domains with value hypotheses (revenue, cost, risk) and ROI models.
2. UAT-First Development
• Map source systems, data contracts, golden records, and reporting needs with stakeholders.
• Author a UAT document for each feature with detailed acceptance criteria, test cases, and evaluation metrics.
3. Product Leadership
• Drive the backlog, sprints, and releases with data engineering, platform, and BI development teams.
• Manage dependencies: data readiness, reference/master data, identity resolution, and security.
4. Measure, Learn, Iterate
• Track guardrails: freshness, completeness, accuracy, lineage coverage, and cost-to-serve (a minimal sketch follows this list).
• Improve models and marts via A/B tests on dashboard usage, query latency, and decision-cycle metrics.
5. Governance & Compliance
• Implement MDM policies (match/merge, survivorship), ETL/ELT standards, data cataloging, and access control.
• Partner with Security/Legal on PII/PHI handling, data residency, and SOC 2/ISO alignment.
6. Bridge Business & Technology
• Translate schemas and platform constraints into business KPIs, SLAs, and ROI; publish runbooks, change logs, and release notes.
• Enable self-service analytics (certified datasets, semantic models, data-literacy artifacts).
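To make the quality guardrails in item 4 concrete, below is a minimal sketch of freshness and completeness checks over a pandas DataFrame. The table, column names, and thresholds are illustrative assumptions, not a prescribed implementation or a specific platform's API.

```python
# Minimal sketch of two data-quality guardrails (freshness, completeness).
# The DataFrame, column names, and thresholds are illustrative assumptions.
from datetime import datetime, timedelta, timezone

import pandas as pd

def check_freshness(df: pd.DataFrame, ts_col: str, max_age: timedelta) -> bool:
    """Pass if the newest record in ts_col is younger than max_age."""
    newest = pd.to_datetime(df[ts_col], utc=True).max()
    return (datetime.now(timezone.utc) - newest) <= max_age

def check_completeness(df: pd.DataFrame, col: str, min_ratio: float) -> bool:
    """Pass if the share of non-null values in col meets min_ratio."""
    return df[col].notna().mean() >= min_ratio

# Hypothetical mart extract used only to exercise the checks.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_id": [10, None, 12],
    "loaded_at": pd.to_datetime(["2025-01-01", "2025-01-02", "2025-01-03"], utc=True),
})

print(check_freshness(orders, "loaded_at", max_age=timedelta(days=1)))  # freshness guardrail
print(check_completeness(orders, "customer_id", min_ratio=0.95))        # completeness guardrail
```

In practice such checks would run as automated gates in the pipeline (e.g., dbt tests or Great Expectations suites, both listed under Skills & Tools below), with results feeding the guardrail metrics this role tracks.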
Must-Have Qualifications
• 2+ years directly owning delivery of data platforms/BI initiatives (enterprise lakehouse, marts, or domain data products).
• Hands-on product ownership with at least one modern data platform: Databricks, Snowflake, Microsoft Fabric/Azure, Cognite Data Fusion (or similar).
• Fluency in MDM concepts (golden record, hierarchy management, survivorship; see the survivorship sketch after this list), ETL/ELT pipelines, and data governance (catalog, lineage, data quality rules, access policies).
• Strong agile execution with Jira/Confluence/Azure DevOps; ability to write testable UAT documents and user stories with clear SLAs and data quality (DQ) checks.
• Strong academic background; an MBA from a Tier-1 school with an engineering/CS degree is preferred.
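To illustrate the survivorship concept referenced above, here is a minimal sketch of a golden-record rule: prefer the most trusted source, then break ties by recency. The source names, priorities, and fields are hypothetical.

```python
# Minimal sketch of a survivorship rule for building a golden record:
# highest-priority source wins; recency breaks ties. All names are
# illustrative assumptions.
from datetime import date

# Hypothetical duplicate records for one matched customer entity.
records = [
    {"source": "crm", "email": "a@ex.com",   "updated": date(2024, 5, 1)},
    {"source": "erp", "email": "a@corp.com", "updated": date(2024, 6, 1)},
    {"source": "web", "email": None,         "updated": date(2024, 7, 1)},
]
SOURCE_PRIORITY = {"crm": 0, "erp": 1, "web": 2}  # lower = more trusted

def golden_value(field: str) -> object:
    """Pick the surviving value for a field from the matched records."""
    candidates = [r for r in records if r[field] is not None]
    best = min(
        candidates,
        key=lambda r: (SOURCE_PRIORITY[r["source"]], -r["updated"].toordinal()),
    )
    return best[field]

print(golden_value("email"))  # -> "a@ex.com" (CRM outranks ERP despite being older)
```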
Nice to Have
• Experience with semantic layers/metrics stores (e.g., Power BI semantic models, dbt metrics, LookML), dbt, Spark/Delta, or Airflow/Databricks Workflows.
• Exposure to event streaming (Kafka), CDC (Debezium/Fivetran), and cost/performance tuning (caching, clustering, query acceleration).
• Familiarity with data contracts, domain-driven design for data, and data product operating models (a data-contract sketch follows this list).
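For the data-contracts point above, one minimal sketch of a producer-side contract expressed as code: a typed schema plus invariants enforced at the boundary. The entity, fields, and rules are hypothetical.

```python
# Minimal sketch of a data contract as code: the producer publishes a
# typed schema plus invariants that consumers can rely on. The entity,
# fields, and rules below are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class OrderEvent:
    order_id: str        # required, unique per order
    customer_id: str     # required, must resolve to a golden customer record
    amount_cents: int    # non-negative; currency handled upstream
    occurred_at: datetime

    def __post_init__(self) -> None:
        # Contract invariants enforced at ingestion, not in the warehouse.
        if not self.order_id or not self.customer_id:
            raise ValueError("order_id and customer_id are required")
        if self.amount_cents < 0:
            raise ValueError("amount_cents must be non-negative")

# A conforming event passes; a violating one fails fast at the boundary.
OrderEvent("o-1", "c-10", 1299, datetime(2025, 1, 3, 12, 0))
```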
Skills & Tools
• Data Platform: Databricks (Delta, Unity Catalog), Snowflake (Tasks/Streams), Microsoft Fabric/Azure Synapse; Cognite DF a plus.
• Pipelines & Modeling: SQL, basic Python literacy, ELT (dbt), orchestration (Airflow/Databricks Workflows; an orchestration sketch follows this list), DQ (Great Expectations or dbt tests).
• BI & Semantics: Power BI (DAX, modeling, RLS), Tableau (LOD, governance), certified datasets, usage telemetry.
• Collaboration: Jira/Confluence/Azure DevOps, Lucidchart/Draw.io, PowerPoint/Excel/Word, Git for versioning.
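As an orchestration reference for the pipelines bullet above, here is a minimal Airflow DAG sketch that sequences a load step ahead of a DQ gate. The DAG id, schedule, and task bodies are illustrative assumptions, not a real pipeline.

```python
# Minimal sketch of an ELT orchestration DAG in Airflow 2.x; the DAG id,
# schedule, and task callables are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_load() -> None:
    print("land raw data in the lakehouse")  # placeholder for an EL step

def run_dq_checks() -> None:
    print("run freshness/completeness checks")  # placeholder for DQ gates

with DAG(
    dag_id="orders_elt",            # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    el = PythonOperator(task_id="extract_load", python_callable=extract_load)
    dq = PythonOperator(task_id="dq_checks", python_callable=run_dq_checks)
    el >> dq  # DQ gates run only after the load succeeds
```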
About Us
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.