Data Engineering Consultant (Category - Architect)
Sector: Oil and Gas
Location: Doha, Qatar
About Us
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.
Job Description:
Key Responsibilities:
Enterprise Data Architecture & Design:
• Define and maintain the enterprise data architecture blueprint, including data lakes, warehouses, streaming platforms, and analytics layers.
• Design logical, physical, and conceptual data models to support operational, analytical, and AI use cases.
• Ensure architecture supports scalability, performance, and high availability.
• Architect & Build Data Pipelines: Design, construct, install, test, and maintain highly scalable data management systems and ETL/ELT pipelines.
• Integrate Diverse Data Sources: Develop processes to ingest and integrate high-volume, high-velocity data from SCADA systems, historians (like OSIsoft PI, Aspen InfoPlus.21), DCS, PLC, and IoT sensors.
• Cloud Data Platform Development: Implement and manage data solutions on the Microsoft Azure cloud platform, leveraging services such as Azure IoT Hub, Azure Event Hubs, and Azure Stream Analytics for real-time ingestion and processing of operational technology (OT) data.
• Data Modelling & Warehousing: Design and implement data models optimized for time-series data from industrial assets, supporting operational dashboards and real-time analytics.
• Enable Advanced AI: Build the data infrastructure to support AI/ML models for predictive maintenance, operational anomaly detection, and process optimization using real-time OT data.
Required Skills and Qualifications:
• 10+ years of experience in data architecture, data engineering, or enterprise analytics roles.
• Strong experience designing enterprise-scale data platforms in cloud and hybrid environments.
• Expertise in data modelling (conceptual, logical, physical) and data integration patterns.
• Deep understanding of data lakes, data warehouses, Lakehouse architectures, and real-time streaming platforms.
• Hands-on experience with cloud data services (Azure preferred).
• Strong knowledge of SQL, data transformation frameworks, and metadata management.
• Familiarity with DevOps and DataOps practices.
• Strong communication skills with the ability to engage both technical and business stakeholders.
Technical Proficiencies:
• Expert-level proficiency in SQL and Python for data manipulation and pipeline development.
• Big Data Technologies: Hands-on experience with distributed computing frameworks like Apache Spark (PySpark). Experience with streaming technologies like Kafka is a plus.
• Cloud Platforms: Deep experience with Microsoft Azure (Azure Data Lake Storage, Azure Data Factory, Azure Databricks, Azure Synapse).
• Data Warehousing/Lakehouse: Proven experience with modern data platforms such as Databricks Delta Lake.
• AI & ML Knowledge: Understanding of machine learning lifecycles and the data requirements for training and deploying AI/ML models.
• Version Control: Proficiency with Git and CI/CD best practices.
Preferred Qualifications:
• Certifications such as Azure Data Engineer Expert, Azure Solutions Architect, or equivalent.
• Hands-on experience with the OSDU™ Data Platform.
• Experience in Oil & Gas, energy, or industrial data environments.
• Understanding of cybersecurity considerations for OT environments and data segregation.
• Experience integrating data from ERP systems like SAP.
• Experience working within AI-first or analytics-driven transformation programs.
• Advanced SQL skills, including query optimization and performance tuning.
• Experience constructing and maintaining enterprise knowledge graphs.
Note: Please apply via our official careers portal only, as applications sent directly to executives may not be considered.