Development

Data Engineering Manager (Remote)

Remote
Work Type: Full Time

Company Overview:

Codvo.ai is a leading technology-driven organization that specializes in building innovative, cutting-edge AI-powered and cloud-first applications. We are dedicated to pushing the boundaries of technology to deliver exceptional solutions that transform industries and improve lives.

Roles & Responsibilities: 

  1. Team Leadership & Mentorship: 

  • Manage and lead a team of data engineers, providing technical guidance, mentorship, and support. 

  • Foster a collaborative and high-performing team environment. 

  • Conduct performance reviews, identify training needs, and support the professional development of team members. 

  2. Data Engineering Project Execution: 

  • Oversee the successful delivery of data engineering projects, ensuring alignment with business requirements and timelines. 

  • Collaborate with project managers and stakeholders to define project scope, allocate resources, and track progress. 

  • Identify and mitigate project risks and dependencies. 

  3. Data Pipeline Development & Maintenance: 

  • Contribute to the design, development, and maintenance of scalable and efficient data pipelines. 

  • Ensure data quality, integrity, and consistency throughout the data lifecycle. 

  4. Data Infrastructure & Tooling: 

  • Contribute to the design and implementation of robust and scalable data infrastructure using cloud-based technologies (AWS, Azure, GCP). 

  • Evaluate and recommend appropriate data engineering tools and technologies.  

  5. Collaboration & Communication: 

  • Work closely with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions. 

Requirements: 

  1. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 

  2. 7+ years of experience in data engineering, with at least 3 years in a managerial role. 

  3. Strong understanding of data warehousing concepts, data modeling techniques, and ETL processes. 

  4. Experience with building and maintaining data pipelines using tools like Apache Spark, Apache Kafka, or similar technologies. 

  5. Proficiency in SQL, Python, Java, or Scala. 

  6. Experience with cloud platforms (AWS, Azure, GCP).

Experience: 10+ Years 

Job Location: Remote 

Work Timings: 2:30 pm - 11:30 pm IST 

Sub-Department:
Engineering Manager