Data Engineer - Senior Associate/Manager

3-6 years, Full Time
Mumbai/Bangalore

Role Summary

AuxoAI is seeking a skilled and experienced Data Engineer to join our dynamic team. The ideal candidate will have 3-6 years of prior experience in data engineering, with a strong background in AWS (Amazon Web Services) technologies. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize data pipelines and infrastructure.

Additionally, you will be expected to stay up to date on AI trends and best practices, sharing knowledge with the team to foster continuous learning and improvement.

If you are proactive, self-motivated, and passionate about AI, this role offers an exciting opportunity to make a meaningful impact and drive business transformation.


Responsibilities

  1. Design, develop, and maintain scalable data pipelines and ETL processes leveraging AWS services such as S3, Glue, EMR, Lambda, Aurora, RDS, Lake Formation, Athena, DMS, and Redshift.
  2. Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
  3. Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
  4. Implement data governance and security best practices to ensure compliance and data integrity.
  5. Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
  6. Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
  7. Develop and maintain strong relationships with key clients, serving as a trusted advisor and strategic partner while identifying opportunities for upselling and cross-selling additional services to drive revenue growth.


Qualifications

  1. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  2. 3-6 years of prior experience in data engineering, with a focus on designing and building data pipelines.
  3. Proficiency in AWS services, particularly S3, Glue, EMR, Lambda, Aurora, RDS, MWAA, Lake Formation, Athena, DMS, and Redshift.
  4. Strong programming skills in languages such as Python, Java, or Scala.
  5. Proficiency in Spark and Databricks, and with messaging queues such as RabbitMQ and Kafka.
  6. Experience with SQL and NoSQL databases, data warehousing concepts, and big data technologies.
  7. Familiarity with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools (e.g., Apache Airflow).