
Python Expert with AWS Data Engineering

Chennai | Data | Full-Time

We are seeking a Python Data Engineering expert with 8+ years of overall experience in Data Engineering and at least 5 years of hands-on expertise in Python development. The ideal candidate should have strong proficiency in Python and RDBMS, and a solid understanding of AWS services.

Key Responsibilities:

  • Python Expertise: Leverage 5+ years of hands-on experience with Python for developing scalable, high-performance data engineering solutions.
  • RDBMS to NoSQL Conversion: Lead the effort in migrating large data sets from RDBMS to NoSQL databases, ensuring seamless and efficient data storage and retrieval.
  • Data Lake & Delta Tables: Design and build data lakes and manage configurations on Delta tables to optimize data processing and storage workflows.
  • Cost Optimization: Implement computing strategies that minimize costs while maximizing data processing efficiency within cloud environments like AWS.
  • Holistic Framework Development: Analyze current environments and use cases to develop holistic data engineering frameworks that address both short-term and long-term business needs.
  • Airflow Framework: Utilize Apache Airflow to manage, schedule, and monitor workflows, ensuring data pipelines are running efficiently.
  • AWS Service Integration: Work extensively with AWS services such as EMR, S3, Lambda, Step Functions, and Aurora (RDS) to build, manage, and optimize data solutions on the cloud.

Required Skills:

  • 8+ Years of Data Engineering Experience: A strong background in Data Engineering, with 5+ years of Python development experience.
  • RDBMS Expertise: Proficiency in RDBMS and SQL for managing large-scale relational databases.
  • NoSQL: Familiarity with migrating data from RDBMS to NoSQL databases to enable flexible, scalable data storage solutions.
  • Data Lakes & Delta Tables: Experience in building data lakes and working with Delta table configurations for optimized data storage and processing.
  • AWS Services: Hands-on experience with AWS technologies such as EMR, S3, Lambda, Step Functions, and Aurora (RDS) for cloud-based data solutions.
  • Apache Airflow: Strong understanding of Apache Airflow to automate, schedule, and manage data workflows.

Soft Skills:

  • Team Collaboration: Demonstrated ability to work as part of a team, with a positive attitude and commitment to collaborative problem-solving.

Submit your Application

We are located in India and USA

The Hive Workspaces, Keppel One Paramount, Campus 30, Level 9,
No. 110, Mount Poonamallee Road, Porur, Chennai, Tamil Nadu – 600116

4701 Patrick Henry Drive, Building 3,
Santa Clara, CA 95054, USA