Experience: 5.0 - 10.0 years
Salary: 0 - 0 Lacs
Location: Chennai (Remote)
Job Title: Data Engineer (PySpark & AWS)
Location: Chennai
Employment Type: Full-Time with Artech
Experience Level: 4-10 years

About the Role:
We are seeking a highly skilled Data Engineer with strong expertise in PySpark and AWS to join our growing data team. In this role, you will build, optimize, and maintain data pipelines and ETL workflows in the cloud, enabling large-scale data processing and analytics. You will work closely with data scientists, analysts, and business stakeholders to ensure data is accessible, accurate, and reliable for advanced analytics and reporting.

Key Responsibilities:
- Design, build, and maintain scalable, efficient data pipelines using PySpark and Apache Spark (a minimal sketch of this kind of pipeline appears at the end of this posting).
- Develop and manage ETL/ELT workflows to ingest data from multiple structured and unstructured sources.
- Implement data transformation, cleansing, validation, and aggregation logic.
- Work with AWS services such as S3, Glue, EMR, Lambda, Redshift, Athena, and CloudWatch.
- Monitor data pipelines for performance, reliability, and data quality.
- Collaborate with cross-functional teams to understand business data needs and translate them into technical solutions.
- Automate data engineering tasks and infrastructure using tools like Terraform or CloudFormation (optional).
- Maintain and document data architecture, job logic, and operational processes.

Required Skills:
- 4+ years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with PySpark and Apache Spark for distributed data processing.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of AWS services for data engineering: S3, Glue, EMR, Redshift, Lambda, Athena, CloudWatch.
- Experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Knowledge of data modeling, warehousing, and partitioning strategies.
- Experience with version control (Git) and CI/CD practices.

Nice to Have:
- Experience with workflow orchestration tools (e.g., Airflow, Step Functions).
- Familiarity with Docker/Kubernetes for containerized deployments.
- Exposure to NoSQL databases (DynamoDB, MongoDB).
- Experience with Terraform or CloudFormation for infrastructure automation.
- Knowledge of Delta Lake and data lake architecture best practices.

Educational Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
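For candidates wondering what the day-to-day work looks like, the following is a minimal illustrative PySpark sketch of the ingest-cleanse-aggregate-write pattern described above. It is not part of the original posting: the S3 paths, column names (user_id, event_ts, amount), and partitioning scheme are all hypothetical.

# Minimal PySpark ETL sketch: read raw JSON from S3, cleanse and validate,
# aggregate, and write partitioned Parquet back to S3.
# All paths and column names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Ingest: semi-structured JSON landed in a raw zone.
raw = spark.read.json("s3://example-raw-zone/orders/2024-01-01/")

# Cleanse and validate: drop rows missing required keys, cast types,
# and filter out obviously invalid amounts.
clean = (
    raw.dropna(subset=["user_id", "event_ts", "amount"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Aggregate: daily spend and order count per user.
daily = (
    clean.withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "user_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count"))
)

# Load: Parquet partitioned by date, so engines like Athena or
# Redshift Spectrum can prune partitions at query time.
(daily.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-zone/daily_user_spend/"))

spark.stop()

In practice a job like this would run on EMR or as a Glue job, with CloudWatch monitoring and an orchestrator such as Airflow or Step Functions scheduling it, matching the services listed in the requirements.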
Posted 2 weeks ago