
5 DAG Jobs

Set up a Job Alert
JobPe aggregates results so listings are easy to find, but you apply directly on the original job portal.

5.0 - 8.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Develop and optimize data processing jobs using PySpark to handle complex data transformations and aggregations efficiently.
- Design and implement robust data pipelines on the AWS platform, ensuring scalability and efficiency (Databricks exposure will be an advantage).
- Leverage AWS services such as EC2 and S3 for comprehensive data processing and storage solutions.
- Manage SQL database schema design, query optimization, and performance tuning to support data transformation and loading processes.
- Design and maintain scalable and performant data warehouses, employing best practices in data modeling and ETL processes.
- Utilize modern data platforms for collaborative data science, integrating seamlessly with various data sources and types.
- Ensure high data quality and accessibility by maintaining optimal performance of Databricks clusters and Spark jobs.
- Develop and implement security measures, backup procedures, and disaster recovery plans using AWS best practices.
- Manage source code and automate deployment using GitHub along with CI/CD practices tailored for data operations in cloud environments.
- Provide expertise in troubleshooting and optimizing PySpark scripts, Databricks notebooks, SQL queries, and Airflow DAGs.
- Keep abreast of the latest developments in cloud data technologies and advocate for the adoption of new tools and practices that can benefit the team.
- Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and reliable execution of data jobs across various data sources and systems.
- Collaborate closely with data scientists and business analysts to design data models and pipelines that support advanced analytics and machine learning projects.
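
For illustration only (not part of the posting), a minimal sketch of the kind of Airflow-orchestrated PySpark aggregation this role describes. It assumes Airflow 2.4+ with PySpark available on the workers; the DAG id, S3 paths, and column names are hypothetical.

```python
# Illustrative sketch only: a minimal Airflow DAG that runs a PySpark aggregation.
# The DAG id, S3 paths, and column names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_aggregation():
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path
    daily = (
        orders.groupBy("order_date", "region")
        .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
    )
    daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_sales/")
    spark.stop()


with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="aggregate_orders", python_callable=run_aggregation)
```

In practice a job like this would more likely submit to a Databricks or EMR cluster rather than run Spark inside the Airflow worker, but the orchestration pattern is the same.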

Posted 5 days ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Job Description:
Job Role: Data Engineer
Years of Experience: 6+ years
Job Location: Pune
Work Model: Hybrid

Job Summary: We are seeking a highly skilled Data Engineer with strong expertise in DBT, Java, Apache Airflow, and DAG (Directed Acyclic Graph) design to join our data platform team. You will be responsible for building robust data pipelines, designing and managing workflow DAGs, and ensuring scalable data transformations to support analytics and business intelligence.

Key Responsibilities:
- Design, implement, and optimize ETL/ELT pipelines using DBT for data modeling and transformation.
- Develop backend components and data processing logic using Java.
- Build and maintain DAGs in Apache Airflow for orchestration and automation of data workflows.
- Ensure the reliability, scalability, and efficiency of data pipelines for ingestion, transformation, and storage.
- Work with cross-functional teams to understand data needs and deliver high-quality solutions.
- Troubleshoot and resolve data pipeline issues in production environments.
- Apply data quality and governance best practices, including validation, logging, and monitoring.
- Collaborate on CI/CD deployment pipelines for data infrastructure.

Required Skills & Qualifications:
- 4+ years of hands-on experience in data engineering roles.
- Strong experience with DBT for modular, testable, and version-controlled data transformation.
- Proficient in Java, especially for building custom data connectors or processing frameworks.
- Deep understanding of Apache Airflow and the ability to design and manage complex DAGs.
- Solid SQL skills and familiarity with data warehouse platforms (e.g., Snowflake, Redshift, BigQuery).
- Familiarity with version control tools (Git), CI/CD pipelines, and Agile methodologies.
- Exposure to cloud environments such as AWS, GCP, or Azure.
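
For illustration only (not from the posting), a minimal sketch of an Airflow DAG that chains ingestion, a dbt run, and a post-run check, of the kind this role would design and maintain. It assumes Airflow 2.4+ and the dbt CLI on the worker; the task ids, dbt project path, and shell commands are hypothetical placeholders.

```python
# Illustrative sketch only: a small Airflow DAG wiring ingestion -> dbt run -> validation.
# Task ids, the dbt project path, and the commands are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_row_counts():
    # Placeholder for a post-run check, e.g. comparing source and target row counts.
    print("row-count validation would run here")


with DAG(
    dag_id="dbt_daily_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="echo 'ingestion job would run here'",
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics_project && dbt run --select staging+",
    )
    validate = PythonOperator(task_id="validate", python_callable=validate_row_counts)

    # DAG edges: a simple linear dependency chain.
    ingest >> dbt_run >> validate
```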

Posted 5 days ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Remote

Senior Data Developer with Strong MS/Oracle SQL, Python Skills and Critical Thinking

Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle repetitive technical tasks such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. The role also involves monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.

Overview: As a Senior Developer I on the Useready team, you will play a vital role in improving our data load and management processes. Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and confirm that all scheduled jobs complete successfully. You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.

Responsibilities:
- Manage and perform Healthy Planet file loads into a data warehouse.
- Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary.
- Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.
- Collaborate with the data engineering team to streamline data processing workflows.
- Develop automation scripts using SQL and Python to reduce manual intervention in repetitive tasks.
- Ensure all data-related tasks are performed accurately and on time.
- Investigate and resolve data discrepancies and processing issues.
- Prepare and maintain documentation for processes and workflows.
- Conduct periodic data audits to ensure data integrity and compliance with defined standards.

Skillset Requirements:
- MS/Oracle SQL
- Python
- Data warehousing and ETL processes
- Monitoring tools such as Apache Airflow
- Data quality and integrity assurance
- Strong analytical and problem-solving abilities
- Excellent written and verbal communication

Additional Skillset:
- Familiarity with monitoring and managing Apache Airflow DAGs.

Experience: Minimum of 5 years' experience in a similar role, with a focus on data management and process automation, and a proven track record of successfully managing complex data processes and meeting deadlines.

Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Certifications: Certifications in Epic Cogito, MS/Oracle SQL, Python, or data management are a plus.

Notice period: 0-30 days.
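
For illustration only (not from the posting), a minimal sketch of the DAG-monitoring automation this role describes, using the Airflow 2 stable REST API to list recent failed DAG runs. The host, credentials, and DAG ids are hypothetical, and actual reruns would still go through the Airflow UI or CLI.

```python
# Illustrative sketch only: list yesterday's failed Airflow DAG runs via the
# Airflow 2 stable REST API so they can be alerted on and rerun.
# The base URL, credentials, and DAG ids are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

import requests

AIRFLOW_BASE = "http://airflow.example.internal:8080/api/v1"   # hypothetical host
AUTH = ("monitoring_user", "change_me")                        # hypothetical credentials
DAGS_TO_WATCH = ["healthy_planet_file_load", "fgp_cash_report"]  # hypothetical DAG ids


def failed_runs_since(dag_id: str, since: datetime) -> list:
    """Return failed DAG runs for `dag_id` with an execution date after `since`."""
    resp = requests.get(
        f"{AIRFLOW_BASE}/dags/{dag_id}/dagRuns",
        params={"state": "failed", "execution_date_gte": since.isoformat()},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["dag_runs"]


if __name__ == "__main__":
    cutoff = datetime.now(timezone.utc) - timedelta(days=1)
    for dag_id in DAGS_TO_WATCH:
        for run in failed_runs_since(dag_id, cutoff):
            # In practice this is where an alert would be raised or a rerun triggered.
            print(f"FAILED: {dag_id} run {run['dag_run_id']} at {run['execution_date']}")
```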

Posted 1 week ago

Apply

5.0 - 9.0 years

6 - 10 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate has 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, strong problem-solving skills, and a proactive attitude are essential for success in this role.
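
For illustration only (not from the posting), a minimal sketch of the config-payload validation this role mentions; the required keys and payload shape are hypothetical.

```python
# Illustrative sketch only: a minimal config-payload validation check.
# The required keys and payload shape are hypothetical placeholders.
import json

REQUIRED_KEYS = {"partner_id", "feed_name", "schedule", "destination_table"}  # hypothetical schema


def validate_config_payload(raw: str) -> list:
    """Return a list of human-readable problems; an empty list means the payload looks valid."""
    problems = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"payload is not valid JSON: {exc}"]

    missing = REQUIRED_KEYS - payload.keys()
    if missing:
        problems.append(f"missing required keys: {sorted(missing)}")
    if not str(payload.get("destination_table", "")).strip():
        problems.append("destination_table must be a non-empty string")
    return problems


if __name__ == "__main__":
    sample = '{"partner_id": "adtech-42", "feed_name": "daily_spend", "schedule": "@daily"}'
    for issue in validate_config_payload(sample):
        print("CONFIG ISSUE:", issue)  # would surface as an alert in a real pipeline
```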

Posted 2 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Pune

Work from Office

We are hiring a Data Operations Engineer for a 6-month contract role based in Pune. The ideal candidate has 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, strong problem-solving skills, and a proactive attitude are essential for success in this role.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
