
Posted: 5 hours ago | Platform: Naukri

Work Mode: Work from Office
Job Type: Full Time

Job Description

We are seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have strong expertise in Python, SQL, and PySpark, with proven experience working on Databricks and cloud platforms such as Azure and AWS. A solid understanding of ETL tools such as Alteryx, as well as basic knowledge of DevOps practices and CI/CD pipelines, will be advantageous. Experience working with Git and version control is also expected. This is a unique opportunity to work in a dynamic, fast-paced environment, designing and implementing robust data solutions for scalable business needs.

Key Responsibilities:

Data Pipeline Development:

  • Design, build, and optimize ETL/ELT workflows using tools such as Databricks, SQL, Python/PySpark, and Alteryx (good to have).
  • Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.

Cloud Data Engineering:

  • Work on cloud platforms (Azure, AWS) to build and manage data lakes, data warehouses, and scalable data architectures.
  • Utilize cloud services such as Azure Data Factory or AWS Glue for data processing and orchestration.

Databricks and Big Data Solutions:

  • Use Databricks for batch and real-time big data processing and analytics.
  • Leverage Apache Spark for distributed computing and handling complex data transformations.

Data Management:

  • Create and manage SQL-based data solutions, ensuring high availability, scalability, and performance.
  • Develop and enforce data quality checks and validation mechanisms.

Collaboration and Stakeholder Engagement:

  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver impactful data solutions.
  • Understand business requirements and translate them into technical solutions.

DevOps and CI/CD:

  • Leverage CI/CD pipelines to streamline development, testing, and deployment of data engineering workflows.
  • Work with DevOps tools like Git, Jenkins, or Azure DevOps for version control and automation.

Documentation and Optimization:

  • Maintain clear documentation for data workflows, pipelines, and processes.
  • Optimize data systems for performance, scalability, and cost-efficiency.

Required Qualifications, Experience and Skills

Educational Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Experience:

  • 3-6 years of experience in Data Engineering or related roles.
  • Hands-on experience with big data processing frameworks, data lakes, and cloud-native services.

Skills:

Core Skills:

  • Proficiency in Python, SQL, and PySpark for data processing and manipulation.
  • Proven experience in Databricks and Apache Spark.
  • Expertise in working with cloud platforms such as Azure and AWS.
  • Sound knowledge of ETL processes and tools like Alteryx (good to have).

Data Engineering Expertise:

  • Experience leveraging data lakes, data warehouses, and data pipelines.
  • Ability to build a data pipeline from scratch.
  • Strong understanding of distributed systems and big data technologies.

DevOps and CI/CD:

  • Basic understanding of DevOps principles and familiarity with CI/CD pipelines.
  • Hands-on experience with tools like Git, Jenkins, or Azure DevOps.

Additional Skills:

  • Familiarity with data visualization tools like Power BI, Tableau, or similar is a plus.
  • Knowledge of streaming technologies such as Kafka or Event Hubs is desirable.
  • Strong problem-solving skills and a knack for optimizing data solutions.
  • Excellent communication skills (oral and written).
