Job Description
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: Python (Programming Language), Microsoft Azure Data Services, PySpark
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that support data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the organization's overall data strategy, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to improve performance and reliability, making data-driven decisions that support organizational goals.
Roles & Responsibilities:
- Expected to perform independently and become a subject matter expert (SME).
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement data models that support business needs.

Must have:
- Proficiency in programming languages such as Python and SQL, and experience with the big data technology Spark.
- Experience with cloud platforms, primarily Microsoft Azure.
- Experience with Microsoft Azure Databricks and Azure Data Factory.
- Experience with CI/CD processes and tools, including Azure DevOps, Jenkins, and Git, to ensure smooth and efficient deployment of data solutions.
- Familiarity with APIs to push and pull data across data systems and platforms.
- Familiarity with software architecture high-level design documents and translating them into development tasks.
- Familiarity with the Microsoft data stack, such as Azure Data Factory, Azure Synapse, Databricks, Azure DevOps, and Fabric/Power BI.

Nice to have:
- Experience with machine learning and AI technologies
- Data modelling and architecture
- ETL pipeline design
- Azure DevOps
- Logging and monitoring using Azure/Databricks services
- Apache Kafka

Additional Information:
- 15 years of full-time education is required.