Posted: 1 day ago
Work from Office
Full Time
Key Responsibilities:
• Architect and implement end-to-end data solutions using Azure services (Data Factory, Databricks, Data Lake, Synapse, Cosmos DB, etc.).
• Design robust and scalable data models, including relational, dimensional, and NoSQL schemas.
• Develop and optimize ETL/ELT pipelines and data lakes using Azure Data Factory, Databricks, and open formats such as Delta and Iceberg.
• Integrate data governance, quality, and security best practices into all architecture designs.
• Support analytics and machine learning initiatives through structured data pipelines and platforms.
• Perform data manipulation and analysis using Pandas, NumPy, and related Python libraries.
• Develop and maintain high-performance REST APIs using FastAPI or Flask.
• Ensure data integrity, quality, and availability across various sources.
• Integrate data workflows with application components to support real-time or scheduled processes.
• Collaborate with data engineers, analysts, data scientists, and business stakeholders to align solutions with business needs.
• Drive CI/CD integration with Databricks using Azure DevOps and tools such as dbt.
• Monitor system performance, troubleshoot issues, and optimize data infrastructure for efficiency and reliability.
• Communicate technical concepts effectively to non-technical stakeholders.
Required Skills & Experience:
• Extensive hands-on experience with Azure services: Data Factory, Databricks, Data Lake, Azure SQL, Cosmos DB, Synapse.
• Expertise in data modeling and design (relational, dimensional, NoSQL).
• Proven experience with ETL/ELT processes, data lakes, and modern lakehouse architectures.
• Solid experience with SQL Server or any major RDBMS; ability to write complex queries and stored procedures.
• 3+ years of experience with Azure Data Factory, Azure Databricks, and PySpark.
• Strong programming skills in Python, with a solid understanding of Pandas and NumPy.
• Proven experience in building REST APIs.
• Good knowledge of data formats (JSON, Parquet, Avro) and API communication patterns.
• Strong knowledge of data governance, security, and compliance frameworks.
• Experience with CI/CD, Azure DevOps, and infrastructure as code (Terraform or ARM templates).
• Familiarity with BI and analytics tools such as Power BI or Tableau.
• Strong problem-solving skills and attention to performance, scalability, and security.
• Excellent communication skills, both written and verbal.
Preferred Qualifications:
• Experience in regulated industries (finance, healthcare, etc.).
• Familiarity with data cataloging, metadata management, and machine learning integration.
• Experience leading teams and presenting architectural strategies to leadership.
Geak Minds
7.5 - 17.5 Lacs P.A.
Kolkata, Mumbai, New Delhi, Hyderabad, Pune, Chennai, Bengaluru