Posted: 1 day ago
Platform: On-site
Part Time
Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
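As an illustration of the ingest/wrangle/transform/join work described above, here is a minimal PySpark sketch. The file paths, column names, and output location are hypothetical placeholders and not part of the posting.

# Minimal PySpark sketch of an ingest -> wrangle -> transform/join -> load pipeline.
# Paths, columns, and the output location below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest: read raw data from two (hypothetical) sources
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
customers = spark.read.parquet("/data/raw/customers")

# Wrangle: drop malformed rows and normalise types
orders_clean = (
    orders
    .dropna(subset=["order_id", "customer_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Transform and join: enrich orders with customer attributes, then aggregate
daily_revenue = (
    orders_clean
    .join(customers, on="customer_id", how="left")
    .groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write the result as partitioned Parquet (Delta could be swapped in on Databricks)
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_revenue")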
Outcomes:
Measures of Outcomes:
Outputs Expected:
Code:
Documentation:
Configure:
Test:
Domain Relevance:
Manage Project:
Manage Defects:
Estimate:
Manage Knowledge:
Release:
Design:
Interface with Customer:
Manage Team:
Certifications:
Skill Examples:
Knowledge Examples:
Additional Comments:
We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, plus experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role.
Must-Have Skills:
- 8+ years of hands-on experience in data engineering or big data development.
- Strong proficiency in PySpark and SQL for data transformation and pipeline development.
- Experience working in Azure Databricks or equivalent Spark-based cloud platforms.
- Practical knowledge of cloud data environments: Azure, AWS, or GCP.
- Solid understanding of data warehousing concepts, including Kimball methodology and star/snowflake schema design.
- Proven experience designing and maintaining ETL/ELT pipelines in production.
- Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory); see the orchestration sketch below.
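The orchestration requirement above can be illustrated with a minimal Apache Airflow sketch (assuming Airflow 2.4+). The DAG id, task names, and stub functions are hypothetical placeholders; the same extract-transform-load flow could equally be built as an Azure Data Factory pipeline.

# Minimal Airflow sketch of orchestrating a daily ELT pipeline of the kind described above.
# DAG id, task names, and the stub functions are hypothetical examples.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_raw():
    # e.g. land source files or API extracts into cloud storage
    ...

def transform_with_spark():
    # e.g. run a Databricks/PySpark job that builds star-schema fact and dimension tables
    ...

def load_to_warehouse():
    # e.g. merge curated tables into the warehouse (Snowflake, BigQuery, Delta Lake, ...)
    ...

with DAG(
    dag_id="daily_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_raw", python_callable=extract_raw)
    transform = PythonOperator(task_id="transform_with_spark", python_callable=transform_with_spark)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    # Run the three stages strictly in sequence each day
    extract >> transform >> load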
Azure Data Factory, Azure Databricks, PySpark, SQL
UST Global
Thiruvananthapuram
8.0 - 9.15 Lacs P.A.