5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Description Note: This is a hybrid role, combining remote and on-site work, requiring 3 days in the office, and relocation to Pune. Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You’ll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural & data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk. You will be primarily developing Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform. What You Get To Do Every Single Day Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models Serve as Data Model subject matter expert and data model spokesperson, demonstrated by the ability to address questions quickly and accurately Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL & dbt Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran & dbt Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains Build and promote engineering best practices in areas such as version control, CI/CD, code review, and pair programming Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery Work with data and analytics experts to strive for greater functionality in our data systems Basic Qualifications What you bring to the role: 5+ years of data engineering experience building, working with & maintaining data pipelines & ETL processes in big data environments 5+ years of experience in Data Modeling and Data Architecture in a production environment 5+ years of experience writing complex SQL queries 5+ years of experience with cloud columnar databases (Snowflake, Google BigQuery, Amazon Redshift) 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams. Intermediate experience with any of the following programming languages: Python, Go, Java, Scala (we primarily use Python) Experience integrating with third-party SaaS application APIs such as Salesforce, Zuora, etc. Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
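For readers unfamiliar with the Airflow-plus-dbt stack this posting centers on, the sketch below shows roughly how such an ELT refresh might be orchestrated. It is a minimal illustration only: the DAG id, schedule, and dbt project path are hypothetical placeholders, not Zendesk's actual pipeline.

```python
# Minimal illustrative sketch of an Airflow (2.x) DAG orchestrating dbt,
# as the stack above (Airflow + dbt on BigQuery/Snowflake) implies.
# The DAG id, schedule, and project path are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_refresh",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the curated warehouse models.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",   # hypothetical path
    )
    # Run the data-quality tests defined alongside the models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )
    dbt_run >> dbt_test
```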
Preferred Qualifications Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning Good knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.) Demonstrated experience in one or many business domains (Finance, Sales, Marketing) 3+ completed projects with dbt Expert knowledge of Python Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based. Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager. The Intelligent Heart Of Customer Experience Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.
Posted 22 hours ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Description Note: This is a hybrid role, combining remote and on-site work, requiring 3 days in the office, and relocation to Pune. Our Enterprise Data & Analytics (EDA) team is seeking an experienced Senior Data Engineer to join our growing data engineering team. You’ll work in a collaborative Agile environment using the latest engineering best practices and be involved in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural & data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk. You will be primarily developing Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform. What You Get To Do Every Single Day Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models Serve as Data Model subject matter expert and data model spokesperson, demonstrated by the ability to address questions quickly and accurately Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL & dbt Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran & dbt Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains Build and promote engineering best practices in areas such as version control, CI/CD, code review, and pair programming Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery Work with data and analytics experts to strive for greater functionality in our data systems Basic Qualifications What you bring to the role: 5+ years of data engineering experience building, working with & maintaining data pipelines & ETL processes in big data environments 5+ years of experience in Data Modeling and Data Architecture in a production environment 5+ years of experience writing complex SQL queries 5+ years of experience with cloud columnar databases (Snowflake, Google BigQuery, Amazon Redshift) 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams. Intermediate experience with any of the following programming languages: Python, Go, Java, Scala (we primarily use Python) Experience integrating with third-party SaaS application APIs such as Salesforce, Zuora, etc. Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Preferred Qualifications Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning Good knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.) Demonstrated experience in one or many business domains (Finance, Sales, Marketing) 3+ completed projects with dbt Expert knowledge of Python Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based. Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager. The Intelligent Heart Of Customer Experience Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.
Posted 22 hours ago
5.0 - 8.0 years
15 - 22 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experienced in Big Data (Spark with Scala, Kafka), strong in SQL, ETL (Talend preferred), data processing (batch/real-time), architecture patterns, and debugging. Knowledge of Hive, NoSQL, Java, Linux. Python/R is good to have. Required candidate profile: Spark with Scala, Kafka, SQL, Talend (or any ETL tool), Hive/Impala, NoSQL, Starburst, Java (optional), Spring (optional), Linux (user level). Python or R is good to have but not mandatory.
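As a rough illustration of the Spark-plus-Kafka skill set listed above (shown in PySpark rather than the Scala the role prefers, since the API shape is equivalent), a streaming ingest might look like the sketch below; the broker address, topic, and output paths are placeholders.

```python
# Rough PySpark sketch of Kafka -> Spark Structured Streaming ingest.
# The posting asks for Spark with Scala; the DataFrame API is analogous.
# Broker, topic, and paths are placeholder values.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_ingest_example").getOrCreate()

# Read a Kafka topic as a streaming DataFrame (requires the Spark-Kafka connector).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Land the raw events as Parquet for downstream batch ETL.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/raw/orders")
    .option("checkpointLocation", "/data/checkpoints/orders")
    .start()
)
query.awaitTermination()
```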
Posted 22 hours ago
2.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description The Risk division is responsible for credit, market and operational risk, model risk, independent liquidity risk, and insurance throughout the firm. The Goldman Sachs Group, Inc. is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments, and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world. We commit people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Our people are our greatest asset – we say it often and with good reason. It is only with the determination and dedication of our people that we can serve our clients, generate long-term value for our shareholders and contribute to the broader public. We take pride in supporting each colleague both professionally and personally. From collaborative workspaces and ergonomic services to wellbeing and resilience offerings, we offer our people the flexibility and support they need to reach their goals in and outside the office. RISK BUSINESS The Risk Business identifies, monitors, evaluates, and manages the firm’s financial and non-financial risks in support of the firm’s Risk Appetite Statement and the firm’s strategic plan. Operating in a fast-paced and dynamic environment and utilizing best-in-class risk tools and frameworks, Risk teams are analytically curious, have an aptitude to challenge, and an unwavering commitment to excellence. Overview To ensure uncompromising accuracy and timeliness in the delivery of risk metrics, our platform is continuously growing and evolving. Risk Engineering combines the principles of Computer Science, Mathematics and Finance to produce large-scale, computationally intensive calculations of the risk Goldman Sachs faces with each transaction we engage in. Market Risk Engineering has an opportunity for an Associate-level Software Engineer to work across a broad range of applications and an extremely diverse set of technologies to keep the suite operating at peak efficiency. As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build & adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high-profile projects using a raft of technologies that are fit for purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high-quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and responsive user interfaces. What We Look For Act as a senior developer on large projects across a global team of developers and risk managers Performance-tune applications to improve memory and CPU utilization. Perform statistical analyses to identify trends and exceptions related to Market Risk metrics.
Build internal and external reporting for the output of risk metric calculations using data extraction tools, such as SQL, and data visualization tools, such as Tableau. Utilize web development technologies to facilitate application development for front-end UIs used for risk management actions Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculating and distributing data to other systems. Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC) including design, code review and production migrations. Skills And Experience Bachelor’s degree in Computer Science, Mathematics, Electrical Engineering or a related technical discipline 2-3 years’ experience working in a risk technology team at another bank or financial institution. Experience in market risk technology is a plus. Experience with one or more major relational / object databases. Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts Comfortable multi-tasking, managing multiple stakeholders and working as part of a team Comfortable working with multiple languages Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Should be ready to work with GS proprietary technology like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge of and experience in distributed computing: parallel computation on a single machine with tools like Dask, and distributed processing on public cloud. Knowledge of the SDLC and experience working through the entire life cycle of a project from start to end
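The posting lists single-machine parallel computation with Dask among the distributed-computing skills; purely as an illustrative sketch (the file path and column names are hypothetical, not a Goldman Sachs dataset), such a computation might look like:

```python
# Small illustrative sketch of single-machine parallel computation with Dask.
# The input path and column names are hypothetical.
import dask.dataframe as dd

# Lazily read many Parquet partitions; work is split across local workers.
trades = dd.read_parquet("/data/risk/trades/*.parquet")

# Aggregate exposure per desk in parallel; compute() triggers execution.
exposure_by_desk = trades.groupby("desk")["exposure"].sum().compute()
print(exposure_by_desk.head())
```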
Posted 22 hours ago
4.0 years
8 - 24 Lacs
Gurugram, Haryana, India
On-site
About The Opportunity We are a fast-growing provider of cloud data engineering services within the Information Technology & Services sector. Our teams build high-throughput data platforms on Amazon Web Services for Fortune 500 and digitally native clients across retail, fintech, and healthcare. Leveraging open-source big-data frameworks and modern DevOps practices, we turn raw, real-time streams into actionable insights that drive mission-critical decisions. Role & Responsibilities Design, build, and optimize secure AWS data lakes and lakehouses for petabyte-scale analytics. Develop Spark (Scala) ETL jobs on EMR/Glue, ensuring efficient ingestion, transformation, and partitioning. Implement automated data quality checks, schema validation, and performance tuning for sub-second query latency. Orchestrate pipelines with AWS Step Functions, Airflow, and CI/CD to enable reliable, repeatable deployments. Collaborate with Data Scientists to productionize machine-learning features and batch/stream jobs. Document architecture, promote engineering best practices, and mentor junior engineers on cloud-native patterns. Skills & Qualifications Must-Have 4+ years designing data solutions on AWS (S3, Glue, EMR, Redshift, Lake Formation). Expertise in Spark with Scala, including performance diagnostics and cluster optimization. Proven experience building end-to-end ETL/ELT pipelines and implementing data governance. Strong SQL skills and exposure to columnar file formats (Parquet, ORC). Hands-on with Terraform/CloudFormation, Git, and automated testing frameworks. Bachelor’s degree in Computer Science, Engineering, or equivalent. Preferred Real-time streaming with Kafka/Kinesis and Lambda. Experience migrating on-prem Hadoop workloads to AWS. Knowledge of Delta Lake, Iceberg, or Hudi table formats. Exposure to ML Ops and feature stores on SageMaker. Benefits & Culture Highlights On-site innovation lab equipped with the latest AWS tooling for rapid prototyping. Dedicated learning budget covering AWS certifications and industry conferences. Collaborative, high-trust culture that rewards thought leadership and measurable impact. Location: On-site, India — join us to architect the next generation of cloud-native data platforms and accelerate your career as a Senior AWS Data Engineer. Skills: aws step functions,sql,hudi,apache airflow,terraform,kafka,aws,aws data engineer (spark scala),iceberg,git,delta lake,kinesis,lambda,spark,scala,python,etl,cloudformation
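As a hedged illustration of the EMR/Glue responsibilities described above (ingest, a simple automated data-quality check, and partitioned Parquet output), a PySpark job might be sketched as follows; the bucket names, columns, and quality threshold are hypothetical, not the employer's actual pipeline.

```python
# Illustrative PySpark sketch of an EMR/Glue-style ETL job:
# ingest raw data, apply a basic quality check, write partitioned Parquet.
# Bucket names, columns, and the null-rate threshold are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/")          # hypothetical source
orders = raw.withColumn("order_date", to_date(col("created_at")))

# Automated data-quality check: fail the run if too many keys are null.
total = orders.count()
null_keys = orders.filter(col("order_id").isNull()).count()
if total > 0 and null_keys / total > 0.01:
    raise ValueError(f"order_id null rate too high: {null_keys}/{total}")

# Partition by date so downstream queries prune efficiently.
(orders.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/"))              # hypothetical target
```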
Posted 1 day ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description: Job Summary: ESSENTIAL SKILLS AND QUALIFICATIONS: Bachelor’s degree in Computer Science, Data Science, or a related field (Master’s preferred). Certifications (Preferred): Microsoft Certified: Azure Data Engineer Associate Databricks Certified Data Engineer Professional Microsoft Certified: Power BI Data Analyst Associate 8+ years of experience in analytics, data integration, and reporting. 4+ years of hands-on experience with Databricks, including: Proficiency in Databricks Notebooks for development and testing. Advanced skills in Databricks SQL, Python, and/or Scala for data engineering. Expertise in cluster management, auto-scaling, and cost optimization. 4+ years of expertise with Power BI, including: Advanced DAX for building measures and calculated fields. Proficiency in Power Query for data transformation. Deep understanding of Power BI architecture, workspaces, and row-level security. Strong knowledge of SQL for querying, aggregations, and optimization. Experience with modern ETL/ELT tools such as Azure Data Factory, Informatica, or Talend. Proficiency in Azure cloud platforms and their application to analytics solutions. Strong analytical thinking with the ability to translate data into actionable insights. Excellent communication skills to effectively collaborate with technical and non-technical stakeholders. Ability to manage multiple priorities in a fast-paced environment with high customer expectations.
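For orientation, the Databricks side of this stack often looks like the sketch below: a notebook cell that curates a Delta table which a Power BI dataset can then report on. This is an assumption-laden illustration; the table and column names are hypothetical, and `spark` is the session a Databricks notebook provides.

```python
# Sketch of a Databricks-style notebook cell: transform a source table with
# PySpark and publish a curated Delta table for downstream Power BI reporting.
# Table and column names are hypothetical; `spark` is provided by Databricks.
from pyspark.sql import functions as F

sales = spark.table("raw.sales_transactions")        # hypothetical source table

monthly_revenue = (
    sales.withColumn("month", F.date_trunc("month", F.col("transaction_ts")))
    .groupBy("month", "region")
    .agg(F.sum("amount").alias("revenue"))
)

# Publish as a managed Delta table that a Power BI dataset can sit on top of.
(monthly_revenue.write.format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.monthly_revenue"))
```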
Posted 1 day ago
4.0 years
8 - 24 Lacs
Chennai, Tamil Nadu, India
On-site
About The Opportunity We are a fast-growing provider of cloud data engineering services within the Information Technology & Services sector. Our teams build high-throughput data platforms on Amazon Web Services for Fortune 500 and digitally native clients across retail, fintech, and healthcare. Leveraging open-source big-data frameworks and modern DevOps practices, we turn raw, real-time streams into actionable insights that drive mission-critical decisions. Role & Responsibilities Design, build, and optimize secure AWS data lakes and lakehouses for petabyte-scale analytics. Develop Spark (Scala) ETL jobs on EMR/Glue, ensuring efficient ingestion, transformation, and partitioning. Implement automated data quality checks, schema validation, and performance tuning for sub-second query latency. Orchestrate pipelines with AWS Step Functions, Airflow, and CI/CD to enable reliable, repeatable deployments. Collaborate with Data Scientists to productionize machine-learning features and batch/stream jobs. Document architecture, promote engineering best practices, and mentor junior engineers on cloud-native patterns. Skills & Qualifications Must-Have 4+ years designing data solutions on AWS (S3, Glue, EMR, Redshift, Lake Formation). Expertise in Spark with Scala, including performance diagnostics and cluster optimization. Proven experience building end-to-end ETL/ELT pipelines and implementing data governance. Strong SQL skills and exposure to columnar file formats (Parquet, ORC). Hands-on with Terraform/CloudFormation, Git, and automated testing frameworks. Bachelor’s degree in Computer Science, Engineering, or equivalent. Preferred Real-time streaming with Kafka/Kinesis and Lambda. Experience migrating on-prem Hadoop workloads to AWS. Knowledge of Delta Lake, Iceberg, or Hudi table formats. Exposure to ML Ops and feature stores on SageMaker. Benefits & Culture Highlights On-site innovation lab equipped with the latest AWS tooling for rapid prototyping. Dedicated learning budget covering AWS certifications and industry conferences. Collaborative, high-trust culture that rewards thought leadership and measurable impact. Location: On-site, India — join us to architect the next generation of cloud-native data platforms and accelerate your career as a Senior AWS Data Engineer. Skills: aws step functions,sql,hudi,apache airflow,terraform,kafka,aws,aws data engineer (spark scala),iceberg,git,delta lake,kinesis,lambda,spark,scala,python,etl,cloudformation
Posted 1 day ago
4.0 years
8 - 24 Lacs
Bengaluru, Karnataka, India
On-site
About The Opportunity We are a fast-growing provider of cloud data engineering services within the Information Technology & Services sector. Our teams build high-throughput data platforms on Amazon Web Services for Fortune 500 and digitally native clients across retail, fintech, and healthcare. Leveraging open-source big-data frameworks and modern DevOps practices, we turn raw, real-time streams into actionable insights that drive mission-critical decisions. Role & Responsibilities Design, build, and optimize secure AWS data lakes and lakehouses for petabyte-scale analytics. Develop Spark (Scala) ETL jobs on EMR/Glue, ensuring efficient ingestion, transformation, and partitioning. Implement automated data quality checks, schema validation, and performance tuning for sub-second query latency. Orchestrate pipelines with AWS Step Functions, Airflow, and CI/CD to enable reliable, repeatable deployments. Collaborate with Data Scientists to productionize machine-learning features and batch/stream jobs. Document architecture, promote engineering best practices, and mentor junior engineers on cloud-native patterns. Skills & Qualifications Must-Have 4+ years designing data solutions on AWS (S3, Glue, EMR, Redshift, Lake Formation). Expertise in Spark with Scala, including performance diagnostics and cluster optimization. Proven experience building end-to-end ETL/ELT pipelines and implementing data governance. Strong SQL skills and exposure to columnar file formats (Parquet, ORC). Hands-on with Terraform/CloudFormation, Git, and automated testing frameworks. Bachelor’s degree in Computer Science, Engineering, or equivalent. Preferred Real-time streaming with Kafka/Kinesis and Lambda. Experience migrating on-prem Hadoop workloads to AWS. Knowledge of Delta Lake, Iceberg, or Hudi table formats. Exposure to ML Ops and feature stores on SageMaker. Benefits & Culture Highlights On-site innovation lab equipped with the latest AWS tooling for rapid prototyping. Dedicated learning budget covering AWS certifications and industry conferences. Collaborative, high-trust culture that rewards thought leadership and measurable impact. Location: On-site, India — join us to architect the next generation of cloud-native data platforms and accelerate your career as a Senior AWS Data Engineer. Skills: aws step functions,sql,hudi,apache airflow,terraform,kafka,aws,aws data engineer (spark scala),iceberg,git,delta lake,kinesis,lambda,spark,scala,python,etl,cloudformation
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role Grade Level (for internal use): 10 The Team We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams. The Impact The Enterprise Data Organization is seeking a Software Developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal & external applications supporting the business portfolio. This role expects the candidate to handle data processing and big data application development. We have teams made up of people that learn how to work effectively together while working with the larger group of developers on our platform. What’s In It For You Opportunity to contribute to the development of a world-class Platform Engineering team. Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement. Be part of a fast-paced, agile environment that processes massive volumes of data—ideal for advancing your software development and data engineering expertise while working with a modern tech stack. Contribute to the development and support of Tier-1, business-critical applications that are central to operations. Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks. Grow your career within a globally distributed team, with clear opportunities for advancement and skill development. Responsibilities Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .Net Core, Databricks, Spark, Python, Scala, NiFi, SQL) Build data models, perform performance tuning, and apply data architecture concepts Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies. Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality Provide operations support to resolve issues proactively and with utmost urgency Effectively manage time and multiple tasks Communicate effectively, especially in writing, with the business and other technical groups Basic Qualifications Bachelor's/Master’s Degree in Computer Science, Information Systems or equivalent. Minimum 5 to 8 years of strong hands-on development experience in C#, .Net Core, Cloud Native, and MS SQL Server backend development. Proficiency with Object Oriented Programming. Nice to have: knowledge of Grafana, Kibana, Big Data, Kafka, GitHub, EMR, Terraform, AI/ML Advanced SQL programming skills Highly recommended skill set in Databricks, Spark, and Scala technologies.
Understanding of database performance tuning in large datasets Ability to manage multiple priorities efficiently and effectively within specific timeframes Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies Knowledge of Fundamentals or the financial industry is highly preferred. Experience in conducting application design and code reviews Proficiency with the following technologies: Object-oriented programming Programming Languages (C#, .Net Core) Cloud Computing Database systems (SQL, MS SQL) Nice to have: NoSQL (Databricks, Spark, Scala, Python), Scripting (Bash, Scala, Perl, PowerShell) Preferred Qualifications Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP). Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316914 Posted On: 2025-06-23 Location: Hyderabad, Telangana, India
Posted 1 day ago
4.0 years
8 - 24 Lacs
Hyderabad, Telangana, India
On-site
About The Opportunity We are a fast-growing provider of cloud data engineering services within the Information Technology & Services sector. Our teams build high-throughput data platforms on Amazon Web Services for Fortune 500 and digitally native clients across retail, fintech, and healthcare. Leveraging open-source big-data frameworks and modern DevOps practices, we turn raw, real-time streams into actionable insights that drive mission-critical decisions. Role & Responsibilities Design, build, and optimize secure AWS data lakes and lakehouses for petabyte-scale analytics. Develop Spark (Scala) ETL jobs on EMR/Glue, ensuring efficient ingestion, transformation, and partitioning. Implement automated data quality checks, schema validation, and performance tuning for sub-second query latency. Orchestrate pipelines with AWS Step Functions, Airflow, and CI/CD to enable reliable, repeatable deployments. Collaborate with Data Scientists to productionize machine-learning features and batch/stream jobs. Document architecture, promote engineering best practices, and mentor junior engineers on cloud-native patterns. Skills & Qualifications Must-Have 4+ years designing data solutions on AWS (S3, Glue, EMR, Redshift, Lake Formation). Expertise in Spark with Scala, including performance diagnostics and cluster optimization. Proven experience building end-to-end ETL/ELT pipelines and implementing data governance. Strong SQL skills and exposure to columnar file formats (Parquet, ORC). Hands-on with Terraform/CloudFormation, Git, and automated testing frameworks. Bachelor’s degree in Computer Science, Engineering, or equivalent. Preferred Real-time streaming with Kafka/Kinesis and Lambda. Experience migrating on-prem Hadoop workloads to AWS. Knowledge of Delta Lake, Iceberg, or Hudi table formats. Exposure to ML Ops and feature stores on SageMaker. Benefits & Culture Highlights On-site innovation lab equipped with the latest AWS tooling for rapid prototyping. Dedicated learning budget covering AWS certifications and industry conferences. Collaborative, high-trust culture that rewards thought leadership and measurable impact. Location: On-site, India — join us to architect the next generation of cloud-native data platforms and accelerate your career as a Senior AWS Data Engineer. Skills: aws step functions,sql,hudi,apache airflow,terraform,kafka,aws,aws data engineer (spark scala),iceberg,git,delta lake,kinesis,lambda,spark,scala,python,etl,cloudformation
Posted 1 day ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Engineer Location: Hyderabad Experience: 7-9 Years. 7-9 years of experience with data analytics, data modeling, and database design. 3+ years of coding and scripting (Python, Java, Scala) and design experience. 3+ years of experience with the Spark framework. 5+ years of experience with ELT methodologies and tools. 5+ years of mastery in designing, developing, tuning and troubleshooting SQL. Knowledge of Informatica PowerCenter and Informatica IDMC. Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake. Strong data analysis skills for extracting insights from financial data Proficiency in reporting tools (e.g., Power BI, Tableau). The Ideal Qualifications Technical Skills: Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference data management, and Pricing. Familiarity with regulatory requirements and compliance standards in the investment management industry. Experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart. Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Posted 1 day ago
4.0 years
8 - 24 Lacs
Pune, Maharashtra, India
On-site
About The Opportunity We are a fast-growing provider of cloud data engineering services within the Information Technology & Services sector. Our teams build high-throughput data platforms on Amazon Web Services for Fortune 500 and digitally native clients across retail, fintech, and healthcare. Leveraging open-source big-data frameworks and modern DevOps practices, we turn raw, real-time streams into actionable insights that drive mission-critical decisions. Role & Responsibilities Design, build, and optimize secure AWS data lakes and lakehouses for petabyte-scale analytics. Develop Spark (Scala) ETL jobs on EMR/Glue, ensuring efficient ingestion, transformation, and partitioning. Implement automated data quality checks, schema validation, and performance tuning for sub-second query latency. Orchestrate pipelines with AWS Step Functions, Airflow, and CI/CD to enable reliable, repeatable deployments. Collaborate with Data Scientists to productionize machine-learning features and batch/stream jobs. Document architecture, promote engineering best practices, and mentor junior engineers on cloud-native patterns. Skills & Qualifications Must-Have 4+ years designing data solutions on AWS (S3, Glue, EMR, Redshift, Lake Formation). Expertise in Spark with Scala, including performance diagnostics and cluster optimization. Proven experience building end-to-end ETL/ELT pipelines and implementing data governance. Strong SQL skills and exposure to columnar file formats (Parquet, ORC). Hands-on with Terraform/CloudFormation, Git, and automated testing frameworks. Bachelor’s degree in Computer Science, Engineering, or equivalent. Preferred Real-time streaming with Kafka/Kinesis and Lambda. Experience migrating on-prem Hadoop workloads to AWS. Knowledge of Delta Lake, Iceberg, or Hudi table formats. Exposure to ML Ops and feature stores on SageMaker. Benefits & Culture Highlights On-site innovation lab equipped with the latest AWS tooling for rapid prototyping. Dedicated learning budget covering AWS certifications and industry conferences. Collaborative, high-trust culture that rewards thought leadership and measurable impact. Location: On-site, India — join us to architect the next generation of cloud-native data platforms and accelerate your career as a Senior AWS Data Engineer. Skills: aws step functions,sql,hudi,apache airflow,terraform,kafka,aws,aws data engineer (spark scala),iceberg,git,delta lake,kinesis,lambda,spark,scala,python,etl,cloudformation
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role Grade Level (for internal use): 10 The Team We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams. The Impact The Enterprise Data Organization is seeking a Software Developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal & external applications supporting the business portfolio. This role expects the candidate to handle data processing and big data application development. We have teams made up of people that learn how to work effectively together while working with the larger group of developers on our platform. What’s In It For You Opportunity to contribute to the development of a world-class Platform Engineering team. Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement. Be part of a fast-paced, agile environment that processes massive volumes of data—ideal for advancing your software development and data engineering expertise while working with a modern tech stack. Contribute to the development and support of Tier-1, business-critical applications that are central to operations. Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks. Grow your career within a globally distributed team, with clear opportunities for advancement and skill development. Responsibilities Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .Net Core, Databricks, Spark, Python, Scala, NiFi, SQL) Build data models, perform performance tuning, and apply data architecture concepts Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies. Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality Provide operations support to resolve issues proactively and with utmost urgency Effectively manage time and multiple tasks Communicate effectively, especially in writing, with the business and other technical groups Basic Qualifications Bachelor's/Master’s Degree in Computer Science, Information Systems or equivalent. Minimum 5 to 8 years of strong hands-on development experience in C#, .Net Core, Cloud Native, and MS SQL Server backend development. Proficiency with Object Oriented Programming. Nice to have: knowledge of Grafana, Kibana, Big Data, Kafka, GitHub, EMR, Terraform, AI/ML Advanced SQL programming skills Highly recommended skill set in Databricks, Spark, and Scala technologies.
Understanding of database performance tuning in large datasets Ability to manage multiple priorities efficiently and effectively within specific timeframes Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies Knowledge of Fundamentals or the financial industry is highly preferred. Experience in conducting application design and code reviews Proficiency with the following technologies: Object-oriented programming Programming Languages (C#, .Net Core) Cloud Computing Database systems (SQL, MS SQL) Nice to have: NoSQL (Databricks, Spark, Scala, Python), Scripting (Bash, Scala, Perl, PowerShell) Preferred Qualifications Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP). Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316914 Posted On: 2025-06-23 Location: Hyderabad, Telangana, India
Posted 1 day ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Data Architect - Azure with MS Fabric Location: Pune/Bangalore/Hyderabad Experience: 12+ Years Role Overview: As a Data Architect specializing in Azure with MS Fabric, you will play a pivotal role in designing and implementing robust data solutions that leverage Microsoft Fabric for cloud-based data management and analytics. Your expertise will guide clients through the complexities of data architecture, ensuring seamless integration with existing systems and optimizing data workflows. You will be responsible for leading projects from inception to completion, providing strategic insights and technical leadership throughout the process. Required Skills and Qualifications: • Experience: 12+ years in Data and Analytics, with a minimum of 7-8 years focused on Azure and at least 2 implementations using Microsoft Fabric. • Data Architecture Expertise: Proven experience as a Data Architect, particularly in consulting and solution design, with a strong background in cloud data stacks. • Technical Proficiency: Extensive knowledge of data modeling, database design, ETL processes, and data governance principles. • MS Fabric: Hands-on experience with Microsoft Fabric, including data integration, data pipelines, and analytics capabilities. • SQL Skills: Advanced SQL knowledge with experience in writing complex queries, performance tuning, and troubleshooting. • Programming Skills: Proficiency in programming languages such as Java, Python, or Scala for building data pipelines. • Methodologies: Familiarity with Agile, Scrum, and other project delivery methodologies. • Stakeholder Management: Strong experience in managing both internal and external stakeholders effectively. • Certifications: Relevant certifications in Azure and Microsoft Fabric will be an advantage. Key Responsibilities: Leadership & Strategy • Lead the design and implementation of end-to-end solutions using Microsoft Fabric. • Collaborate with business and technical stakeholders to define data strategies. • Act as the primary point of contact for all Fabric-related projects and initiatives. • Provide mentorship and guidance to junior data engineers, BI developers, and analysts. Architecture & Development • Design and manage Lakehouses, Data Warehouses, and Pipelines within Microsoft Fabric. • Build scalable data models and visualizations using Power BI (with Fabric Integration). • Develop and maintain Dataflows, Notebooks, Spark Jobs, and Synapse Pipelines. • Implement best practices in data governance, security, and compliance using Fabric’s tools. Project Execution • Lead cross-functional teams for successful project delivery. • Ensure alignment of architecture with business KPIs and OKRs. • Drive adoption of Fabric across business units. • Perform code reviews and architectural assessments. Monitoring & Optimization • Monitor data pipeline performance, troubleshoot issues, and tune performance. • Ensure data quality, availability, and lineage using Microsoft Purview (or native Fabric tooling). • Maintain documentation of data models, architecture, and workflows.
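As a loose illustration of the Fabric Lakehouse work this role describes, a notebook cell might resemble the sketch below. It assumes a Fabric notebook attached to a Lakehouse (where `spark` is pre-provided); the file path and table names are hypothetical.

```python
# Minimal sketch of Fabric notebook / Lakehouse work, assuming the notebook is
# attached to a Lakehouse where `spark` is provided by the runtime.
# File path and table names are hypothetical.
from pyspark.sql import functions as F

# Read raw files landed in the Lakehouse, clean them, and save a Delta table
# that downstream Power BI (Direct Lake) reports can use.
raw = spark.read.option("header", True).csv("Files/raw/customers.csv")

customers = (
    raw.dropDuplicates(["customer_id"])
       .withColumn("loaded_at", F.current_timestamp())
)

customers.write.format("delta").mode("overwrite").saveAsTable("customers_curated")
```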
Posted 1 day ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary: We're seeking an experienced Lead/Technical Specialist Data Engineer to design, develop, and implement scalable data solutions using Azure and Microsoft Fabric. You'll lead data engineering initiatives, collaborate with cross-functional teams, and drive data-driven decision-making. Key Responsibilities: Design and implement data pipelines, data warehouses, and data lakes using Azure and MS Fabric. Lead data engineering teams, mentor junior engineers, and ensure best practices. Collaborate with stakeholders to understand data requirements and develop solutions. Develop and maintain large-scale data systems, ensuring performance, security, and reliability. Troubleshoot data issues, optimize data workflows, and implement data governance. Required Skills: 7+ years of experience in data engineering, with expertise in Azure (Data Factory, Databricks, Synapse Analytics) and MS Fabric. Strong programming skills in languages like Python, Scala, or SQL. Experience with data warehousing, ETL, and data governance. Strong understanding of data architecture, data modeling, and data security. Excellent problem-solving skills, with ability to troubleshoot complex data issues. Strong communication and leadership skills, with experience in mentoring teams
Posted 1 day ago
0.0 - 8.0 years
0 Lacs
Hyderabad, Telangana
On-site
About the Role:
Grade Level (for internal use): 10
The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.
The Impact: The Enterprise Data Organization is seeking a Software Developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal and external applications supporting the business portfolio. The role covers data processing and big data application development. Our teams are made up of people who learn to work effectively together while collaborating with the larger group of developers on our platform.
What’s in it for you:
Opportunity to contribute to the development of a world-class Platform Engineering team.
Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks.
Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.
Responsibilities:
Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation.
Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions.
Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .Net Core, Databricks, Spark, Python, Scala, NiFi, SQL).
Build data models, carry out performance tuning, and apply data architecture concepts.
Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies.
Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality.
Provide operations support to resolve issues proactively and with utmost urgency.
Effectively manage time and multiple tasks.
Communicate effectively, especially in writing, with the business and other technical groups.
Basic Qualifications:
Bachelor's/Master’s Degree in Computer Science, Information Systems, or equivalent.
Minimum 5 to 8 years of strong hands-on development experience in C#, .Net Core, cloud-native, and MS SQL Server backend development.
Proficiency with object-oriented programming.
Nice to have: knowledge of Grafana, Kibana, big data, Kafka, GitHub, EMR, Terraform, and AI/ML.
Advanced SQL programming skills.
Highly recommended skill set in Databricks, Spark, and Scala technologies.
Understanding of database performance tuning in large datasets.
Ability to manage multiple priorities efficiently and effectively within specific timeframes.
Excellent logical, analytical, and communication skills are essential, with strong verbal and writing proficiencies.
Knowledge of financial fundamentals or the financial industry is highly preferred.
Experience in conducting application design and code reviews.
Proficiency with the following technologies:
Object-oriented programming
Programming languages (C#, .Net Core)
Cloud computing
Database systems (SQL, MS SQL)
Nice to have: NoSQL (Databricks, Spark, Scala, Python), scripting (Bash, Scala, Perl, PowerShell)
Preferred Qualifications:
Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP).
Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 316914
Posted On: 2025-06-23
Location: Hyderabad, Telangana, India
Posted 1 day ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Data Architect - Azure with MS Fabric
Location: Pune/Bangalore/Hyderabad
Experience: 12+ Years
Role Overview: As a Data Architect specializing in Azure with MS Fabric, you will play a pivotal role in designing and implementing robust data solutions that leverage Microsoft Fabric for cloud-based data management and analytics. Your expertise will guide clients through the complexities of data architecture, ensuring seamless integration with existing systems and optimizing data workflows. You will be responsible for leading projects from inception to completion, providing strategic insights and technical leadership throughout the process.
Required Skills and Qualifications:
• Experience: 12+ years in Data and Analytics, with a minimum of 7-8 years focused on Azure and at least 2 implementations using Microsoft Fabric.
• Data Architecture Expertise: Proven experience as a Data Architect, particularly in consulting and solution design, with a strong background in cloud data stacks.
• Technical Proficiency: Extensive knowledge of data modeling, database design, ETL processes, and data governance principles.
• MS Fabric: Hands-on experience with Microsoft Fabric, including data integration, data pipelines, and analytics capabilities.
• SQL Skills: Advanced SQL knowledge with experience in writing complex queries, performance tuning, and troubleshooting.
• Programming Skills: Proficiency in programming languages such as Java, Python, or Scala for building data pipelines.
• Methodologies: Familiarity with Agile, Scrum, and other project delivery methodologies.
• Stakeholder Management: Strong experience in managing both internal and external stakeholders effectively.
• Certifications: Relevant certifications in Azure and Microsoft Fabric will be an advantage.
Key Responsibilities:
Leadership & Strategy
• Lead the design and implementation of end-to-end solutions using Microsoft Fabric.
• Collaborate with business and technical stakeholders to define data strategies.
• Act as the primary point of contact for all Fabric-related projects and initiatives.
• Provide mentorship and guidance to junior data engineers, BI developers, and analysts.
Architecture & Development
• Design and manage Lakehouses, Data Warehouses, and Pipelines within Microsoft Fabric.
• Build scalable data models and visualizations using Power BI (with Fabric integration).
• Develop and maintain Dataflows, Notebooks, Spark Jobs, and Synapse Pipelines.
• Implement best practices in data governance, security, and compliance using Fabric’s tools.
Project Execution
• Lead cross-functional teams for successful project delivery.
• Ensure alignment of architecture with business KPIs and OKRs.
• Drive adoption of Fabric across business units.
• Perform code reviews and architectural assessments.
Monitoring & Optimization
• Monitor data pipeline performance, troubleshoot issues, and tune performance.
• Ensure data quality, availability, and lineage using Microsoft Purview (or native Fabric tooling).
• Maintain documentation of data models, architecture, and workflows.
Posted 1 day ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Senior Associate
Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security, and cloud security.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities:
L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
L2 - Minimum 4 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
· Round-the-clock threat monitoring & detection
· Analysis of any suspicious, malicious, and abnormal behavior
· Alert triage, initial assessment, incident validation, and assessment of severity & urgency
· Prioritization of security alerts and creation of incidents as per SOPs
· Reporting & escalation to stakeholders
· Post-incident analysis
· Consistent incident triage & recommendations using playbooks
· Develop & maintain incident management and incident response policies and procedures
· Preservation of security alert and security incident artefacts for forensic purposes
· Adherence to Service Level Agreements (SLAs) and KPIs
· Reduction in Mean Time to Detect and Respond (MTTD & MTTR)
Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent.
Product Certifications (Preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA etc.
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background / Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SOC Operations
Optional Skills: Accepting Feedback, Active Listening, Agile Methodology, Analytical Thinking, Azure Data Factory, Communication, Creativity, Cybersecurity, Cybersecurity Framework, Cybersecurity Policy, Cybersecurity Requirements, Cybersecurity Strategy, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Inclusion, Intellectual Curiosity, Learning Agility, Managed Services, Optimism, Privacy Compliance, Regulatory Response, Security Architecture {+ 8 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 day ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Senior Associate
Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security, and cloud security.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities:
L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
L2 - Minimum 4 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
· Round-the-clock threat monitoring & detection
· Analysis of any suspicious, malicious, and abnormal behavior
· Alert triage, initial assessment, incident validation, and assessment of severity & urgency
· Prioritization of security alerts and creation of incidents as per SOPs
· Reporting & escalation to stakeholders
· Post-incident analysis
· Consistent incident triage & recommendations using playbooks
· Develop & maintain incident management and incident response policies and procedures
· Preservation of security alert and security incident artefacts for forensic purposes
· Adherence to Service Level Agreements (SLAs) and KPIs
· Reduction in Mean Time to Detect and Respond (MTTD & MTTR)
Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent.
Product Certifications (Preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA etc.
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background / Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SOC Operations
Optional Skills: SoCs
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 day ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Senior Associate
Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security, and cloud security.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities:
L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
L2 - Minimum 4 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
· Round-the-clock threat monitoring & detection
· Analysis of any suspicious, malicious, and abnormal behavior
· Alert triage, initial assessment, incident validation, and assessment of severity & urgency
· Prioritization of security alerts and creation of incidents as per SOPs
· Reporting & escalation to stakeholders
· Post-incident analysis
· Consistent incident triage & recommendations using playbooks
· Develop & maintain incident management and incident response policies and procedures
· Preservation of security alert and security incident artefacts for forensic purposes
· Adherence to Service Level Agreements (SLAs) and KPIs
· Reduction in Mean Time to Detect and Respond (MTTD & MTTR)
Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent.
Product Certifications (Preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA etc.
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background / Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SoCs
Optional Skills: Accepting Feedback, Active Listening, Agile Methodology, Analytical Thinking, Azure Data Factory, Communication, Creativity, Cybersecurity, Cybersecurity Framework, Cybersecurity Policy, Cybersecurity Requirements, Cybersecurity Strategy, Embracing Change, Emotional Regulation, Empathy, Encryption Technologies, Inclusion, Intellectual Curiosity, Learning Agility, Managed Services, Optimism, Privacy Compliance, Regulatory Response, Security Architecture {+ 8 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 day ago
6.0 years
10 - 20 Lacs
Hyderabad, Telangana, India
On-site
Senior Data Engineer (On-Site, India)
About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.
Role & Responsibilities
Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
Enforce data-governance best practices (lineage, cataloguing, RBAC, and encryption) in compliance with GDPR and SOC 2 standards.
Skills & Qualifications
Must-Have
3–6 years hands-on engineering large-scale data pipelines in production.
Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
Solid command of CI/CD, Git workflows, and containerisation (Docker).
Preferred
Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
Certification on AWS, GCP, or Azure data services.
Benefits & Culture Highlights
On-site, engineer-first environment with dedicated lab space and latest Mac/Linux gear.
Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.
Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.
Skills: airflow, docker, snowflake, apache spark, data engineering, redshift, sql, pyspark, python, data modeling, bigquery, ci/cd, apache airflow, data warehousing, spark, etl, kafka, git
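The role above centres on streaming ETL across Spark and Kafka. As a minimal sketch, and not the employer's actual pipeline, the snippet below reads JSON events from Kafka with Spark Structured Streaming's Scala API, parses and filters them, and appends Parquet files for downstream modelling. The broker address, topic name, schema, and paths are all assumptions, and running it also presumes the spark-sql-kafka connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object OrdersStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("orders-stream-sketch").getOrCreate()

    // Expected shape of each Kafka message value (JSON); purely illustrative.
    val orderSchema = new StructType()
      .add("order_id", StringType)
      .add("customer_id", StringType)
      .add("amount", DoubleType)
      .add("event_ts", TimestampType)

    val orders = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
      .option("subscribe", "orders")                    // hypothetical topic
      .load()
      .selectExpr("CAST(value AS STRING) AS json")
      .select(from_json(col("json"), orderSchema).as("o"))
      .select("o.*")
      .filter(col("amount") > 0)                        // basic cleansing rule

    // Append cleansed events to a lake path; the checkpoint lets the query
    // resume after restarts and keeps file output consistent.
    orders.writeStream
      .format("parquet")
      .option("path", "/lake/curated/orders")           // hypothetical path
      .option("checkpointLocation", "/lake/_checkpoints/orders")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```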
Posted 1 day ago
2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Risk
Management Level: Senior Associate
Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJs, Ruby, Perl, Python, Shell). Knowledge of OS security (Windows, Unix/Linux systems, Mac OS, VMware), network security, and cloud security.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.
Responsibilities:
L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
L2 - Minimum 4 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
· Round-the-clock threat monitoring & detection
· Analysis of any suspicious, malicious, and abnormal behavior
· Alert triage, initial assessment, incident validation, and assessment of severity & urgency
· Prioritization of security alerts and creation of incidents as per SOPs
· Reporting & escalation to stakeholders
· Post-incident analysis
· Consistent incident triage & recommendations using playbooks
· Develop & maintain incident management and incident response policies and procedures
· Preservation of security alert and security incident artefacts for forensic purposes
· Adherence to Service Level Agreements (SLAs) and KPIs
· Reduction in Mean Time to Detect and Respond (MTTD & MTTR)
Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent.
Product Certifications (Preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA etc.
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background / Bachelor’s degree in Information Technology, Cybersecurity, or Computer Science
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: SOC Operations
Optional Skills: SoCs
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 day ago
6.0 years
10 - 20 Lacs
Kochi, Kerala, India
On-site
Senior Data Engineer (On-Site, India)
About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.
Role & Responsibilities
Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
Enforce data-governance best practices (lineage, cataloguing, RBAC, and encryption) in compliance with GDPR and SOC 2 standards.
Skills & Qualifications
Must-Have
3–6 years hands-on engineering large-scale data pipelines in production.
Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
Solid command of CI/CD, Git workflows, and containerisation (Docker).
Preferred
Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
Certification on AWS, GCP, or Azure data services.
Benefits & Culture Highlights
On-site, engineer-first environment with dedicated lab space and latest Mac/Linux gear.
Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.
Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.
Skills: airflow, docker, snowflake, apache spark, data engineering, redshift, sql, pyspark, python, data modeling, bigquery, ci/cd, apache airflow, data warehousing, spark, etl, kafka, git
Posted 1 day ago
6.0 years
10 - 20 Lacs
Pune, Maharashtra, India
On-site
Senior Data Engineer (On-Site, India)
About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.
Role & Responsibilities
Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
Enforce data-governance best practices (lineage, cataloguing, RBAC, and encryption) in compliance with GDPR and SOC 2 standards.
Skills & Qualifications
Must-Have
3–6 years hands-on engineering large-scale data pipelines in production.
Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
Solid command of CI/CD, Git workflows, and containerisation (Docker).
Preferred
Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
Certification on AWS, GCP, or Azure data services.
Benefits & Culture Highlights
On-site, engineer-first environment with dedicated lab space and latest Mac/Linux gear.
Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.
Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.
Skills: airflow, docker, snowflake, apache spark, data engineering, redshift, sql, pyspark, python, data modeling, bigquery, ci/cd, apache airflow, data warehousing, spark, etl, kafka, git
Posted 1 day ago
6.0 years
10 - 20 Lacs
Mumbai Metropolitan Region
On-site
Senior Data Engineer (On-Site, India)
About The Opportunity
A high-growth innovator in the Analytics & Enterprise Data Management sector, we architect and deliver cloud-native data platforms that power real-time reporting, AI/ML workloads, and intelligent decisioning for global retail, fintech, and manufacturing leaders. Our engineering teams transform raw, high-velocity data into trusted, analytics-ready assets that drive revenue acceleration and operational excellence.
Role & Responsibilities
Design, build, and optimise batch and streaming ETL/ELT pipelines on Apache Spark and Kafka, ensuring sub-minute latency and 99.9% uptime.
Develop modular, test-driven Python code to ingest, cleanse, and enrich terabyte-scale datasets from relational, NoSQL, and API sources.
Model data for analytics and AI, implementing star/snowflake schemas, partitioning, and clustering in BigQuery, Redshift, or Snowflake.
Automate workflow orchestration with Apache Airflow, defining DAGs, dependency management, and robust alerting for SLA adherence.
Collaborate with Data Scientists and BI teams to expose feature stores, curated marts, and self-service semantic layers.
Enforce data-governance best practices (lineage, cataloguing, RBAC, and encryption) in compliance with GDPR and SOC 2 standards.
Skills & Qualifications
Must-Have
3–6 years hands-on engineering large-scale data pipelines in production.
Expertise in Python and advanced SQL for ETL, optimisation, and performance tuning.
Proven experience with Spark (PySpark or Scala) and streaming technologies such as Kafka or Kinesis.
Deep knowledge of relational modelling, data-warehousing concepts, and at least one cloud DWH (BigQuery, Redshift, or Snowflake).
Solid command of CI/CD, Git workflows, and containerisation (Docker).
Preferred
Exposure to infrastructure-as-code (Terraform, CloudFormation) and Kubernetes.
Experience integrating ML feature stores and monitoring data quality with Great Expectations or similar tools.
Certification on AWS, GCP, or Azure data services.
Benefits & Culture Highlights
On-site, engineer-first environment with dedicated lab space and latest Mac/Linux gear.
Rapid career progression through technical mentorship, sponsored certifications, and conference budgets.
Inclusive, innovation-driven culture that rewards outcome ownership and creative problem-solving.
Ready to architect next-gen data pipelines that power AI at scale? Apply now and join a mission-focused team turning data into competitive advantage.
Skills: airflow, docker, snowflake, apache spark, data engineering, redshift, sql, pyspark, python, data modeling, bigquery, ci/cd, apache airflow, data warehousing, spark, etl, kafka, git
Posted 1 day ago
Scala is a popular programming language that is widely used in India, especially in the tech industry. Job seekers looking for opportunities in Scala can find a variety of roles across different cities in the country. In this article, we will dive into the Scala job market in India and provide valuable insights for job seekers.
India's major tech hubs are known for their thriving tech ecosystems and have a high demand for Scala professionals.
The salary range for Scala professionals in India varies based on experience levels. Entry-level Scala developers can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with 5+ years of experience can earn upwards of INR 15 lakhs per annum.
In the Scala job market, a typical career path may look like:
- Junior Developer
- Scala Developer
- Senior Developer
- Tech Lead
As professionals gain more experience and expertise in Scala, they can progress to higher roles with increased responsibilities.
In addition to Scala expertise, employers often look for candidates with the following skills:
- Java
- Spark
- Akka
- Play Framework
- Functional programming concepts
Having a good understanding of these related skills can enhance a candidate's profile and increase their chances of landing a Scala job.
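Since functional programming concepts come up again and again in Scala interviews, here is a small, self-contained sketch of the kind of ideas worth being fluent in: immutable case classes, a sealed-trait ADT, higher-order functions, pattern matching, and Option. All names are illustrative and nothing beyond the standard library is used.

```scala
// Illustrative only: a few core functional-programming ideas in plain Scala.
object FpBasics extends App {
  // Immutable data modelled with a sealed trait (ADT) and a case class.
  sealed trait Level
  case object Junior extends Level
  case object Senior extends Level
  final case class Engineer(name: String, level: Level, skills: List[String])

  val team = List(
    Engineer("Asha", Senior, List("scala", "spark", "akka")),
    Engineer("Ravi", Junior, List("scala", "play"))
  )

  // Higher-order functions: transform and filter without mutation.
  val sparkUsers: List[String] =
    team.filter(_.skills.contains("spark")).map(_.name)

  // Pattern matching over the ADT.
  def title(e: Engineer): String = e.level match {
    case Junior => s"${e.name}, junior developer"
    case Senior => s"${e.name}, senior developer"
  }

  // Option instead of null, composed with map/getOrElse.
  val firstPlayUser: Option[String] =
    team.find(_.skills.contains("play")).map(_.name)

  println(sparkUsers)                       // List(Asha)
  println(team.map(title).mkString("; "))   // Asha, senior developer; Ravi, junior developer
  println(firstPlayUser.getOrElse("none"))  // Ravi
}
```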
Here are 25 interview questions that you may encounter when applying for Scala roles:
As you explore Scala jobs in India, remember to showcase your expertise in Scala and related skills during interviews. Prepare well, stay confident, and you'll be on your way to a successful career in Scala. Good luck!
Accenture
17069 Jobs | Dublin
Wipro
9221 Jobs | Bengaluru
EY
7581 Jobs | London
Amazon
5941 Jobs | Seattle,WA
Uplers
5895 Jobs | Ahmedabad
Accenture in India
5813 Jobs | Dublin 2
Oracle
5703 Jobs | Redwood City
IBM
5669 Jobs | Armonk
Capgemini
3478 Jobs | Paris,France
Tata Consultancy Services
3259 Jobs | Thane