5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Databricks Engineer
Location: Pune (WFO)
Experience: 5 to 8 years
Notice: Immediate
Skillset:
Databricks platform expertise
CI/CD and data pipelines
Good SQL, Python, and PySpark knowledge
Unit test case implementation experience (see the sketch below)
MongoDB, aggregation pipelines
A minimum of 5 years of experience.
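For context on the "unit test case implementation" item above, here is a minimal sketch of a PySpark unit test written with pytest. The transformation under test (add_revenue_column) and its column names are hypothetical examples, not taken from the posting.

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def add_revenue_column(df):
    # Hypothetical transformation under test: revenue = units * unit_price.
    return df.withColumn("revenue", F.col("units") * F.col("unit_price"))

@pytest.fixture(scope="module")
def spark():
    # A local single-threaded session keeps the test self-contained.
    session = SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    yield session
    session.stop()

def test_add_revenue_column(spark):
    df = spark.createDataFrame([(2, 10.0), (3, 5.0)], ["units", "unit_price"])
    result = add_revenue_column(df).collect()
    assert [row["revenue"] for row in result] == [20.0, 15.0]
```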
Posted 1 week ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.
You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.
How You Will Contribute
You will:
Operationalize and automate activities for efficiency and timely production of data visuals
Assist in providing accessibility, retrievability, security and protection of data in an ethical manner
Search for ways to get new data sources and assess their accuracy
Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases
Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition and interpretation
Validate information from multiple sources
Assess issues that might prevent the organization from making maximum use of its information assets
What You Will Bring
A desire to drive your future and accelerate your career, plus the following experience and knowledge:
Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing and maintaining new systems
Experience with a wide variety of languages and tools (e.g. scripting languages) to retrieve, merge and combine data
Ability to simplify complex problems and communicate to a broad audience
In This Role
As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.
Role & Responsibilities:
Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.
Technical Requirements:
Programming: Python, PySpark, Go/Java
Database: SQL, PL/SQL
ETL & Integration: DBT, Databricks with DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
Data Warehousing: SCD (see the sketch below), schema types, data marts
Visualization: Databricks Notebook, Power BI (optional), Tableau (optional), Looker
GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
Azure Cloud Services: Azure Data Lake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics
Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow
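The Data Warehousing line above lists SCD (slowly changing dimensions). As a minimal sketch of what an SCD Type 2 update can look like in PySpark: the dimension table, columns, and one-current-row-per-customer assumption are all hypothetical, not this employer's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[1]").getOrCreate()

# Current dimension rows and an incoming change for customer 1.
dim = spark.createDataFrame(
    [(1, "Pune", "2024-01-01", None, True)],
    ["customer_id", "city", "valid_from", "valid_to", "is_current"])
updates = spark.createDataFrame(
    [(1, "Mumbai", "2025-01-01")],
    ["customer_id", "city", "valid_from"])

# Step 1: expire the current version of any row whose attributes changed.
expired = (dim.join(updates.select("customer_id",
                                   F.col("city").alias("new_city"),
                                   F.col("valid_from").alias("change_date")),
                    "customer_id")
           .where(F.col("is_current") & (F.col("city") != F.col("new_city")))
           .withColumn("valid_to", F.col("change_date"))
           .withColumn("is_current", F.lit(False))
           .select(*dim.columns))

# Step 2: insert the new version as the current row.
new_rows = (updates
            .withColumn("valid_to", F.lit(None).cast("string"))
            .withColumn("is_current", F.lit(True))
            .select(*dim.columns))

# Step 3: carry over unaffected rows, then union everything together.
unchanged = dim.join(expired.select("customer_id", "valid_from"),
                     ["customer_id", "valid_from"], "left_anti")
result = unchanged.unionByName(expired).unionByName(new_rows)
result.show()
```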
Soft Skills:
Problem-Solving: The ability to identify and solve complex data-related challenges.
Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders.
Analytical Thinking: The capacity to analyze data and draw meaningful insights.
Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.
Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast.
Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Job Type: Regular
Data Science Analytics & Data Science
Posted 1 week ago
2.0 years
5 - 7 Lacs
Gurgaon
On-site
About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.
About the role:
Gartner is seeking an Advanced Data Engineer specializing in data modeling and reporting with Azure Analysis Services and Power BI. As a key member of the team, you will contribute to the development and support of Gartner’s Enterprise Data Warehouse and a variety of data products. This role involves integrating data from both internal and external sources using diverse ingestion APIs. You will have the opportunity to work with a broad range of data technologies, focusing on building and optimizing data pipelines, as well as supporting, maintaining, and enhancing existing business intelligence solutions.
What you will do:
Develop, manage, and optimize enterprise data models within Azure Analysis Services, including configuration, scaling, and security management
Design and build tabular data models in Azure Analysis Services for seamless integration with Power BI
Write efficient SQL queries and DAX (Data Analysis Expressions) to support robust data models, reports, and dashboards
Tune and optimize data models and queries for maximum performance and efficient data retrieval
Design, build, and automate data pipelines and applications to support data scientists and business users with their reporting and analytics needs
Collaborate with a team of Data Engineers to support and enhance the Azure Synapse Enterprise Data Warehouse environment
What you will need:
2–4 years of hands-on experience developing enterprise data models in Azure Analysis Services
Strong expertise in designing and developing tabular models using Power BI and SQL Server Data Tools (SSDT)
Advanced proficiency in DAX for data analysis and SQL for data manipulation and querying
Proven experience creating interactive Power BI dashboards and reports for business analytics
Deep understanding of relational database systems and advanced SQL skills
Experience with T-SQL, ETL processes, and Azure Data Factory is highly desirable
Solid understanding of cloud computing concepts and experience with Azure services such as Azure Data Factory, Azure Blob Storage, and Azure Active Directory
Nice to Have:
Experience with version control systems (e.g., Git, Subversion)
Familiarity with programming languages such as Python or Java
Knowledge of various database technologies (NoSQL, document, and graph databases, etc.)
Experience with data intelligence platforms like Databricks
Who you are:
Effective time management skills and ability to meet deadlines
Excellent communication skills when interacting with technical and business audiences
Excellent organization, multitasking, and prioritization skills
Willingness and aptitude to embrace new technologies and ideas and master concepts rapidly
Intellectual curiosity, passion for technology and keeping up with new trends
Delivering project work on time and within budget with high quality
Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. #LI-PM3
Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world.
Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.
What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.
What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring.
Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.
Job Requisition ID: 101546
By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence.
Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 1 week ago
4.0 years
0 Lacs
Gurgaon
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title and Summary
Data Scientist II – Data & Analytics
We work to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. We cultivate a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation and delivers better business results.
Who is Mastercard?
Mastercard is a global technology company in the payments industry. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
Our Team
We are an Artificial Intelligence Centre of Excellence working on initiatives in Core and Commercial Payments. Our focus is to create value and improvements through digital intervention inspired by state-of-the-art AI and Machine Learning. As part of the team, you will play a key role in building new AI/ML models, monitoring long-term performance, and innovating through research – all while creating significant business impact.
Are you excited about coding and the solutions we can build with it? Do you love doing hands-on work with opportunities to learn new tech? Do you believe that AI has huge potential to improve business processes? Do you have a recreational interest in mathematics and statistics? If yes, then this role is for you!
The Role
The candidate will be working on numerous AI and ML initiatives, spanning different use cases and stages of delivery. You will be expected to work and code hands-on, keeping up to date with the latest best practices and advances in the field of AI. You will be required to work in close collaboration with multiple internal business groups across Mastercard. You are also responsible for creating design documents, including data models, data flow diagrams, and system architecture diagrams.
All about You
Majors in Computer Science, Data Science, Analytics, Mathematics, Statistics, or a related engineering field, or equivalent work experience
4+ years of experience using Python, with knowledge of client-server architecture
2+ years of experience building, deploying and maintaining ML models
1+ years of experience working on Gen AI projects, including knowledge of modern frameworks like LangChain, LangGraph, and the OpenAI Chat Completion APIs (see the sketch below)
Demonstrated success interacting with stakeholders to understand technical needs and ensuring analyses and solutions meet their needs effectively
Able to work in a fast-paced, deadline-driven environment as part of a team and as an individual contributor
Ability to move easily between business, analytical, and technical teams and articulate solution requirements for each group
Experience with enterprise business intelligence/data platforms (e.g., Tableau, Power BI, Streamlit) will be a plus
Experience with cloud-based (SaaS) solutions, ETL processes or API integrations will be a plus
Experience with cloud data platforms (Azure/AWS/Databricks) will be a plus
Additional Competencies
Excellent English, quantitative, technical, and communication (oral/written) skills
Analytical/problem solving
Strong attention to detail and quality
Creativity/innovation
Self-motivated; operates with a sense of urgency
Project management/risk mitigation
Able to prioritize and perform multiple tasks simultaneously
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
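Since the role calls out the OpenAI Chat Completion APIs, here is a minimal sketch of such a call using the official openai Python SDK (v1+). The model name and prompt content are illustrative placeholders, not part of the posting.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; substitute whatever your project uses
    messages=[
        {"role": "system", "content": "You summarize payment dispute notes."},
        {"role": "user", "content": "Summarize: customer reports a duplicate charge on 2024-05-01."},
    ],
)
print(response.choices[0].message.content)
```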
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Your work days are brighter here.
At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion, it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.
At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask for you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.
About The Team
The Enterprise Data & AI Technologies and Architecture (EDATA) organization is a dynamic and evolving team that is spearheading Workday’s growth through trusted data excellence, innovation, and architectural thought leadership. Equipped with an array of skills in data science, engineering, and analytics, this team orchestrates the flow of data across our growing company while ensuring data accessibility, accuracy, and security. With a relentless focus on innovation and efficiency, Workmates in EDATA enable the transformation of complex data sets into actionable insights that fuel strategic decisions and position Workday at the forefront of the technology industry. EDATA is a global team distributed across the U.S., India and Canada.
About The Role
Join a pioneering organization at the forefront of technological advancement, dedicated to leveraging data-driven insights to transform industries and drive innovation.
We are seeking a highly skilled and motivated Data Quality Engineer to join our dynamic team. The ideal candidate is someone who loves to learn, is detail-oriented, and has exceptional critical thinking and analytical skills. As a Data Quality Engineer, you will play a critical role in ensuring the accuracy, consistency, and completeness of our data across the enterprise data platform. You will be responsible for designing, developing, and implementing data quality processes, standards, and best practices across various data sources and systems to identify and resolve data issues. This role offers an exciting opportunity to learn and to collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to drive data quality improvements and enhance decision-making capabilities.
Responsibilities
The incumbent will be responsible for (but not limited to) the following:
Design and automate data quality checks; resolve issues and improve data pipelines with engineering and product teams (see the sketch below).
Collaborate with stakeholders to define data quality requirements and best practices.
Develop test automation strategies and integrate checks into CI/CD pipelines.
Monitor data quality metrics, identify root causes, and drive continuous improvements.
Provide guidance on data quality standards across projects.
Work with Data Ops to address production issues and document quality processes.
About You
Basic Qualifications
5+ years of experience as a Data Quality Engineer in data quality management or data governance.
Good understanding of data management concepts, including data profiling, data cleansing, and data integration.
Proficiency in SQL for data querying and manipulation.
Experience developing and executing automated data quality tests using tools like SQL, Python (PySpark), and data quality frameworks.
Hands-on experience with cloud platforms (AWS/GCP), data warehouses (Snowflake, Databricks, Redshift), and integration tools (SnapLogic, dbt, Talend, etc.).
Exposure to data quality monitoring tools (e.g., Acceldata, Tricentis) and CI/CD or DevOps practices is a plus.
Other Qualifications
Proven ability to prioritize and manage multiple tasks in a fast-paced environment.
Certification in relevant technologies or data management disciplines is a plus.
Analytical mindset with the ability to think strategically and make data-driven decisions.
If you are a results-driven individual with a passion for data and analytics and a proven track record in data quality assurance, we invite you to apply for this exciting opportunity. Join our team and contribute to the success of our data-driven initiatives.
Our Approach to Flexible Work
With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter.
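As a minimal sketch of the kind of automated data-quality check described above: the dataset, rules, and thresholds here are hypothetical examples, and a real pipeline would run such checks against production tables from CI/CD.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[1]").getOrCreate()

# Toy dataset standing in for a real table; columns are hypothetical.
orders = spark.createDataFrame(
    [(1, "2025-01-01", 120.0), (2, "2025-01-02", 80.0), (3, "2025-01-03", 5.0)],
    ["order_id", "order_date", "amount"])

total = orders.count()
null_dates = orders.where(F.col("order_date").isNull()).count()
negative_amounts = orders.where(F.col("amount") < 0).count()

# Completeness and validity rules; a CI/CD job would fail on an AssertionError.
assert null_dates / total <= 0.01, f"order_date null rate too high: {null_dates}/{total}"
assert negative_amounts == 0, f"{negative_amounts} rows with negative amount"
print("data quality checks passed")
```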
Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Roles and Responsibilities:
Develop high-quality code in Python using PySpark, SQL, Flink/Spark Streaming, and other relevant technologies (see the streaming sketch below).
Design, develop, test, deploy, and maintain large-scale data processing pipelines using Azure Databricks.
Troubleshoot issues related to on-prem (Hadoop) and Databricks clusters and big data processing tasks.
Develop complex SQL queries to extract insights from large datasets stored in relational databases such as PostgreSQL.
Desired Candidate Profile:
6-8 years of experience in software development with expertise in the BI & analytics domain.
Bachelor's degree in any specialization (B.Tech/B.E.).
Strong understanding of cloud computing concepts on the Microsoft Azure platform.
Proficiency in programming languages such as Python, with hands-on experience working with PySpark.
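Here is a minimal Structured Streaming sketch of the kind of Spark Streaming work listed above. Spark's built-in rate source stands in for a real feed such as Kafka or Event Hubs so the example is self-contained; the window size and console sink are illustrative choices.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[1]").appName("stream-demo").getOrCreate()

# The rate source emits (timestamp, value) rows; it stands in for Kafka here.
events = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

# Toy aggregation: count events per 10-second window, tolerating late data.
counts = (events
          .withWatermark("timestamp", "30 seconds")
          .groupBy(F.window("timestamp", "10 seconds"))
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .start())
query.awaitTermination(30)  # let the demo run for roughly 30 seconds
query.stop()
```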
Posted 1 week ago
12.0 years
1 - 3 Lacs
Hyderābād
On-site
Overview:
Seeking a Manager, Data Operations, to support our growing data organization. In this role, you will play a key role in maintaining data pipelines and corresponding platforms (on-prem and cloud) while collaborating with global teams on DataOps initiatives. You will manage the day-to-day operations of data pipelines, ensuring governance, reliability, and performance optimization on Microsoft Azure. This role requires hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, real-time streaming architectures, and DataOps methodologies. You will:
Ensure availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
Support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
Assist in implementing real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
Contribute to the development of governance models and execution roadmaps to optimize efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments.
Work on CI/CD integration, data pipeline automation, and self-healing capabilities to enhance enterprise-wide data operations.
Collaborate on building and supporting next-generation Data & Analytics platforms while fostering an agile and high-performing DataOps team.
Support the adoption of Data & Analytics technology transformations, ensuring full sustainment capabilities and automation for proactive issue identification and resolution.
Partner with cross-functional teams to drive process improvements, best practices, and operational excellence within DataOps.
Responsibilities:
Support the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
Assist in managing end-to-end data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
Ensure seamless batch, real-time, and streaming data processing while focusing on high availability and fault tolerance.
Contribute to DataOps automation initiatives, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps, Terraform, and Infrastructure-as-Code (IaC).
Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to enable data-driven decision-making.
Work with IT, data stewards, and compliance teams to align DataOps practices with regulatory and security requirements.
Support data operations and sustainment efforts, including testing and monitoring processes to support global products and projects.
Assist in data capture, storage, integration, governance, and analytics initiatives, collaborating with cross-functional teams.
Manage day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
Engage with SMEs and business stakeholders to align data platform capabilities with business needs.
Participate in the Agile work intake and management process to support execution excellence for data platform teams.
Collaborate with cross-functional teams to troubleshoot and resolve issues related to cloud infrastructure and data services.
Assist in developing and automating operational policies and procedures to improve efficiency and service resilience.
Support incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
Foster a customer-centric environment, advocating for operational excellence and continuous service improvements.
Contribute to building a collaborative, high-performing team culture focused on automation and efficiency in DataOps.
Adapt to shifting priorities and support cross-functional teams in maintaining productivity while meeting business goals.
Leverage technical expertise in cloud and data operations to improve service reliability and scalability.
Qualifications:
12+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
12+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
8+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
5+ years of experience in a management or lead role, with a focus on DataOps execution and delivery.
Hands-on experience with Azure Data Factory (ADF) for orchestrating data pipelines and ETL workflows.
Proficiency in Azure Synapse Analytics, Azure Data Lake Storage (ADLS), and Azure SQL Database.
Familiarity with Azure Databricks for large-scale data processing (basic troubleshooting or support scope is sufficient if not engineering-focused).
Exposure to cloud environments (AWS, Azure, GCP) and understanding of CI/CD pipelines for data operations.
Knowledge of structured and semi-structured data storage formats (e.g., Parquet, JSON, Delta).
Excellent communication skills, with the ability to empathize with stakeholders and articulate technical concepts to non-technical audiences.
Strong problem-solving abilities, prioritizing customer needs and advocating for operational improvements.
Customer-focused mindset, ensuring high-quality service delivery and operational excellence.
Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
Experience in supporting mission-critical solutions in a Microsoft Azure environment, including data pipeline automation.
Familiarity with Site Reliability Engineering (SRE) practices, such as automated issue remediation and scalability improvements.
Experience driving operational excellence in complex, high-availability data environments.
Ability to collaborate across teams, fostering strong relationships with business and IT stakeholders.
Experience in data management concepts, including master data management, data governance, and analytics.
Knowledge of data acquisition, data catalogs, data standards, and data management tools.
Strong analytical and strategic thinking skills, with the ability to execute plans effectively and drive results.
Proven ability to work in a fast-changing, complex environment, adapting to shifting priorities while maintaining productivity.
Posted 1 week ago
5.0 - 8.0 years
6 - 9 Lacs
Hyderābād
On-site
About the Role:
Grade Level (for internal use): 10
The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.
The Impact: Enterprise Data Organization is seeking a Software Developer to handle software design, development, and maintenance for data processing applications. This person will be part of a development team that manages and supports the internal and external applications that support the business portfolio. The role expects the candidate to handle data processing and big data application development. We have teams made up of people who learn how to work effectively together while working with the larger group of developers on our platform.
What’s in it for you:
Opportunity to contribute to the development of a world-class Platform Engineering team.
Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
Be part of a fast-paced, agile environment that processes massive volumes of data—ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks.
Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.
Responsibilities:
Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation
Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions
Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .NET Core, Databricks, Python, Scala, NiFi, SQL)
Build data models, perform performance tuning, and apply data architecture concepts
Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies
Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality
Provide operations support to resolve issues proactively and with utmost urgency
Effectively manage time and multiple tasks
Communicate effectively, especially in writing, with the business and other technical groups
Basic Qualifications:
Bachelor's/Master’s degree in Computer Science, Information Systems or equivalent
Minimum 5 to 8 years of strong hands-on development experience in C#, .NET Core, cloud-native, and MS SQL Server backend development
Proficiency with object-oriented programming
Nice to have: knowledge of Grafana, Kibana, big data, GitHub, EMR, Terraform, AI/ML
Advanced SQL programming skills
Highly recommended skill set in Databricks and Scala technologies.
Understanding of database performance tuning in large datasets
Ability to manage multiple priorities efficiently and effectively within specific timeframes
Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies
Knowledge of Fundamentals, or the financial industry, highly preferred
Experience in conducting application design and code reviews
Proficiency with the following technologies:
Object-oriented programming
Programming languages (C#, .NET Core)
Cloud computing
Database systems (SQL, MS SQL)
Nice to have: NoSQL (Databricks, Scala, Python), scripting (Bash, Scala, Perl, PowerShell)
Preferred Qualifications:
Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP).
Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 317835
Posted On: 2025-07-09
Location: Ahmedabad, Gujarat, India
Posted 1 week ago
40.0 years
2 - 6 Lacs
Hyderābād
On-site
India - Hyderabad
JOB ID: R-218849
ADDITIONAL LOCATIONS: India - Hyderabad
WORK LOCATION TYPE: On Site
DATE POSTED: Jul. 08, 2025
CATEGORY: Information Systems
ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.
ABOUT THE ROLE
Role Description:
We are seeking an experienced MDM Engineer with 8–12 years of experience to lead development and operations of our Master Data Management (MDM) platforms. This role will involve handling the backend data engineering solution within the MDM team. This is a technical role that will require hands-on work. To succeed in this role, the candidate must have strong data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations.
Roles & Responsibilities:
Develop distributed data pipelines using PySpark on Databricks for ingesting, transforming, and publishing master data
Write optimized SQL for large-scale data processing, including complex joins, window functions, and CTEs for MDM logic (see the survivorship sketch after this posting)
Implement match/merge algorithms and survivorship rules using Informatica MDM or Reltio APIs
Build and maintain Delta Lake tables with schema evolution and versioning for master data domains
Use AWS services like S3, Glue, Lambda, and Step Functions for orchestrating MDM workflows
Automate data quality checks using IDQ or custom PySpark validators with rule-based profiling
Integrate external enrichment sources (e.g., D&B, LexisNexis) via REST APIs and batch pipelines
Design and deploy CI/CD pipelines using GitHub Actions or Jenkins for Databricks notebooks and jobs
Monitor pipeline health using the Databricks Jobs API, CloudWatch, and custom logging frameworks
Implement fine-grained access control using Unity Catalog and attribute-based policies for MDM datasets
Use MLflow for tracking model-based entity resolution experiments if ML-based matching is applied
Collaborate with data stewards to expose curated MDM views via REST endpoints or Delta Sharing
Basic Qualifications and Experience:
8 to 13 years of experience in Business, Engineering, IT or a related field
Functional Skills (Must-Have):
Advanced proficiency in PySpark for distributed data processing and transformation
Strong SQL skills for complex data modeling, cleansing, and aggregation logic
Hands-on experience with Databricks, including Delta Lake, notebooks, and job orchestration
Deep understanding of MDM concepts, including match/merge, survivorship, and golden record creation
Experience with MDM platforms like Informatica MDM or Reltio, including REST API integration
Proficiency in AWS services such as S3, Glue, Lambda, Step Functions, and IAM
Familiarity with data quality frameworks and tools like Informatica IDQ or custom rule engines
Experience building CI/CD pipelines for data workflows using GitHub Actions, Jenkins, or similar
Knowledge of schema evolution, versioning, and metadata management in data lakes
Ability to implement lineage and observability using Unity Catalog or third-party tools
Comfort with Unix shell scripting or Python for orchestration and automation
Hands-on experience with
RESTful APIs for ingesting external data sources and enrichment feeds
Good-to-Have Skills:
Experience with Tableau or Power BI for reporting MDM insights
Exposure to Agile practices and tools (JIRA, Confluence)
Prior experience in Pharma/Life Sciences
Understanding of compliance and regulatory considerations in master data
Professional Certifications:
Any MDM certification (e.g., Informatica, Reltio)
Any data analysis certification (SQL, Python, PySpark, Databricks)
Any cloud certification (AWS or Azure)
Soft Skills:
Strong analytical abilities to assess and improve master data processes and solutions
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
Effective problem-solving skills to address data-related issues and implement scalable solutions
Ability to work effectively with global, virtual teams
EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
GCF Level 05A
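As referenced in the responsibilities above, here is a minimal sketch of a survivorship ("golden record") rule expressed as a SQL window function and run through PySpark. The source names, trust ranking, and columns are hypothetical examples, not Amgen's actual rules.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").getOrCreate()

# Matched records for the same master entity, arriving from two systems.
spark.createDataFrame(
    [("C1", "CRM", "Acme Corp", "2025-01-10"),
     ("C1", "ERP", "ACME Corporation", "2025-03-02"),
     ("C2", "CRM", "Globex", "2024-11-20")],
    ["master_id", "source", "name", "updated_at"],
).createOrReplaceTempView("matched_records")

golden = spark.sql("""
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY master_id
                   -- Survivorship rule: prefer the ERP source, then recency.
                   ORDER BY CASE source WHEN 'ERP' THEN 1 ELSE 2 END,
                            updated_at DESC
               ) AS rn
        FROM matched_records
    )
    SELECT master_id, source, name, updated_at FROM ranked WHERE rn = 1
""")
golden.show()
```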
Posted 1 week ago
5.0 - 10.0 years
0 Lacs
Hyderābād
On-site
Category: Software Development/Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0725-0450
Employment Type: Full Time
Position Description:
At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.
This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.
Job Title: Azure Databricks Developer
Position: Senior Software Engineer
Experience: 5-10 Years
Category: Software Development/Engineering
Main location: India, Bangalore / Hyderabad / Chennai
Position ID: J0725-0450
Employment Type: Full Time
Your future duties and responsibilities:
We are seeking a skilled Azure Databricks developer with 5-10 years of experience to design, develop, and optimize big data pipelines using Databricks on Azure. The ideal candidate will have strong expertise in PySpark, Azure Data Lake, and data engineering best practices in a cloud environment.
Key Responsibilities:
Design and implement ETL/ELT pipelines using Azure Databricks and PySpark (see the sketch after this posting).
Work with structured and unstructured data from diverse sources (e.g., ADLS Gen2, SQL DBs, APIs).
Optimize Spark jobs for performance and cost-efficiency.
Collaborate with data analysts, architects, and business stakeholders to understand data needs.
Develop reusable code components and automate workflows using Azure Data Factory (ADF).
Implement data quality checks, logging, and monitoring.
Participate in code reviews and adhere to software engineering best practices.
Required Skills & Qualifications:
3-5 years of experience in Apache Spark / PySpark.
3-5 years working with Azure Databricks and Azure Data Services (ADLS Gen2, ADF, Synapse).
Strong understanding of data warehousing, ETL, and data lake architectures.
Proficiency in Python and SQL.
Experience with Git, CI/CD tools, and version control practices.
Required qualifications to be successful in this role:
Experience: 5 to 10 Yrs
Location: Bangalore / Hyderabad / Chennai
Education: BE / B.Tech / MCA / BCA
Skills: Azure Databricks, Azure Data Factory, SQL, PySpark, Python, ETL
What you can expect from us:
Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging.
Here, you’ll reach your full potential because…
You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction.
Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise.
You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team—one of the largest IT and business consulting services firms in the world.
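As a minimal sketch of the ETL pipeline work this posting describes: the ADLS Gen2 paths, storage account name, and columns are hypothetical, and authentication setup is omitted for brevity.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Hypothetical landing and curated zones in ADLS Gen2.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2025/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales/"

# Extract: read raw CSV drops from the landing zone.
raw = spark.read.option("header", True).csv(raw_path)

# Transform: type the columns, drop bad rows, derive a partition column.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_date", F.to_date("order_date"))
         .dropna(subset=["order_id", "order_date"])
         .withColumn("order_month", F.date_format("order_date", "yyyy-MM")))

# Load: write partitioned Parquet (or Delta) to the curated zone.
clean.write.mode("overwrite").partitionBy("order_month").parquet(curated_path)
```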
Posted 1 week ago
8.0 years
4 - 7 Lacs
Hyderābād
On-site
Welcome to Warner Bros. Discovery… the stuff dreams are made of.
Who We Are…
When we say, “the stuff dreams are made of,” we’re not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD’s vast portfolio of iconic content and beloved brands are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what’s next… From brilliant creatives to technology trailblazers across the globe, WBD offers career-defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive.
Your New Role:
This position will join the Enterprise Data and AI team that supports all brands in the Warner Bros umbrella, including WB films in theatrical and home entertainment, DC Studios, Consumer Products, games, etc. The ideal candidate is a subject matter expert in data science with exposure to predictive modeling, forecasting, recommendation engines and data analytics. This person will build data pipelines, apply statistical modeling and machine learning, and deliver meaningful insights about customers, products and business strategy to help WBD drive data-based decisions.
As a Staff Data Scientist, you will play a critical role in advancing data-driven solutions to complex business challenges, influencing data strategy efforts for WBD Businesses. The responsibilities include:
Analyze complex, high volumes of data from various sources using various tools and data analytics techniques.
Partner with stakeholders to understand business questions and provide answers using the most appropriate mathematical techniques.
Model Development and Implementation: Design, develop, and implement statistical models, predictive models and machine learning algorithms that inform strategic decisions across various business units.
Exploratory Data Analysis: Utilize exploratory data analysis techniques to identify and investigate new opportunities through innovative analytical and engineering methods.
Advanced Analytics Solutions: Collaborate with Product and Business stakeholders to understand business challenges and develop sophisticated analytical solutions.
Data Automation: Advance automation initiatives that reduce the time spent on data preparation, enabling more focus on strategic analysis.
Innovative Frameworks Construction: Develop and enhance frameworks that improve productivity and are intuitive for adoption across other data teams, and stay abreast of innovative machine learning techniques (e.g., deep learning, reinforcement learning, ensemble methods) and emerging AI technologies to stay ahead of industry trends.
Collaborate with data engineering teams to architect and scale robust, efficient data pipelines capable of handling large, complex datasets, ensuring the smooth and automated flow of data from raw collection to insights generation.
Deploy machine learning models into production environments, collaborating with DevOps and engineering teams to ensure smooth integration and scalability.
Quality Assurance: Implement robust systems to detect, alert on, and rectify data anomalies.
Qualifications & Experiences:
Bachelor’s degree, MS, or greater in Computer/Data Science, Engineering, Mathematics, Statistics, or a related quantitative discipline.
8+ years relevant experience in Data Science.
Expertise in a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, random forests, deep learning, etc.) and experience with applications of these techniques (see the sketch after this posting).
Expertise in advanced statistical techniques and concepts (regressions, statistical tests, etc.) and experience with application of these tools.
A demonstrated track record of utilizing data science to solve business problems in a professional environment.
Expertise in SQL and either Python or R, including experience with application deployment packages like Streamlit or Shiny.
Experience with database technologies such as Databricks, Snowflake, and others.
Familiarity with BI tools (Power BI, Looker, Tableau) and experience managing workflows in an Agile environment.
Strong analytical and problem-solving abilities.
Excellent communication skills to effectively convey complex data-driven insights to stakeholders.
High attention to detail and capability to work independently in managing multiple priorities under tight deadlines.
Proficiency in big data technologies (e.g., Spark, Kafka, Hive).
Experience working in a cloud environment (AWS, Azure, GCP) to facilitate data solutions.
Ability to collaborate effectively with business partners and develop and maintain productive professional relationships.
Experience with adhering to established data management practices and standards.
Ability to communicate to all levels of business, prioritize and manage assignments to meet deadlines, and establish strong relationships.
Interest in movies, games, and comics is a plus.
How We Get Things Done…
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day to day. We hope they resonate with you and look forward to discussing them during your interview.
Championing Inclusion at WBD
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects a wide array of perspectives, backgrounds and experiences. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law.
If you’re a qualified candidate with a disability and you require adjustments or accommodations during the job application and/or recruitment process, please visit our accessibility page for instructions to submit your request.
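As an illustrative sketch of one of the predictive-modeling techniques named above (a random forest classifier), using scikit-learn on synthetic data; the dataset and hyperparameters are placeholders, not WBD's actual models.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real customer/product dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
scores = model.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, scores):.3f}")
```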
Posted 1 week ago
0 years
5 - 8 Lacs
Hyderābād
On-site
We’re looking for a Platform Engineer in the Data & Analytics domain. We seek self-driven candidates who enjoy working with users and have hands-on, day-to-day experience with Databricks and Azure.
Key Responsibilities
Working with platform customers, together with the platform Product Owner, to identify emerging needs in the platform offering.
Infrastructure Provisioning and Management: Deploying and managing infrastructure resources on Azure.
Continuous Integration and Deployment (CI/CD): Implementing and maintaining CI/CD pipelines to automate build, test, and deployment processes.
Monitoring and Logging: Setting up monitoring and logging solutions to ensure the availability, performance, and security of the platform.
Security and Compliance: Implementing security measures and best practices to protect the platform and its data.
Teamwork and Communication: Collaborating with cross-functional teams; effective communication and documentation of processes, procedures, and configurations are essential.
Troubleshooting and Support: Identifying and resolving issues related to infrastructure, deployments, and platform performance.
Automation and Optimization: Continuously finding opportunities to automate manual tasks, improve operational efficiency, and optimize resource utilization.
Learning and Growth: Staying updated with the latest Azure services, DevOps practices, and industry trends; actively participating in training programs, certifications, and knowledge-sharing initiatives to improve skills and contribute to the team's growth.
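As one illustration of the deployment automation this role describes, below is a hedged sketch that registers a Databricks job through the Jobs 2.1 REST API; the host/token environment variables, job name, notebook path, and cluster ID are placeholder assumptions.

```python
# Hedged sketch: creating a Databricks job via the REST API (Jobs 2.1),
# the kind of step a CI/CD pipeline might run after tests pass.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # assumed to be injected by the pipeline

job_spec = {
    "name": "nightly-ingest",           # hypothetical job name
    "tasks": [{
        "task_key": "ingest",
        "notebook_task": {"notebook_path": "/Repos/platform/ingest"},  # placeholder
        "existing_cluster_id": "1234-567890-abcde123",                 # placeholder
    }],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```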
Posted 1 week ago
0 years
3 - 9 Lacs
Hyderābād
On-site
We are seeking a skilled Agentic AI Developer to design and implement intelligent agent systems powered by Large Language Models (LLMs). This role involves developing LLM-based pipelines that can ingest transcripts, documents, or business narratives and generate structured artifacts such as workflows, decision trees, action plans, or contextual recommendations. You will collaborate with cross-functional teams to deploy autonomous AI agents capable of reasoning, planning, memory, and tool usage in enterprise environments — primarily within the Microsoft ecosystem (Azure, Power Platform, Copilot, and M365 integrations).
Key Responsibilities
Build and deploy autonomous agent systems using frameworks such as LangChain, AutoGen, CrewAI, or Semantic Kernel.
Develop pipelines to process natural language input and generate structured outputs tailored to business needs.
Implement agentic features such as task orchestration, memory storage, tool integration, and feedback loops.
Fine-tune LLMs or apply prompt engineering to optimize accuracy, explainability, and responsiveness.
Integrate agents with Microsoft 365 services (Teams, Outlook, SharePoint) and Power Platform components (Dataverse, Power Automate).
Collaborate with business and product teams to define use cases, test scenarios, and performance benchmarks.
Participate in scenario-based UAT testing, risk evaluation, and continuous optimization.
Must-Have Skills
Proficiency in Python and hands-on experience with ML/AI libraries and frameworks (Transformers, PyTorch, LangChain).
Strong understanding of LLMs (e.g., GPT, Claude, LLaMA, Mistral) and prompt engineering principles.
Experience developing agent workflows using ReAct, AutoGen, CrewAI, or OpenAI function calling.
Familiarity with Vector Databases (FAISS, Pinecone, Qdrant) and RAG-based architectures.
Skills in Natural Language Processing (NLP): summarization, entity recognition, intent classification.
Integration experience with APIs, SDKs, and enterprise tools (preferably the Microsoft stack).
Preferred Certifications (Candidates with the following certifications will have a strong advantage):
✅ Microsoft Certified: Azure AI Engineer Associate (AI-102)
✅ Microsoft Certified: Power Platform App Maker (PL-100)
✅ Microsoft 365 Certified: Developer Associate (MS-600)
✅ OpenAI Developer Certifications or Prompt Engineering Badge
✅ Google Cloud Certified: Professional Machine Learning Engineer
✅ NVIDIA Deep Learning Institute Certifications
✅ Databricks Generative AI Pathway (optional)
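To illustrate the RAG-based architectures named above, here is a minimal retrieval sketch using FAISS; the embed() helper and the toy documents are hypothetical stand-ins for a real embedding model and corpus.

```python
# Minimal RAG retrieval loop with FAISS. embed() is a toy stand-in for a
# real embedding model; in practice it would call the model of choice.
import faiss
import numpy as np

def embed(texts):
    # Assumption: returns an (n, dim) float32 array; random here for illustration.
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)

docs = ["Refund policy ...", "Shipping times ...", "Warranty terms ..."]
vectors = embed(docs)

index = faiss.IndexFlatL2(vectors.shape[1])  # exact L2 nearest-neighbour search
index.add(vectors)

query = "How do refunds work?"
_, ids = index.search(embed([query]), 2)     # retrieve the top-2 documents
context = "\n".join(docs[i] for i in ids[0])

# The prompt would then be sent to the chosen LLM (GPT, Claude, LLaMA, ...).
prompt = f"Answer using only this context:\n{context}\n\nQ: {query}"
print(prompt)
```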
Posted 1 week ago
0 years
7 - 8 Lacs
Hyderābād
On-site
Are you looking to take your career to the next level? We’re looking for a Junior Software Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will apply modern Agile and DevOps practices to craft, develop, test and deploy IT systems and applications, delivering global projects in multinational teams.
The P&G Core Data Lake Platform is a central component of the P&G data and analytics ecosystem. The CDL Platform is used to deliver a broad scope of digital products and frameworks used by data engineers and business analysts. In this role you will have an opportunity to use data engineering skills to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which P&G's business operates, we combine data platform technologies (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.
RESPONSIBILITIES:
Writing and testing code for Data & Analytics applications and building end-to-end cloud-native (Azure) solutions.
Engineering applications throughout their entire lifecycle, from development and deployment through upgrade and replacement/termination.
Ensuring that development and architecture conform to established standards, including modern software engineering practices (CI/CD, Agile, DevOps).
Collaborating with internal technical specialists and vendors to develop final products to improve overall performance, efficiency and/or to enable adaptation of new business processes.
Posted 1 week ago
0 years
0 Lacs
India
Remote
Contract Opportunity: Data Engineer (DBT, Databricks, Azure) – Remote
We’re hiring a skilled Data Engineer for a 6-month fully remote contract to support our growing data team. You'll work on mission-critical data transformation pipelines using DBT, Databricks, and Azure.
What you’ll do:
Design and build scalable, maintainable data models using DBT
Develop efficient pipelines across cloud infrastructure
Integrate and transform diverse data sources for analytics
Collaborate with analysts, scientists, and QA teams to ensure data accuracy
What we’re looking for:
Proven experience with Databricks and Spark
Strong knowledge of Azure data services
Familiarity with Git, version control, and CI/CD for data workflows
Solid problem-solving skills and attention to data integrity
Location: Fully Remote (Preference for India-based candidates)
Duration: 6 months rolling
Posted 1 week ago
5.0 years
6 - 20 Lacs
Chennai
On-site
Job openings for Databricks Developer in Chennai
We are looking for an offshore Senior Developer with Databricks experience who is willing to learn new technologies as needed and able to work with a team. This position replaces an existing senior developer; it is long-term and will likely be renewed annually.
Must be a team player.
Must have at least 5 years of IT development experience.
Must have strong analytical and problem-solving skills.
Must have experience designing solutions, performing code reviews, and mentoring junior engineers.
Must have strong SQL and backend experience, and experience working on data-driven projects.
Must have experience with the following: Python/PySpark, SQL, Databricks, Scala, Spark/Spark Streaming, the big data toolset, Linux, Kafka.
Experience collaborating with dev teams, project managers, and engineers.
Excellent communication and teamwork skills.
Experience: 5 - 11 Years
Salary: 6 Lac To 20 Lac P.A.
Industry: IT Software - Application Programming / Maintenance
Qualification: B.C.A, B.Sc, B.E, B.Tech, M.C.A, M.Sc, M.Tech
Key Skills: SQL, Python, PySpark, Databricks, Scala, Spark Streaming, Big Data Tools, Linux, Kafka
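A minimal sketch of the Spark Structured Streaming + Kafka pattern this role lists, assuming a runtime with the Spark-Kafka connector available; the broker, topic, and checkpoint path are placeholders.

```python
# Hedged sketch: consume a Kafka topic with Spark Structured Streaming.
# Requires the spark-sql-kafka connector package on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka rows arrive as binary key/value; cast to strings before use.
parsed = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    parsed.writeStream.format("console")  # swap for "delta" on Databricks
    .outputMode("append")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # placeholder
    .start()
)
query.awaitTermination()
```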
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Internal title: Assoc.Dir. DDIT US&I Data Architect Location: Hyderabad, India Relocation Support: Yes Step into a pivotal role where your expertise in data architecture will shape the future of analytics at Novartis. As Associate Director - Data Architect, you’ll lead the design and implementation of innovative data solutions that empower business decisions and drive digital transformation. This is your opportunity to influence enterprise-wide strategies, collaborate with cross-functional teams, and bring emerging technologies to life—all while making a meaningful impact on global healthcare. About The Role Key Responsibilities Design and implement scalable data architecture solutions aligned with business strategy and innovation goals Lead architecture for US&I Analytics Capabilities including GenAI, MLOps, NLP, and data visualization Collaborate with cross-functional teams to ensure scalable, future-ready data solutions Define and evolve architecture governance frameworks, standards, and best practices Drive adoption of emerging technologies through rapid prototyping and enterprise-scale deployment Architect data solutions using AWS, Snowflake, Databricks, and other modern platforms Oversee delivery of data lake projects including acquisition, transformation, and publishing Ensure data security, governance, and compliance across all architecture solutions Promote a data product-centric approach to solution design and delivery Align innovation efforts with business strategy, IT roadmap, and regulatory requirements Essential Requirements Bachelor’s degree in computer science, engineering, or a related field Over 10 years of experience in analytical and technical frameworks for descriptive and prescriptive analytics Strong expertise in AWS, Databricks, and Snowflake service offerings Proven experience delivering data lake projects from acquisition to publishing Deep understanding of data security, governance policies, and enforcement mechanisms Agile delivery experience managing multiple concurrent delivery cycles Strong knowledge of MLOps and analytical data lifecycle management Excellent communication, problem-solving, and cross-functional collaboration skills Desirable Requirements Experience working with pharmaceutical data and familiarity with global healthcare data sources Exposure to regulatory frameworks and compliance standards in the life sciences industry Commitment To Diversity And Inclusion Novartis is committed to building an outstanding, inclusive work environment and diverse teams' representative of the patients and communities we serve. Accessibility and accommodation Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? 
https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Required Skills
With a focus on automated testing and DevOps, use Microsoft Azure data PaaS services to design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting.
Working experience with Python.
Automate the running of unit and integration tests on all created code: create and run unit and integration tests throughout the development lifecycle.
Support and Troubleshooting: assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.
If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.
Excellent grasp of and expertise with test-driven development and continuous integration processes.
Analysis and Design: convert high-level designs to low-level designs and implement them.
Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.
Benchmark application code proactively to prevent performance and scalability concerns.
Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.
Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components.
Education
Bachelor's Degree or equivalent combination of education and experience.
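To illustrate the unit-testing expectation above, a small sketch of a bronze-to-silver transformation (in the medallion sense) with a pytest-style test on a local SparkSession; column names are assumptions.

```python
# Sketch of a medallion-style cleansing step plus its unit test.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql.functions import col, trim

def bronze_to_silver(df: DataFrame) -> DataFrame:
    """Drop rows missing the key and normalise whitespace in names."""
    return (
        df.filter(col("customer_id").isNotNull())
          .withColumn("name", trim(col("name")))
    )

def test_bronze_to_silver():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    bronze = spark.createDataFrame(
        [(1, "  Ada "), (None, "Ghost"), (2, "Grace")],
        ["customer_id", "name"],
    )
    silver = bronze_to_silver(bronze)
    assert silver.count() == 2  # the null-keyed row is dropped
    assert [r.name for r in silver.orderBy("customer_id").collect()] == ["Ada", "Grace"]
```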
Posted 1 week ago
6.0 years
6 - 9 Lacs
Hosūr
On-site
Hosur Plant, Tamil Nadu, India
Department: R&D-EV Sys D&D-Digital Twin and HIL
Job posted on: Jul 08, 2025
Employee Type: White Collar
Experience range (Years): 6 years - 8 years
About TVS Motor
TVS Motor Company is a reputed two and three-wheeler manufacturer globally, championing progress through Sustainable Mobility with four state-of-the-art manufacturing facilities in Hosur, Mysuru and Nalagarh in India and Karawang in Indonesia. Rooted in our 100-year legacy of Trust, Value, and Passion for Customers and Exactness, we take pride in making internationally aspirational products of the highest quality through innovative and sustainable processes. We are the only two-wheeler company to have received the prestigious Deming Prize. Our products lead in their respective categories in the J.D. Power IQS and APEAL surveys. We have been ranked the No. 1 Company in the J.D. Power Customer Service Satisfaction Survey for four consecutive years. Our group company Norton Motorcycles, based in the United Kingdom, is one of the most emotive motorcycle brands in the world. Our subsidiaries in the personal e-mobility space, Swiss E-Mobility Group (SEMG) and EGO Movement, have a leading position in the e-bike market in Switzerland. TVS Motor Company endeavors to deliver the most superior customer experience across the 80 countries in which we operate. For more information, please visit www.tvsmotor.com.
Job Responsibilities
We are seeking a seasoned leader with expertise in data science, AI/ML, and telematics to drive our digital twin initiatives. You will leverage large-scale telemetry to uncover actionable insights that enhance operational efficiency and reduce costs for EV parts and components. By integrating advanced analytics, predictive models, and scalable platforms, you will help shape our digital twin strategy and deliver measurable business impact.
Qualification
Master's degree or higher in Data Science, Computer Science, or Engineering.
Proven success applying data science and AI/ML to telematics for predictive maintenance, optimization, or operational improvement.
Proficiency in Python, PySpark, AWS Glue, TensorFlow, PyTorch, MATLAB, and familiarity with IoT/cloud platforms.
Hands-on experience with AWS and/or Databricks for large-scale data processing and model deployment.
Effective communication, leadership, and stakeholder management skills, with a record of guiding teams and delivering measurable outcomes.
Role Description:
1. Strategic Roadmap & Vision: Develop and execute a long-term digital twin strategy centred on effectively using telematics data to improve operations. Conduct cost-benefit analyses to ensure initiatives align with business goals and deliver strong ROI.
Telematics Data Integration & Analytics: Oversee the ingestion, cleansing, and integration of telemetry, ensuring accurate and reliable data for informed decision-making. Collaborate with IT and data engineering teams to establish robust pipelines for real-time and historical data analysis.
AI/ML-Driven Insights & Predictive Maintenance: Implement AI/ML models—predictive maintenance, anomaly detection, usage-based insights—that anticipate potential failures, reduce warranty costs, and improve asset utilization. Apply advanced analytics to identify underlying patterns and inform data-driven strategies.
Performance Optimization & Scalability: Develop scalable digital twin architectures capable of handling complex datasets and simulations. Continuously evaluate emerging technologies to maintain a competitive technological edge.
Standardization & Governance: Define best practices, standards, and governance frameworks for digital twin development and data analysis. Ensure robust security and privacy measures to protect sensitive information.
Cross-Functional Collaboration & Stakeholder Engagement: Partner with engineering, operations, product management, and finance to align analytical insights with strategic objectives. Present findings through clear dashboards, reports, and presentations, enabling prompt, informed decisions.
Team Leadership & Mentorship: Lead, mentor, and develop a high-performing team of data scientists, analysts, and engineers. Allocate resources, manage priorities, and ensure timely delivery of projects that drive business value.
2. Behavioral Competencies
Individual Competencies: decision-making, working with teams, confidence, technical know-how, self-motivation skills
Interpersonal Competencies: high energy levels, communication skills, teamwork, persuasiveness, and handling problems
Motivational Competencies: motivator, taking initiatives, involving others in taking more initiatives, and leading by example
Managerial Competencies: leadership skills, managing people/teams, quick decision-making, addressing problems, analytical skills, and strategic planning
Integrity: consistent in words and actions, resists pressure to act in conflict with their ethics, accepts responsibility for mistakes, does the right things in the right way, believes that people are honest and credible, shows positive intent with a track record of desired results
Leadership Competencies
Leading Innovation & Diverse Teams
Adaptability
Strategic Thinking
Why TVSM?
At TVSM, we are always challenging ourselves to build a better, connected & sustainable future. We need the most innovative and diverse Digital Minds to develop tomorrow's reality. Working at TVSM Software also means flexibility - choosing between working from home and the office is the norm here. We offer great benefits and rewards, as you'd expect from a world leader in automotive software. We are a merit-driven, equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, colour, national origin, sex, gender, gender expression, sexual orientation, age, marital status, or disability status.
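As a toy illustration of the anomaly-detection work described in this role, a rolling z-score check over battery-temperature telemetry; the column name, window, and threshold are assumptions, not TVS specifications.

```python
# Illustrative anomaly flagging on telemetry using a rolling z-score
# against the preceding window (so a spike does not mask itself).
import pandas as pd

telemetry = pd.DataFrame({
    "battery_temp_c": [31, 32, 31, 33, 32, 55, 32, 31, 33, 32],  # toy readings
})

window = 5
series = telemetry["battery_temp_c"]
mean_prev = series.rolling(window).mean().shift(1)  # stats of the prior window
std_prev = series.rolling(window).std().shift(1)

z = (series - mean_prev) / std_prev
telemetry["anomaly"] = z.abs() > 3   # flag points far from recent behaviour

print(telemetry[telemetry["anomaly"]])  # flags the 55 °C spike
```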
Posted 1 week ago
0 years
8 - 9 Lacs
Chennai
On-site
Location: Chennai, Tamil Nadu, India
Job ID: R-230957
Date posted: 07/07/2025
Job Title: Data Analyst
Career Level: C3
Introduction to role
Are you ready to drive operational excellence and technical proficiency in data management? As a Data Analyst, you'll play a pivotal role in ensuring the accuracy and efficiency of data processes for the US BBU. Your expertise will support key business functions in achieving strategic objectives, acting as a crucial link between business collaborators and IT. You'll translate sophisticated business needs into actionable data solutions, improving decision-making and operational efficiency. With your analytical skills, you'll guide the development and deployment of innovative data products, encouraging collaboration with cross-functional teams to implement user-centric design approaches and agile methodologies. Are you prepared to manage a team of data professionals, encouraging continuous improvement and innovation?
Accountabilities
Provide operational and technical support for US BBU data management activities – data quality management, business process workflows, and data management needs for downstream applications and tools.
Provide appropriate operational troubleshooting and triaging related to data processing, business user queries, data investigation, and ad-hoc analytics.
Perform data validation, reconciliation, and basic ad-hoc analyses to support business teams.
Act as a liaison between Commercial/Medical collaborators and IT for customer concerns and issue resolution.
Assist in handling access, user roles, and updates across platforms like Sharp.
Essential Skills/Experience
Quantitative bachelor’s degree from an accredited college or university is required in one of the following or related fields: Engineering, Operations Research, Management Science, Economics, Statistics, Applied Math, Computer Science or Data Science. An advanced degree is preferred (Master's, MBA or PhD).
Proficient in Power BI, PowerApps (development & troubleshooting), SQL, Python, Databricks, and AWS S3 operations.
Strong understanding of data governance, privacy standards, and operational best practices.
Excellent communication and influencing skills.
Experience working in a business support or operational data management environment.
Organization and time management skills.
Define and document detailed user stories, acceptance criteria, and non-functional requirements for the data products.
Engage with cross-functional collaborators to understand their requirements, problems, and expectations.
Advocate for a user-centric design approach, ensuring that the data products are intuitive, accessible, and meet the needs of the target users.
Collaborate with the development team to plan and implement agile sprints, ensuring timely delivery of high-quality features.
Oversee the data product ecosystem’s business architecture, design, and development.
Monitor industry trends and best practices in data product development and management.
Collaborate closely with business collaborators to understand their requirements and translate them into technical solutions.
Oversee the end-to-end development lifecycle of the data products, from conceptualization to deployment.
Strong leadership and interpersonal skills with demonstrated ability to work collaboratively with a significant number of business leaders and cross-functional business partners.
Present succinct, compelling reviews of independently developed analyses infused with insight and business implications/actions to be considered.
Strategic and critical thinking with the ability to engage, build and maintain credibility with the Commercial Leadership Team.
Strong organizational skills and time management; ability to handle a diverse range of simultaneous projects.
Desirable Skills/Experience
Knowledge of AZ brand and Science.
Experience of working with multiple 3rd party providers, including information technology partners.
Understanding of US BBU commercial and medical business functions.
Experience with Sharp (internal AZ platform) administration, Power Apps development or troubleshooting.
When we put unexpected teams in the same room, we ignite aggressive thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and bold world.
At AstraZeneca, you'll join a dedicated global team that powers our enterprise to better serve patients every day. We use exciting new technology and digital innovations to accelerate our evolution strategically. With a bold spirit that keeps us ahead of the rest, we apply creativity to every task we do. Our fast-paced environment is driven by a passion for helping patients and empowered by camaraderie. Here you'll find countless opportunities to build an unrivalled reputation while being rewarded for your successes. Ready to make an impact? Apply now to join our dynamic team!
Date Posted: 08-Jul-2025
Closing Date: 14-Jul-2025
AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
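To illustrate the data validation and reconciliation duties this role describes, a minimal pandas sketch comparing a source extract to a target table; the table contents and key are hypothetical.

```python
# Sketch of a simple source-to-target reconciliation: find rows that are
# missing on either side and rows whose values diverge. Toy data only.
import pandas as pd

source = pd.DataFrame({"claim_id": [1, 2, 3, 4], "amount": [10.0, 20.0, 30.0, 40.0]})
target = pd.DataFrame({"claim_id": [1, 2, 4], "amount": [10.0, 25.0, 40.0]})

merged = source.merge(
    target, on="claim_id", how="outer",
    suffixes=("_src", "_tgt"), indicator=True,
)

missing = merged[merged["_merge"] != "both"]              # dropped or extra rows
mismatched = merged[
    (merged["_merge"] == "both")
    & (merged["amount_src"] != merged["amount_tgt"])      # value drift
]

print(f"{len(missing)} missing rows, {len(mismatched)} value mismatches")
```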
Posted 1 week ago
8.0 - 12.0 years
3 - 5 Lacs
Noida
On-site
Posted On: 8 Jul 2025
Location: Noida, UP, India
Company: Iris Software
Why Join Us?
Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software.
About Iris Software
At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.
Working at Iris
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.
Job Description
General Roles & Responsibilities:
Technical Leadership: Demonstrate leadership and the ability to guide business and technology teams in the adoption of best practices and standards.
Design & Development: Design, develop, and maintain a robust, scalable, and high-performance data estate.
Architecture: Architect and design robust data solutions that meet business requirements and ensure scalability, performance, and security.
Quality: Ensure the quality of deliverables through rigorous reviews and adherence to standards.
Agile Methodologies: Actively participate in agile processes, including planning, stand-ups, retrospectives, and backlog refinement.
Collaboration: Work closely with system architects, data engineers, data scientists, data analysts, cloud engineers and other business stakeholders to determine the optimal, future-proof solution and architecture.
Innovation: Stay updated with the latest industry trends and technologies, and drive continuous improvement initiatives within the development team.
Documentation: Create and maintain technical documentation, including design documents and architectural user guides.
Technical Responsibilities:
Optimize data pipelines for performance and efficiency.
Work with Databricks clusters and configuration management tools.
Use appropriate tools for cloud data lake development and deployment.
Develop/implement cloud infrastructure to support current and future business needs.
Provide technical expertise and ownership in the diagnosis and resolution of issues.
Ensure all cloud solutions exhibit a high level of cost efficiency, performance, security, scalability, and reliability.
Manage cloud data lake development and deployment on AWS/Databricks.
Manage and create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions in Databricks.
Required Technical Skills:
Experience and proficiency with the Databricks platform – Delta Lake storage, Spark (PySpark, Spark SQL).
Must be well versed in the Databricks Lakehouse and Unity Catalog concepts and their implementation in enterprise environments.
Familiarity with the medallion architecture data design pattern for organizing data in a Lakehouse.
Experience and proficiency with AWS data services – S3, Glue, Athena, Redshift, etc. – and Airflow scheduling.
Proficiency in SQL and experience with relational databases.
Proficiency in at least one programming language (e.g., Python, Java) for data processing and scripting.
Experience with DevOps practices – AWS DevOps for CI/CD, Terraform/CDK for infrastructure as code.
Good understanding of data principles and cloud data lake design & development, including data ingestion, data modeling and data distribution.
Jira: proficient in using Jira for managing projects and tracking progress.
Other Skills:
Strong communication and interpersonal skills.
Collaborate with data stewards, data owners, and IT teams for effective implementation.
Understanding of business processes and terminology – preferably logistics.
Experienced with Scrum and Agile methodologies.
Qualification
Bachelor’s degree in information technology or a related field. Equivalent experience may be considered.
Overall experience of 8-12 years in Data Engineering.
Mandatory Competencies
Cloud - AWS
Data Science - Databricks
Database - SQL
Data on Cloud - Azure Data Lake (ADL)
Agile - Agile
Data Analysis - Data Analysis
Big Data - PySpark
Data on Cloud - AWS S3
Data on Cloud - Redshift
ETL - AWS Glue
Python - Python
DevOps - CI/CD
Beh - Communication and collaboration
Perks and Benefits for Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
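As an illustration of the Airflow scheduling skill listed above, a hedged sketch of a small Airflow 2.x DAG (the schedule parameter assumes Airflow 2.4+); DAG ID, schedule, and task bodies are placeholders.

```python
# Hedged sketch: a two-step daily ingest DAG. Task bodies are stubs that
# would call the real extract/transform logic (e.g., Databricks jobs).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files into the bronze layer")   # placeholder

def transform():
    print("bronze -> silver cleansing")             # placeholder

with DAG(
    dag_id="daily_lakehouse_ingest",   # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                 # assumed cadence
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2                           # transform runs after extract succeeds
```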
Posted 1 week ago
5.0 years
0 Lacs
Noida
On-site
Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer. Engineering at Innovaccer With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we’re shaping the future and making a meaningful impact on the world. About the Role The technology that once promised to simplify patient care has brought more issues than anyone ever anticipated. At Innovaccer, we defeat this beast by making full use of all the data Healthcare has worked so hard to collect, and replacing long-standing problems with ideal solutions. Data is our bread and butter for innovation. We are looking for a Senior AI Engineer who understands healthcare data and can leverage the data to build algorithms to personalize treatments based on the clinical and behavioral history of patients. We are looking for a superstar who will define and build the next generation of predictive analytics tools in healthcare. A Day in the Life Design and build scalable AI platform architecture to support ML development, agentic frameworks, and robust self-serve AI pipelines. Develop agentic frameworks and a catalog of AI agents tailored to healthcare use cases. Design and deploy high-performance, low-latency AI applications. Build and optimize ML/DL models, including generative models like Transformers and GANs. Construct and manage data ingestion and transformation pipelines for scalable AI solutions. Conduct experiments, and statistical analysis, and derive insights to guide development. Collaborate with data scientists, engineers, product managers, and business stakeholders to translate AI innovations into real-world applications. Partner with business leaders and clients to understand pain points and co-create scalable AI-driven solutions. What You Need Master’s in Computer Science, Engineering, or a related field. 5+ years of software engineering experience with strong API development skills. 3+ years of experience in data science and at least 1+ year in building generative AI pipelines, agents, and RAG systems. Strong Python programming skills with enterprise application development and optimization. Experience with: LLMs, prompt engineering, and fine-tuning SLMs. Frameworks like LangChain, CrewAI, or Autogen (at least one is a must). Vector databases (e.g., FAISS, ChromaDB). Embedding models and Retrieval-Augmented Generation (RAG) design. Familiarity with at least one ML platform (Databricks, Azure ML, SageMaker). Bonus: Experience with Docker, Kubernetes, AWS/Azure, Snowflake, and healthcare data systems Preferred Skills Python – building scalable, performant AI applications. Experience with reinforcement learning and multi-agent systems. LLM optimization and deployment at scale. Familiarity with healthcare data and AI use cases. We offer competitive benefits to set you up for success in and outside of work. Here’s What We Offer Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days. Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition. Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered. 
Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury. Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only Creche Facility for children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices Where and how we work Our Noida office is situated in a posh techspace, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team. Innovaccer is an equal opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered. Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details. About Innovaccer Innovaccer Inc. is the data platform that accelerates innovation. The Innovaccer platform unifies patient data across systems and care settings and empowers healthcare organizations with scalable, modern applications that improve clinical, financial, operational, and experiential outcomes. Innovaccer’s EHR-agnostic solutions have been deployed across more than 1,600 hospitals and clinics in the US, enabling care delivery transformation for more than 96,000 clinicians, and helping providers work collaboratively with payers and life sciences companies. Innovaccer has helped its customers unify health records for more than 54 million people and generate over $1.5 billion in cumulative cost savings. The Innovaccer platform is the #1 rated Best-in-KLAS data and analytics platform by KLAS, and the #1 rated population health technology platform by Black Book. For more information, please visit innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, and innovaccer.com.
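To illustrate the vector-database familiarity this role calls for, a minimal ChromaDB sketch; the collection name and documents are hypothetical, and Chroma's default embedding function is assumed to be available.

```python
# Hedged sketch: index a few documents in an in-memory ChromaDB collection
# and run a semantic query. Document texts are toy placeholders.
import chromadb

client = chromadb.Client()                              # in-memory instance
docs = client.create_collection(name="patient_faqs")    # hypothetical name

docs.add(
    ids=["d1", "d2", "d3"],
    documents=[
        "How to read an A1C lab result ...",
        "Prior-authorization workflow steps ...",
        "Care-gap closure outreach scripts ...",
    ],
)

hits = docs.query(query_texts=["what does A1C mean"], n_results=1)
print(hits["documents"][0])   # best-matching document(s) for the query
```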
Posted 1 week ago
5.0 - 8.0 years
6 - 9 Lacs
Ahmedabad
On-site
About the Role:
Grade Level (for internal use): 10
The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.
The Impact: Enterprise Data Organization is seeking a Software Developer to carry out software design, development, and maintenance for data processing applications. This person would be part of a development team that manages and supports the internal and external applications supporting the business portfolio. This role expects a candidate to handle any data processing or big data application development. We have teams made up of people that learn how to work effectively together while working with the larger group of developers on our platform.
What’s in it for you:
Opportunity to contribute to the development of a world-class Platform Engineering team.
Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
Be part of a fast-paced, agile environment that processes massive volumes of data—ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks.
Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.
Responsibilities:
Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation.
Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions.
Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .Net Core, Databricks, Python, Scala, NiFi, SQL).
Build data models, conduct performance tuning, and apply data architecture concepts.
Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies.
Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality.
Provide operations support to resolve issues proactively and with utmost urgency.
Effectively manage time and multiple tasks.
Communicate effectively, especially in writing, with the business and other technical groups.
Basic Qualifications:
Bachelor's/Master’s Degree in Computer Science, Information Systems or equivalent.
Minimum 5 to 8 years of strong hands-on development experience in C#, .Net Core, cloud-native development, and MS SQL Server backend development.
Proficiency with object-oriented programming.
Nice to have: knowledge of Grafana, Kibana, big data, GitHub, EMR, Terraform, AI/ML.
Advanced SQL programming skills.
Highly recommended: skills in Databricks and Scala technologies.
Understanding of database performance tuning in large datasets.
Ability to manage multiple priorities efficiently and effectively within specific timeframes.
Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies.
Knowledge of fundamentals or the financial industry is highly preferred.
Experience in conducting application design and code reviews.
Proficiency with the following technologies:
Object-oriented programming
Programming languages (C#, .Net Core)
Cloud computing
Database systems (SQL, MS SQL)
Nice to have: NoSQL (Databricks, Scala, Python), scripting (Bash, Scala, Perl, PowerShell)
Preferred Qualifications:
Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP).
Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 317835 Posted On: 2025-07-09 Location: Ahmedabad, Gujarat, India
Posted 1 week ago
4.0 - 6.0 years
12 - 20 Lacs
Hyderabad, Chennai
Work from Office
AWS Data Engineer
Employment Type: Full-Time
Work Mode: Work From Office
Job Location: Chennai/Hyderabad
Walk-in Date: 26-July-2025, 11:00 am to 3:00 pm
Years of experience: 4 to 6 years
Notice: Immediate to 90 days
Venue: Agilisium Consulting, World Trade Center, Perungudi, Chennai.
Skillset: Python, PySpark, SQL, AWS, Databricks. Airflow is good to have.
Posted 1 week ago
6.0 - 11.0 years
16 - 31 Lacs
Chennai
Work from Office
Join our team as an AWS Data Engineer: design scalable data solutions and power insights with cutting-edge cloud tech. Drive innovation in a fast-paced environment where your skills shape the future of data engineering!
Job Title: Data Engineer
Years of experience: 6 to 12 years (minimum 5 years of relevant experience)
Work Mode: Work From Office, Chennai
Notice Period: Immediate to 30 days only
Key Skills: Python, SQL, AWS, Spark, Databricks, Data Modeling (mandatory)
Airflow: good to have
Posted 1 week ago