2.0 - 4.0 years
6 - 12 Lacs
Hyderabad
Work from Office
We are seeking experienced Data Analysts / Data Engineers with strong expertise in U.S. pharmaceutical commercial datasets to support critical Data Operations initiatives. This role focuses on onboarding third-party data, ensuring data quality, and implementing outlier detection techniques. Familiarity with ML/AI approaches for anomaly detection is highly desirable.

Key Responsibilities:
Pharma Data Integration: Work extensively with U.S. pharmaceutical commercial datasets. Ingest and onboard third-party data sources such as IQVIA, Symphony Health, and Komodo Health. Ensure alignment of data schemas, dictionary mapping, and metadata integrity.
Data Quality & Governance: Design and implement QC protocols for data integrity and completeness. Track data lineage and maintain proper documentation of data flows and transformations.
Outlier Detection & Analytics: Apply statistical or algorithmic techniques to identify anomalies in sales, claims, or patient-level records. Use ML/AI tools, where applicable, for automated outlier detection and trend analysis.
Collaboration & Reporting: Work cross-functionally with business teams, data scientists, and IT to ensure timely delivery of reliable data. Provide detailed reports and insights that support stakeholders' commercial decision-making.

Required Skills & Qualifications:
3+ years of experience in Pharmaceutical Data Operations, preferably with U.S. market data.
Strong hands-on experience with third-party commercial healthcare data sources (IQVIA, Symphony, Komodo, etc.).
Solid understanding of ETL pipelines, data ingestion frameworks, and metadata management.
Proficiency in SQL, Python, or R for data processing and quality checks.
Experience with outlier detection techniques, both statistical (Z-score, IQR, etc.) and ML-based (Isolation Forest, autoencoders, etc.).
Familiarity with Snowflake, Databricks, AWS, or similar cloud platforms is a plus.
Excellent problem-solving, documentation, and communication skills
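The statistical outlier techniques named above (Z-score and IQR) can be sketched in a few lines of plain Python. The data and thresholds here are purely illustrative, not taken from any real pharma dataset:

```python
import statistics

def zscore_outliers(values, threshold=3.0):
    """Flag points more than `threshold` sample standard deviations from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # a constant series has no outliers by this rule
    return [v for v in values if abs(v - mean) / stdev > threshold]

def iqr_outliers(values, k=1.5):
    """Flag points outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Illustrative weekly prescription counts with one suspicious spike.
weekly_rx = [120, 130, 125, 118, 122, 127, 900]
print(iqr_outliers(weekly_rx))  # → [900]
```

Note that on this series the Z-score rule would miss the spike at threshold 3 because the spike itself inflates the standard deviation (masking), which is one reason the role also asks for ML-based methods such as scikit-learn's `IsolationForest` for multivariate, claims-scale data.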
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Responsibilities:
Developing system integration scopes and objectives, involving all relevant stakeholders, and ensuring technical feasibility
Developing and implementing strategies to connect different systems, ensuring they can share data and functionality
Defining and managing APIs that allow different software applications to communicate with each other
Defining and managing asynchronous data exchange between applications and as part of data pipelines
Ensuring high data quality and integrity to avoid operational failures and improve system accuracy
Diving deep into the details to solve complex technical challenges
Leading a partner team of engineers, providing guidance, mentorship, and support to help them grow and succeed in their roles
Maintaining comprehensive system integration documentation
Overseeing the technical aspects of backend development, including code quality, system architecture, and performance optimization
Coordinating cross-functionally to optimize processes and ensure that backend systems integrate seamlessly
Implementing process improvements, especially cross-team processes, to enhance productivity and efficiency
Undertaking technical and functional evaluation of third-party COTS applications
Supporting project management activities, resource monitoring, and technical risk identification and mitigation

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. An MBA or other relevant postgraduate qualification is desirable.
Certifications: Certifications in cloud computing or data management would be advantageous, as would professional certifications in technology, project management, or related fields.
Experience: 10-15 years of experience in technology engineering roles, preferably within the aviation industry. Demonstrated track record of successfully implementing, integrating, and managing technology systems and applications. Proven experience in managing digital transformation initiatives and leveraging technology to enable business growth. Strong background in coordinating with cross-functional teams and managing complex technical projects.

Behavioural Skills:
Change management and adaptability
Problem-solving and analytical mindset
Outcome orientation and a go-getter approach
Stakeholder management and partnering skills
Team building and mentorship capabilities

Technical Skills:
Proficiency in at least one programming language or runtime such as Node.js, Python, Java, or JavaScript
Expertise in databases such as PostgreSQL, MS SQL, and MongoDB
Expertise in APIs, ESB (Enterprise Service Bus), and Kafka
Experience with data warehouse solutions such as Spark and Databricks
Familiarity with cloud platforms such as Azure
Knowledge of data analytics, business intelligence, and AI/ML technologies
Understanding of Agile methodologies, Jira, and Confluence
Knowledge of tools such as Docker and nginx
Knowledge of cybersecurity principles and practices
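The asynchronous data exchange responsibility above can be illustrated with a minimal producer/consumer sketch. Here a `queue.Queue` stands in for a real broker such as Kafka, and the `flight_id` field and its quality rule are hypothetical examples, not part of any actual system described in the posting:

```python
import json
import queue
import threading

events = queue.Queue()  # stand-in for a message broker topic

def produce(record: dict) -> None:
    events.put(json.dumps(record))  # serialize at the system boundary

def consume(out: list) -> None:
    while True:
        msg = events.get()
        if msg is None:        # poison pill: shut the consumer down
            break
        record = json.loads(msg)
        if "flight_id" not in record:  # basic data-quality gate
            continue                   # reject malformed events
        out.append(record)

received = []
worker = threading.Thread(target=consume, args=(received,))
worker.start()
produce({"flight_id": "AI101", "status": "boarding"})
produce({"status": "missing id"})  # dropped by the quality gate
events.put(None)
worker.join()
print(len(received))  # → 1
```

The same shape, producers and consumers decoupled by a durable topic with validation at the consuming edge, is what an ESB- or Kafka-based integration typically scales up.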
Posted 1 week ago
8.0 years
0 Lacs
Greater Hyderabad Area
Remote
Job Title: AI/ML Engineer / Data Scientist (Databricks focus)
Experience: 8+ years
Work type: Remote (India)

Key Responsibilities:
• Develop, deploy, and maintain scalable MLOps pipelines for both traditional ML and Generative AI use cases, leveraging Databricks (Unity Catalog, Delta Tables, Inference Tables, Mosaic AI).
• Operationalize large language models (LLMs) and other GenAI models, ensuring efficient prompt engineering, fine-tuning, and serving.
• Implement model tracking, versioning, and experiment management using MLflow.
• Build robust CI/CD pipelines for ML and GenAI workloads to automate testing, validation, and deployment to production.
• Use Vertex AI to manage training, deployment, and monitoring of ML and GenAI models in the cloud.
• Integrate high-quality, governed data pipelines that enable ML and Generative AI solutions with strong lineage and reproducibility.
• Design and enforce AI Governance frameworks covering model explainability, bias monitoring, data access, compliance, and audit trails.
• Collaborate with data scientists and GenAI teams to productionize prototypes and research into reliable, scalable products.
• Monitor model performance, usage, and drift, including GenAI-specific considerations such as hallucination checks, prompt/response monitoring, and user feedback loops.
• Stay current with best practices and emerging trends in MLOps and Generative AI.

Key Qualifications:
Must Have:
• 3+ years of experience in MLOps, ML Engineering, or a related field.
• Hands-on experience operationalizing ML and Generative AI models in production.
• Proficiency with Databricks (Unity Catalog, Delta Tables, Mosaic AI, Inference Tables).
• Experience with MLflow for model tracking, registry, and reproducibility.
• Strong understanding of Vertex AI pipelines and deployment services.
• Expertise in CI/CD pipelines for ML and GenAI workloads (e.g., GitHub Actions, Azure DevOps, Jenkins).
• Proven experience integrating and managing data pipelines for AI, ensuring data quality, versioning, and lineage.
• Solid understanding of AI Governance, model explainability, and responsible AI practices.
• Proficiency in Python, SQL, and distributed computing frameworks.
• Excellent communication and collaboration skills.
Nice to Have:
• Experience deploying and monitoring Large Language Models (LLMs) and prompt-driven AI workflows.
• Familiarity with vector databases, embeddings, and retrieval-augmented generation (RAG) architectures.
• Infrastructure-as-Code experience (Terraform, CloudFormation).
• Experience working in regulated industries (e.g., finance, retail) with compliance-heavy AI use cases.
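As one illustration of the GenAI monitoring mentioned above (hallucination checks on prompt/response pairs), a crude grounding score can be computed by measuring how much of a model's response is supported by the retrieved context. This is a toy heuristic for illustration only, not a production hallucination detector:

```python
def grounding_score(response: str, context: str) -> float:
    """Fraction of response tokens that also appear in the retrieved context."""
    def tokens(s: str) -> set:
        return {t.lower().strip(".,") for t in s.split()}
    resp, ctx = tokens(response), tokens(context)
    return len(resp & ctx) / len(resp) if resp else 1.0

# A response fully supported by its context scores 1.0.
print(grounding_score("Paris is the capital.", "Paris is the capital of France."))  # → 1.0
```

In a real RAG deployment this check would typically use embedding similarity or an LLM judge rather than token overlap, with low-scoring responses routed to logging and user-feedback review.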
Posted 1 week ago
5.0 years
0 Lacs
India
On-site
At Amaris Consulting, we’re on the lookout for bold, versatile, and forward-thinking individuals to join our Data & AI Center of Excellence as Data Consultants. Whether your strength lies in analytics, engineering, or machine learning—your expertise belongs here. What does it mean to be a Data Consultant at Amaris? As a Data Consultant, you’ll be at the heart of strategic and technical projects for top-tier organizations. From building scalable data pipelines to deploying cutting-edge ML models, your work will directly shape how clients turn raw data into real-world impact. You'll collaborate across teams, industries, and geographies—delivering solutions that matter. Who we’re looking for: Data Engineer You don’t just work with data—you build the engines that power data-driven products. You’re fluent in Python and SQL, and you know how to architect clean, scalable pipelines that deliver results. You’ve worked with AI-enabled solutions, integrating pre-trained models, embeddings, and computer vision into production environments. You love solving problems, thrive in fast-paced product teams, and feel right at home in client-facing settings and global, cross-functional collaborations. 
🔥 What You'll Do as a Data Consultant:
Work in cross-functional teams with engineers, scientists, analysts, and project managers
Build and optimize data pipelines for AI and product development use cases
Collaborate with AI teams to operationalize models, including vision and NLP-based pre-trained systems
Participate in client discussions to translate technical needs into valuable solutions
Ensure code quality and scalability using best practices in Python and SQL
Shape and implement technical solutions across cloud, hybrid, or on-prem environments
Support product development initiatives by embedding data capabilities into features
Contribute to internal R&D and knowledge-sharing efforts within the CoE

Our Environment & Tech Stack:
We're tech-agnostic and pragmatic: we adapt our stack to each client's needs. Some of the most used technologies include:
Languages: Python, SQL
AI & ML: Pre-trained models, embedding models, computer vision frameworks
Cloud platforms: Azure, AWS, GCP
Orchestration & Transformation: Airflow, dbt, Kedro
Big Data & Storage: Spark, Databricks, Snowflake
MLOps & DevOps: MLflow, Docker, Git, CI/CD pipelines
Product & API Development: REST APIs, microservices (bonus)

🎯 Your Profile:
4–5 years of experience as a Data Engineer
Excellent skills in Python and SQL
Experience with pre-trained models, embeddings, and computer vision
Exposure to product development and AI integration in live environments
Comfortable in client-facing roles and interacting with international teams
Strong communicator with the ability to explain complex topics to both technical and business audiences
Fluent in English; additional languages are a plus
Autonomous, proactive, and a continuous learner

🚀 Why Join our Data & AI Center of Excellence?
Work with major clients across Europe and globally on impactful projects
Join a community of 600+ data professionals in our Center of Excellence
Access continuous upskilling, tech exchanges, and mentorship opportunities
Grow into technical leadership, architecture, or specialized AI domains

💡 We are an independent company that values:
Agility – thrive in a flexible, dynamic, and stimulating environment
International scope – engage in daily cross-border collaboration and mobility in 60+ countries
Intrapreneurship – contribute to transversal topics or launch your own initiatives
Attentive management – benefit from personalized support and career development

Amaris Consulting is proud to be an equal-opportunity workplace. We are committed to promoting diversity and creating an inclusive work environment. We welcome applications from all qualified individuals, regardless of gender, orientation, background, or ability.
Posted 1 week ago
3.0 years
0 Lacs
Greater Nashik Area
On-site
Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Title: Data Scientist
Location: Bangalore
Reporting to: Senior Manager Analytics

Purpose of the role:
Contribute to the Data Science efforts of AB InBev's global forecasting team. The candidate will help build, interpret, and scale forecasting models across multiple ABI markets.

Key tasks & accountabilities:
Preferred industry exposure: CPG or Consulting, with 3+ years of experience (for consulting, the typical profile would be a Lead Consultant with the relevant experience described below).
Experience working in the domain of Forecasting Analytics, preferably in a CPG organization, with a demonstrated capability of successfully deploying analytics solutions and products for internal or external clients.
Has interacted with senior internal or external stakeholders around project/service conceptualization and delivery planning.
Exposure to AI/ML methodologies with prior hands-on experience in ML concepts such as forecasting, clustering, regression, classification, optimization, and deep learning.
Experience with data manipulation using tools such as Excel and Python.
Strong proficiency in Object-Oriented Programming (OOP) principles and design patterns.
Good understanding of data structures and algorithms as they relate to machine learning tasks.
Experience with version control tools such as Git.
Familiarity with MLOps and containerization tools like Docker would be a plus.
Consistently displays an intent for problem solving.

Qualifications, Experience, Skills:
Level of educational attainment required: B.Tech/BE, Master's in Statistics, Economics, or Econometrics, or MBA.
Previous work experience: Minimum 3 years of relevant experience.
Technical skills required:
Hands-on experience in data manipulation using Excel and Python.
Expert-level proficiency in Python, including writing end-to-end ML or data pipelines.
Proficient in applying ML concepts and forecasting techniques to solve end-to-end business problems.
Familiarity with the Azure tech stack, Databricks, and MLflow on any cloud platform.
Other skills required:
Passion for solving problems using data
Detail-oriented, analytical, and inquisitive
Ability to effectively communicate and present information at various levels of an organization
Ability to work independently and with others

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
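Of the forecasting techniques the role calls for, simple exponential smoothing is among the most basic. A minimal sketch in plain Python follows; the series and smoothing factor are illustrative:

```python
def exp_smooth_forecast(series, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing.

    level_t = alpha * y_t + (1 - alpha) * level_{t-1}
    Recent observations are weighted more heavily as alpha approaches 1.
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Two observations with alpha=0.5 average them equally.
print(exp_smooth_forecast([10, 20], alpha=0.5))  # → 15.0
```

In practice a forecasting team would reach for library implementations (e.g., statsmodels' Holt-Winters family) with seasonality and trend terms, tracked as experiments in MLflow as the posting describes.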
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with:
You will lead a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to shape the vision and drive the execution of transformative data initiatives that make a real impact.

Let me tell you about the role:
As a Senior Master Data Platform Services Manager, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What you will deliver:
Design and implement enterprise technology architecture, security frameworks, and platform engineering.
Strengthen platform security and ensure compliance with industry standards and regulations.
Optimize system performance, availability, and scalability.
Advance enterprise modernization and drive seamless integration with enterprise IT.
Establish governance, security standards, and risk management strategies.
Develop automated security monitoring, vulnerability assessments, and identity management solutions.
Drive adoption of CI/CD, DevOps, and Infrastructure-as-Code methodologies.
Enhance disaster recovery and resilience planning for enterprise platforms.
Partner with technology teams and external vendors to align enterprise solutions with business goals.
Lead and mentor engineering teams, fostering a culture of innovation and excellence.
Shape strategies for enterprise investments, cybersecurity risk mitigation, and operational efficiency.
Collaborate across teams to implement scalable solutions and long-term technology roadmaps.

What you will need to be successful (experience and qualifications):
Technical skills we need from you:
Bachelor's degree in technology, engineering, or a related technical discipline.
6+ years of experience in enterprise technology, security, and operations in large-scale global environments.
Experience implementing CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (AWS Cloud Development Kit, Azure Bicep, etc.).
Deep knowledge of ITIL, Agile, and enterprise IT governance frameworks.
Proficiency in programming languages such as Python, Java, or Scala.
Experience with data pipeline frameworks (e.g., Apache Airflow, Kafka, Spark) and cloud-based data platforms (AWS, GCP, Azure).
Expertise in database technologies (SQL, NoSQL, data lakes) and data modeling principles.

Essential skills:
Proven technical expertise in Microsoft Azure, AWS, Databricks, and Palantir.
Strong understanding of data ingestion, pipelines, governance, security, and visualization.
Experience designing, deploying, and optimizing multi-cloud data platforms that support large-scale, cloud-native workloads, balancing cost efficiency with performance and resilience.
Hands-on performance tuning, data indexing, and distributed query optimization.
Experience with real-time and batch data streaming architectures.

Skills that set you apart:
Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp:
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with:
You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let me tell you about the role:
As a Senior Data Platform Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What you will deliver:
Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
Help drive operational excellence by supporting system performance, availability, and scalability.
Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
Support disaster recovery planning and high availability for enterprise platforms.
Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
Provide guidance on platform investments, security risks, and operational improvements.
Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications):
Technical skills we need from you:
Bachelor's degree in technology, engineering, or a related technical discipline.
3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential skills:
Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
Understanding of data ingestion pipelines, governance, security, and data visualization.
Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
Familiarity with performance tuning, data indexing, and distributed query optimization.
Exposure to both real-time and batch data streaming architectures.

Skills that set you apart:
Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp:
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with:
You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let me tell you about the role:
As a Senior Data Tooling Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What you will deliver:
Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
Help drive operational excellence by supporting system performance, availability, and scalability.
Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
Support disaster recovery planning and high availability for enterprise platforms.
Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
Provide guidance on platform investments, security risks, and operational improvements.
Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications):
Technical skills we need from you:
Bachelor's degree in technology, engineering, or a related technical discipline.
3–5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential skills:
Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
Understanding of data ingestion pipelines, governance, security, and data visualization.
Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
Familiarity with performance tuning, data indexing, and distributed query optimization.
Exposure to both real-time and batch data streaming architectures.

Skills that set you apart:
Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp:
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description Decision Point develops analytics and big data solutions for CPG, Retail, and Consumer-focused industries, working with global Fortune 500 clients. We provide analytical insights and solutions that help develop sales and marketing strategies in the Retail and CPG Industry by leveraging diverse sources of data including Point of Sale data, syndicated category data, primary shipments, and other similar sources. Decision Point was founded by Ravi Shankar along with his classmates from IIT Madras, who have diverse experience across the CPG and Marketing Analytics domain. At Decision Point, you will meet data scientists, business consultants, and tech-savvy engineers passionate about extracting every ounce of value from data for our clients. Role Description This is a full-time on-site role for a Lead Data Engineer, located in Gurugram. The Lead Data Engineer will be responsible for designing, developing, and maintaining data pipelines, building data models, implementing ETL processes, and managing data warehousing solutions. The role also includes data analytics responsibilities to derive actionable insights for our clients. The candidate will engage with cross-functional teams to understand data requirements and deliver robust data solutions. 
Key Responsibilities:
Design and build scalable data pipelines using Databricks and Microsoft Fabric.
Develop and maintain robust ETL processes for efficient data movement.
Implement and optimize data models – Dimensional & Data Vault.
Work with data warehousing solutions to ensure clean, consumable data for analytics teams.
Collaborate with cross-functional teams to deliver end-to-end data flows (ingestion → transformation → consumption).
Ensure data accuracy, performance tuning, and problem resolution.

Required Skills:
Databricks (hands-on).
Microsoft Fabric (strong working knowledge).
SQL – advanced querying, stored procedures, and tuning.
PySpark / Python – for data transformation.
Basic knowledge of Azure Data Factory (ADF).
Experience with data modeling and data warehousing.
Strong analytical and problem-solving abilities.
Ability to work in a fast-paced, collaborative environment.
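The dimensional-modeling responsibility above centers on loading fact tables against conformed dimensions. As a minimal, library-free sketch of that load step (in practice this would run as a PySpark job on Databricks; all table and column names here are hypothetical):

```python
# Minimal illustration of a dimensional-model load step: resolving natural keys
# to surrogate keys before inserting facts, and routing unmatched rows to an
# error queue. Plain Python keeps the sketch self-contained.

def load_facts(sales_rows, product_dim):
    """Attach surrogate keys from the product dimension to incoming sales facts."""
    # Build a natural-key -> surrogate-key lookup from the dimension table.
    key_map = {row["product_code"]: row["product_sk"] for row in product_dim}
    facts, rejects = [], []
    for row in sales_rows:
        sk = key_map.get(row["product_code"])
        if sk is None:
            rejects.append(row)  # unknown product: route to error queue
        else:
            facts.append({"product_sk": sk, "amount": row["amount"]})
    return facts, rejects

product_dim = [
    {"product_sk": 1, "product_code": "A100"},
    {"product_sk": 2, "product_code": "B200"},
]
sales_rows = [
    {"product_code": "A100", "amount": 50.0},
    {"product_code": "Z999", "amount": 10.0},  # no matching dimension row
]

facts, rejects = load_facts(sales_rows, product_dim)
print(facts)    # [{'product_sk': 1, 'amount': 50.0}]
print(rejects)  # [{'product_code': 'Z999', 'amount': 10.0}]
```

The same pattern maps directly onto a Spark broadcast join of the fact stream against the dimension table.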
Posted 1 week ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Your Role
The technology that once promised to simplify patient care has brought more issues than anyone ever anticipated. At Innovaccer, we defeat this beast by making full use of all the data healthcare has worked so hard to collect, and replacing long-standing problems with ideal solutions. Data is our bread and butter for innovation. We are looking for a Staff Data Scientist who understands healthcare data and can leverage it to build algorithms that personalize treatments based on the clinical and behavioral history of patients. We are looking for a superstar who will define and build the next generation of predictive analytics tools in healthcare.

Analytics at Innovaccer
Our analytics team is dedicated to weaving analytics and data science magic across our products. They are the owners and custodians of the intelligence behind our products. With their expertise and innovative approach, they play a crucial role in building various analytical models (including descriptive, predictive, and prescriptive) to help our end-users make smart decisions.
Their focus on continuous improvement and cutting-edge methodologies ensures that they are always creating market-leading solutions that propel our products to new heights of success.

A Day in the Life
Design and lead the development of various artificial intelligence initiatives to help improve the health and wellness of patients.
Work with business leaders and customers to understand their pain points and build large-scale solutions for them.
Define the technical architecture to productize Innovaccer's machine-learning algorithms and take them to market through partnerships with different organizations.
Break down complex business problems into machine-learning problems and design solution workflows.
Work with our data platform and applications teams to help them successfully integrate data science capabilities or algorithms into their products/workflows.
Work with development teams to build tools for repeatable data tasks that accelerate and automate the development cycle.
Define and execute on the quarterly roadmap.

What You Need
Master's in Computer Science, Computer Engineering, or other relevant fields (PhD preferred).
7+ years of experience in Data Science (healthcare experience is a plus).
Strong written and spoken communication skills.
Strong hands-on experience in Python, building enterprise applications along with optimization techniques.
Strong experience with deep learning techniques to build NLP/computer vision models as well as state-of-the-art GenAI pipelines; knowledge of implementing agentic workflows is a plus.
Demonstrable experience deploying deep learning models in production at scale with iterative improvements; requires hands-on expertise with at least one deep learning framework such as PyTorch or TensorFlow.
Keen interest in research and staying updated with key advancements in AI and ML in the industry.
Deep understanding of classical ML techniques (Random Forests, SVM, Boosting, Bagging) and building training and evaluation pipelines.
Demonstrated experience with global and local model explainability using LIME, SHAP, and associated techniques.
Hands-on experience with at least one ML platform among Databricks, Azure ML, or SageMaker.
Experience in developing and deploying production-ready models.
Knowledge of implementing an MLOps framework.
A customer-focused attitude through conversations and documentation.

We offer competitive benefits to set you up for success in and outside of work.

Here's What We Offer
Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only
Creche Facility for Children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices

Where And How We Work
Our Noida office is situated in a posh techspace, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team. Innovaccer is an equal-opportunity employer. We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered.
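The role above names LIME and SHAP for model explainability. Those are specific libraries; as a dependency-free illustration of the perturbation idea that underlies such techniques, the following sketch computes a crude permutation importance: shuffle one feature at a time and measure how much a toy model's accuracy drops. The model and data are entirely hypothetical.

```python
# Permutation importance: a simple, model-agnostic explainability baseline.
# Shuffling a feature breaks its link to the target; the resulting accuracy
# drop estimates how much the model relies on that feature.
import random

def accuracy(model, X, y):
    return sum(model(x) == t for x, t in zip(X, y)) / len(y)

def permutation_importance(model, X, y, n_features, seed=0):
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    importances = []
    for j in range(n_features):
        col = [x[j] for x in X]
        rng.shuffle(col)  # break the feature/target association
        X_perm = [x[:j] + [c] + x[j + 1:] for x, c in zip(X, col)]
        importances.append(base - accuracy(model, X_perm, y))
    return importances

# Toy model that only looks at feature 0, so feature 1 gets zero importance.
model = lambda x: int(x[0] > 0.5)
X = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.9]]
y = [1, 0, 1, 0]
imp = permutation_importance(model, X, y, n_features=2)
```

SHAP and LIME refine this idea with principled attributions per prediction rather than a single global score per feature.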
Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our Px department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details. About Innovaccer Innovaccer Inc. is the data platform that accelerates innovation. The Innovaccer platform unifies patient data across systems and care settings, and empowers healthcare organizations with scalable, modern applications that improve clinical, financial, operational, and experiential outcomes. Innovaccer's EPx-agnostic solutions have been deployed across more than 1,600 hospitals and clinics in the US, enabling care delivery transformation for more than 96,000 clinicians, and helping providers work collaboratively with payers and life sciences companies. Innovaccer has helped its customers unify health records for more than 54 million people and generate over $1.5 billion in cumulative cost savings. The Innovaccer platform is the #1 rated Best-in-KLAS data and analytics platform by KLAS, and the #1 rated population health technology platform by Black Book. For more information, please visit innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, and innovaccer.com.
Posted 1 week ago
8.0 - 10.0 years
10 - 12 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are seeking a Senior .NET Full Stack Developer with 8–10 years of experience for a full-time, 6-month remote role. The ideal candidate must possess a strong mix of backend, frontend, data, mobile, and CMS skill sets. Proficiency in Azure, .NET 8, C#, React JS, Tailwind CSS, Python, Power BI, Azure SQL, React Native, and CMS platforms like Sitecore, Tridion, or WordPress is essential. The role demands hands-on development experience and the ability to deliver scalable, end-to-end solutions. Strong analytical thinking, adaptability, and collaboration skills are a must. Location: Remote (Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai)
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Analytics – JD (Azure DE) EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 61,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa. EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 12,000 data scientists and analysts assist client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics. 
Job Title: Consultant / Senior Consultant – Azure Data Engineering
Location: India – Gurgaon preferred
Industry: Insurance Analytics & AI Vertical

Role Overview:
We are seeking a hands-on Consultant / Senior Consultant with strong expertise in Azure-based data engineering to support end-to-end development and delivery of data pipelines for our insurance clients. The ideal candidate will have a deep understanding of Azure Data Factory, ADLS, Databricks (preferably with DLT and Unity Catalog), SQL, and Python, and be comfortable working in a dynamic, client-facing environment. This is a key offshore role requiring both technical execution and solution-oriented thinking to support modern data platform initiatives.
Collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business requirements.
Demonstrate decision-making, analytical, and problem-solving abilities.
Strong verbal and written communication skills to manage client discussions.
Familiarity with Agile methodologies: daily scrum, sprint planning, backlog refinement.

Key Responsibilities & Skillsets:
Design and develop scalable and efficient data pipelines using Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).
Build and maintain Databricks notebooks for data ingestion, transformation, and quality checks, using Python and SQL.
Work with Delta Live Tables (DLT) and Unity Catalog (preferred) to improve pipeline automation, governance, and performance.
Collaborate with data architects, analysts, and onshore teams to translate business requirements into technical specifications.
Troubleshoot data issues, ensure data accuracy, and apply best practices in data engineering and DevOps.
Support the migration of legacy SQL pipelines to modern Python-based frameworks.
Ensure adherence to data security, compliance, and performance standards, especially within insurance domain constraints.
Provide documentation, status updates, and technical insights to stakeholders as required.
Excellent communication skills and stakeholder management.

Required Skills & Experience:
3–7 years of strong hands-on experience in data engineering with a focus on Azure cloud technologies.
Proficient in Azure Data Factory, Databricks, ADLS Gen2, and working knowledge of Unity Catalog.
Strong programming skills in both SQL and Python, especially within Databricks notebooks. PySpark expertise is good to have.
Experience in Delta Lake / Delta Live Tables (DLT) is a plus.
Good understanding of ETL/ELT concepts, data modeling, and performance tuning.
Exposure to Insurance or Financial Services data projects is highly preferred.
Strong communication and collaboration skills in an offshore delivery model.

Preferred Skills & Experience:
Experience working in Agile/Scrum teams.
Familiarity with Azure DevOps, Git, and CI/CD practices.
Certifications in Azure Data Engineering (e.g., DP-203) or Databricks.

What we offer:
EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of businesses that our clients engage in. You will also learn effective teamwork and time-management skills, key aspects for personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance/coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. Sky is the limit for our team members.
The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond. "EOE/Minorities/Females/Vets/Disabilities"
Posted 1 week ago
9.0 - 14.0 years
30 - 45 Lacs
Chennai, Bengaluru
Work from Office
Key Responsibilities
Lead and mentor the team, define goals, ensure timely project delivery, and manage code reviews.
Design scalable data architectures, select appropriate technologies, and ensure compliance with data security regulations.
Build and optimize ETL/ELT pipelines, automate workflows, and ensure data quality.
Hands-on experience with Databricks for building and managing scalable data pipelines.
Manage cloud-based infrastructure, implement IaC, and optimize costs and availability.
Work with business stakeholders to translate requirements into technical solutions, track project milestones, and manage risks using Agile.
Stay updated on new technologies, drive innovation, and optimize existing systems.
Maintain documentation, share knowledge, and conduct team training sessions.

Educational Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
9+ years of experience in Data Engineering, with at least 3+ years in an architectural role.
Strong expertise in data engineering tools and technologies (e.g., Apache Spark, cloud platforms (AWS, GCP, or Azure), SQL, Python).
Proficiency in any cloud platform (e.g., AWS, Azure, GCP) and their data services.
Experience with data modeling, ETL/ELT processes, and data warehousing solutions.
Knowledge of distributed systems, big data technologies, and real-time data processing.
Strong leadership, communication, and problem-solving skills.
Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Terraform).
Understanding of data governance, security, and compliance requirements.
Posted 1 week ago
5.0 - 10.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Notice period: strictly immediate to 30 days.
Experience range: 5–12 years.
Location: PAN India

JD
5+ years of experience using Azure.
Strong proficiency in Databricks.
Experience with PySpark.
Proficiency in SQL or T-SQL.
Experience with Azure service components like Azure Data Factory, Azure Data Lake, Databricks, SQL DB, and SQL Server.
Databricks Jobs for efficient data processing, ETL tasks, and report generation.
Hands-on experience with scripting languages such as Python for data processing and manipulation.

Key responsibilities:
Leverage Databricks to set up scalable data pipelines that integrate with a variety of data sources and cloud platforms.
Participate in code and design reviews to maintain high development standards.
Optimize data querying layers to enhance performance and support analytical requirements.
Develop end-to-end automations in the Azure stack for ETL workflows and data quality validations.
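The data quality validations mentioned above typically boil down to rule checks applied to each batch before it is loaded. A minimal sketch in plain Python (in the actual stack this would be a Databricks/PySpark job; the rule names, field names, and thresholds are hypothetical):

```python
# A minimal data-quality validation step: apply named rules to each row and
# report which rows violate which rule, so bad batches can be quarantined.

def validate(rows):
    """Return a dict of rule name -> list of offending row indices."""
    failures = {"null_id": [], "negative_amount": [], "bad_date": []}
    for i, r in enumerate(rows):
        if r.get("id") is None:
            failures["null_id"].append(i)
        if r.get("amount", 0) < 0:
            failures["negative_amount"].append(i)
        if not str(r.get("date", "")).startswith("20"):
            failures["bad_date"].append(i)
    # Drop rules with no violations so the report stays readable.
    return {rule: idx for rule, idx in failures.items() if idx}

rows = [
    {"id": 1, "amount": 100.0, "date": "2024-01-31"},
    {"id": None, "amount": -5.0, "date": "2024-02-01"},   # two violations
    {"id": 3, "amount": 20.0, "date": "1999-12-31"},      # bad date
]
report = validate(rows)
print(report)  # {'null_id': [1], 'negative_amount': [1], 'bad_date': [2]}
```

On Spark the same rules would usually be expressed as filter predicates so they run distributed over the full dataset.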
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position: Technical Data Analyst (Snowflake + Python + SQL + Databricks)
Client: One of our prestigious clients
Locations: Pune/Hyderabad
Mode of hiring: Full-time/Permanent
Mode of interview: Face to face
Experience: 6+ years
Budget: 28–33 LPA (based on experience)
Notice period: 0–15 days (only candidates serving notice period)
Share your CV 📧: sathish.m@tekgence.com
Note: PF (UAN) is mandatory (no dual employment or overlap)

We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting. The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency.

Key Responsibilities:
1. Data Analysis & Reporting:
- Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
- Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams.
- Gather requirements from finance and accounting stakeholders to design and deliver actionable insights.
2. Data Transformation & Aggregation:
- Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance sheet look-forward views.
- Ensure data accuracy and consistency during the migration from Snowflake to Databricks.
- Collaborate with the data engineering team to optimize data ingestion and transformation processes.
3. Data Integration & ERP Collaboration:
- Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated.
- Work with cross-functional teams to ensure seamless data flow between systems.
4. Data Ingestion & Tools:
- Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required).
- Troubleshoot and resolve data-related issues in collaboration with the data engineering team.

Additional Qualifications:
- 3+ years of experience as a Data Analyst or similar role, preferably in a financial or accounting context.
- Strong proficiency in SQL and experience with Snowflake and Databricks.
- Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets).
- Familiarity with Fivetran or similar data ingestion tools.
- Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements).
- Experience with data transformation and aggregation in a cloud-based environment.
- Strong communication skills to collaborate with finance and accounting teams.
- Nice-to-have: Experience with NetSuite ERP or similar financial systems.
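The month-end aggregation work described above is, at its core, grouping journal movements by account and period. A minimal sketch using the stdlib sqlite3 module (Snowflake or Databricks SQL in practice; the journal table and account names are hypothetical):

```python
# Month-end close sketch: net movement per account per period from a journal
# of signed amounts, using an in-memory SQLite database for self-containment.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE journal (account TEXT, period TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO journal VALUES (?, ?, ?)",
    [
        ("cash", "2024-01", 1000.0),
        ("cash", "2024-01", -250.0),
        ("revenue", "2024-01", -1000.0),
        ("cash", "2024-02", 500.0),
    ],
)

# Net movement per account per period -- the core of a month-end close view.
rows = conn.execute(
    """
    SELECT account, period, SUM(amount) AS net
    FROM journal
    GROUP BY account, period
    ORDER BY account, period
    """
).fetchall()
print(rows)
# [('cash', '2024-01', 750.0), ('cash', '2024-02', 500.0), ('revenue', '2024-01', -1000.0)]
```

A balance-sheet look-forward view extends this with a running cumulative sum over periods (a window function in Snowflake/Databricks SQL).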
Posted 1 week ago
4.0 years
6 - 10 Lacs
Gurgaon
On-site
About Us We turn customer challenges into growth opportunities. Material is a global strategy partner to the world’s most recognizable brands and innovative companies. Our people around the globe thrive by helping organizations design and deliver rewarding customer experiences. We use deep human insights, design innovation and data to create experiences powered by modern technology. Our approaches speed engagement and growth for the companies we work with and transform relationships between businesses and the people they serve. Srijan, a Material company, is a renowned global digital engineering firm with a reputation for solving complex technology problems using their deep technology expertise and leveraging strategic partnerships with top-tier technology partners Job Summary: We are seeking a Senior Data Engineer – Databricks with a strong development background in Azure Databricks and Python, who will be instrumental in building and optimising scalable data pipelines and solutions across the Azure ecosystem. This role requires hands-on development experience with PySpark , data modelling, and Azure Data Factory. You will collaborate closely with data architects, analysts, and business stakeholders to ensure reliable and high-performance data solutions. Experience Required: 4+ Years Lead/Senior Data Engineer (Microsoft Azure, Databricks, Data Factory, Data Engineer, Data Modelling) Key Responsibilities: Develop and Maintain Data Pipelines: Design, implement, and optimise scalable data pipelines using Azure Databricks (PySpark) for both batch and streaming use cases. Azure Platform Integration: Work extensively with Azure services including Data Factory , ADLS Gen2 , Delta Lake , and Azure Synapse for end-to-end data pipeline orchestration and storage. Data Transformation & Processing: Write efficient, maintainable, and reusable PySpark code for data ingestion, transformation, and validation processes within the Databricks environment. 
Collaboration: Partner with data architects, analysts, and data scientists to understand requirements and deliver robust, high-quality data solutions. Performance Tuning and Optimisation: Optimise Databricks cluster configurations, notebook performance, and resource consumption to ensure cost-effective and efficient data processing. Testing and Documentation: Implement unit and integration tests for data pipelines. Document solutions, processes, and best practices to enable team growth and maintainability. Security and Compliance: Ensure data governance, privacy, and compliance are upheld across all engineered solutions, following Azure security best practices. Preferred Skills : Strong hands-on experience with Delta Lake , including table management, schema evolution, and implementing ACID-compliant pipelines. Skilled in developing and maintaining Databricks notebooks and jobs for large-scale batch and streaming data processing. Experience writing modular, production-grade PySpark and Python code , including reusable functions and libraries for data transformation. Experience in streaming data ingestion and Structured Streaming in Databricks for near real-time data solutions. Knowledge of performance tuning techniques in Spark – including job optimization, caching, and partitioning strategies. Exposure to data quality frameworks and testing practices (e.g., pytest , data validation libraries, custom assertions). Basic understanding of Unity Catalog for managing data governance, access controls, and lineage tracking from a developer’s perspective. Familiarity with Power BI - able to structure data models and views in Databricks or Synapse to support BI consumption .
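The posting above calls for unit and integration tests on data pipelines (e.g., with pytest). A minimal sketch of that practice: a small transformation function plus a pytest-style test for it. The column names are hypothetical; in Databricks the same pattern applies to functions that operate on Spark DataFrames.

```python
# A testable transformation: keep only the most recent record per id.
# Writing pipeline logic as plain functions like this is what makes it
# unit-testable outside the cluster.

def deduplicate_latest(rows):
    """Keep only the most recent record per id, based on an 'updated' field."""
    latest = {}
    for r in rows:
        cur = latest.get(r["id"])
        if cur is None or r["updated"] > cur["updated"]:
            latest[r["id"]] = r
    return sorted(latest.values(), key=lambda r: r["id"])

def test_deduplicate_latest():
    rows = [
        {"id": 1, "updated": "2024-01-01", "v": "old"},
        {"id": 1, "updated": "2024-03-01", "v": "new"},
        {"id": 2, "updated": "2024-02-01", "v": "only"},
    ]
    out = deduplicate_latest(rows)
    assert [r["v"] for r in out] == ["new", "only"]

test_deduplicate_latest()  # would normally be collected and run by pytest
```

Keeping transformations pure (data in, data out) also makes them easy to reuse across batch and Structured Streaming jobs.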
Posted 1 week ago
0 years
2 - 9 Lacs
Gurgaon
On-site
Data Engineering Specialist
YOE: 7+ years

Role Purpose
The purpose of the Data Engineer role is to build and unit test code for projects and programmes on the Azure Cloud Data and Analytics Platform.

Key Accountabilities
Analyse business requirements and support/create design for requirements.
Build and deploy new or updated data mappings, sessions, and workflows in the Azure Cloud Platform, with a key focus on Azure Databricks.
Develop performant and scalable code.
Perform ETL routines including performance tuning, troubleshooting, support, and capacity estimation.
Conduct thorough testing of ETL code changes to ensure quality deliverables.
Provide day-to-day support and mentoring to end users interacting with the data.
Profile and understand large volumes of source data, including structured and semi-structured/web activity data.
Analyse defects and provide timely fixes.
Provide release notes for deployments.
Support release activities.
Demonstrate a problem-solving attitude.
Continuously develop technical skills, especially within the Azure platform.

Functional / Technical Skills
Experienced in ETL tools and data projects.
Recent Azure experience with strong knowledge of Azure Databricks (Python/SQL).
Good knowledge of SQL and Python.
Strong analytical skills.
Knowledge of Azure DevOps.
Experience with Azure Databricks and Logic Apps (highly desirable).
Experience with Python programming (highly desirable).
Experience with Azure Functions (a plus).

Decision Making Authority / Impact
Responsible for day-to-day decisions related to coding and unit testing.

Immediate joiners or candidates serving a notice period of up to 1 month are preferred. If you are interested in applying for this role, send your CV to bhavya.vemuri@invokhr.com
Posted 1 week ago
1.0 - 3.0 years
2 - 7 Lacs
Gurgaon
On-site
Donaldson is committed to solving the world's most complex filtration challenges. Together, we make cool things. As an established technology and innovation leader, we are continuously evolving to meet the filtration needs of our changing world. Join a culture of collaboration and innovation that matters and a chance to learn, effect change, and make meaningful contributions at work and in communities.

We are seeking a skilled and motivated Data Engineer II to join the Corporate Technology Data Engineering Team. This role is important for developing and sustaining our data infrastructure, which supports a wide range of R&D, sensor-based, and modeling technologies. The Data Engineer II will design and maintain pipelines that enable the use of complex datasets. This position directly empowers faster decision making by building trustworthy data flows and access for engineers and scientists.

Primary Role Responsibilities:
Develop and maintain data ingestion and transformation pipelines across on-premise and cloud platforms.
Develop scalable ETL/ELT pipelines that integrate data from a variety of sources (e.g., form-based entries, SQL databases, Snowflake, SharePoint).
Collaborate with data scientists, data analysts, simulation engineers, and IT personnel to deliver data engineering and predictive data analytics projects.
Implement data quality checks, logging, and monitoring to ensure reliable operations.
Follow and maintain data versioning, schema evolution, and governance controls and guidelines.
Help administer Snowflake environments for cloud analytics.
Work with more senior staff to improve solution architectures and automation.
Stay updated with the latest data engineering technologies and trends.
Participate in code reviews and knowledge-sharing sessions.
Participate in and plan new data projects that impact business and technical domains.
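One way to sketch the "data quality checks, logging, and monitoring" responsibility above: an ingestion step that counts what it accepts and rejects and logs the result, so operations can alert on the reject rate. Field names and the threshold are hypothetical; in the sensor-data context described, the same pattern would run inside the pipeline framework.

```python
# Instrumented ingestion: reject malformed rows, log a summary, and surface a
# warning when the reject rate crosses an operational threshold.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def ingest(rows, max_reject_rate=0.2):
    good = [r for r in rows if r.get("sensor_id") and r.get("value") is not None]
    stats = {"total": len(rows), "accepted": len(good),
             "rejected": len(rows) - len(good)}
    rate = stats["rejected"] / stats["total"] if rows else 0.0
    if rate > max_reject_rate:
        log.warning("reject rate %.0f%% exceeds threshold", rate * 100)
    else:
        log.info("ingested %d/%d rows", stats["accepted"], stats["total"])
    return good, stats

rows = [
    {"sensor_id": "s1", "value": 3.2},
    {"sensor_id": None, "value": 1.0},   # rejected: missing sensor_id
    {"sensor_id": "s2", "value": None},  # rejected: missing value
    {"sensor_id": "s3", "value": 7.7},
]
good, stats = ingest(rows)
```

Emitting the stats dict alongside the log line is what makes the step monitorable: a scheduler or dashboard can track reject counts per run without parsing logs.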
Required Qualifications & Relevant Experience:
Bachelor's or master's degree in computer science, data engineering, or a related field.
1–3 years of experience in data engineering, ETL/ELT development, and/or backend software engineering.
Demonstrated expertise in Python and SQL.
Demonstrated experience working with data lakes and/or data warehouses (e.g., Snowflake, Databricks, or similar).
Familiarity with source control and development practices (e.g., Git, Azure DevOps).
Strong problem-solving skills and eagerness to work with cross-functional, globalized teams.

Preferred Qualifications (in addition to the required qualifications):
Working experience and knowledge of scientific and R&D workflows, including simulation data and LIMS systems.
Demonstrated ability to balance operational support and longer-term project contributions.
Experience with Java.
Strong communication and presentation skills.
Motivated and self-driven learner.

Employment opportunities for positions in the United States may require use of information which is subject to the export control regulations of the United States. Hiring decisions for such positions are required by law to be made in compliance with these regulations. Applicants for employment opportunities in other countries must be able to meet the comparable export control requirements of that country and of the United States.

Donaldson Company has been made aware that there are several recruiting scams that are targeting job seekers. These scams have attempted to solicit money for job applications and/or collect confidential information. Donaldson will never solicit money during the application or recruiting process. Donaldson only accepts online applications through our Careers | Donaldson Company, Inc. website, and any communication from a Donaldson recruiter would be sent using a donaldson.com email address.
If you have any questions about the legitimacy of an employment opportunity, please reach out to talentacquisition@donaldson.com to verify that the communication is from Donaldson. Our policy is to provide equal employment opportunities to all qualified persons without regard to race, gender, color, disability, national origin, age, religion, union affiliation, sexual orientation, veteran status, citizenship, gender identity and/or expression, or other status protected by law.
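The posting above asks for "data quality checks, logging, and monitoring"; a minimal sketch of that idea in plain Python might look like the following. The function name, record shape, and field names are illustrative assumptions, not taken from the posting:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq")

def run_quality_checks(rows, required_fields):
    """Return only the rows that pass basic completeness checks, logging each rejection."""
    passed = []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            # Monitoring hook: rejected rows are logged with the reason for failure.
            log.warning("row %d rejected: missing %s", i, missing)
        else:
            passed.append(row)
    return passed

# Hypothetical sensor records, as one example of the R&D data mentioned above.
records = [
    {"sensor_id": "A1", "reading": 7.2},
    {"sensor_id": "", "reading": 3.1},      # fails completeness
    {"sensor_id": "B2", "reading": None},   # fails completeness
]
clean = run_quality_checks(records, ["sensor_id", "reading"])
```

In a real pipeline the rejected-row counts would typically feed a monitoring dashboard rather than only a log stream.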
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon
On-site
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Microsoft Azure Databricks Good-to-have skills: NA Minimum 3 year(s) of experience is required Educational Qualification: Any BTech degree Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications using Microsoft Azure Databricks. Your typical day will involve collaborating with the team to understand business requirements, designing and developing applications, and ensuring the applications meet quality standards and performance expectations. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Collaborate with the team to understand business requirements and translate them into technical specifications. - Design, develop, and test applications using Microsoft Azure Databricks. - Ensure the applications meet quality standards and performance expectations. - Troubleshoot and debug applications to identify and resolve issues. - Provide technical guidance and support to junior developers. - Stay updated with the latest industry trends and technologies related to application development. Professional & Technical Skills: - Must-Have Skills: Proficiency in Microsoft Azure Databricks. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. 
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks. - This position is based at our Hyderabad office. - Any BTech degree is required.
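The data munging skills listed above (cleaning, transformation, normalization) can be illustrated with a short, self-contained sketch. The record shape is hypothetical, and min-max scaling is just one common choice of normalization:

```python
def min_max_normalize(values):
    """Scale numeric values to the [0, 1] range (a common normalization step)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # avoid division by zero on constant columns
    return [(v - lo) / (hi - lo) for v in values]

# Cleaning: drop records with missing amounts; transformation: coerce strings to floats.
raw = [{"amount": "10"}, {"amount": None}, {"amount": "30"}]
cleaned = [float(r["amount"]) for r in raw if r["amount"] is not None]
scaled = min_max_normalize(cleaned)
```

The same three steps (filter, coerce, scale) carry over directly to DataFrame-based tooling such as Databricks.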
Posted 1 week ago
3.0 years
6 - 10 Lacs
Gurgaon
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Analyze business requirements & functional specifications Be able to determine the impact of changes in current functionality of the system Interaction with diverse business partners and technical workgroups Be flexible to collaborate with onshore business during US business hours Be flexible to support project releases during US business hours Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Undergraduate degree or equivalent experience 3+ years of working experience in Python, PySpark, Scala 3+ years of experience working on MS SQL Server and NoSQL DBs like Cassandra, etc. 
Hands-on working experience in Azure Databricks Solid healthcare domain knowledge Exposure to DevOps methodology and creating CI/CD deployment pipelines Exposure to Agile methodology, specifically using tools like Rally Ability to understand the existing application codebase, perform impact analysis, and update the code when required based on the business logic or for optimization Proven excellent analytical and communication skills (both verbal and written) Preferred Qualification: Experience in streaming applications (Kafka, Spark Streaming, etc.) At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. #Gen #NJP
Posted 1 week ago
0 years
4 - 6 Lacs
Gurgaon
On-site
The team works on data collected from different business domains, such as Rewards, Health, Assessments, HR, McLagan, and Radford, and continues to explore avenues for the creation of new analytical products that can become future revenue streams. The team uses advanced applications along with statistical/data-science know-how to do hypothesis testing, prescriptive analysis, and predictive analytics on key business/client questions using tools like Python, SQL, and Databricks. Insights are reported using visualization tools like Tableau, Power BI, PowerPoint, or VBA-powered Excel dashboards. The team also uses its skills to support consulting studies/products serviced under the HCS People and Performance Analytics banner. Work Description/Role Summary: A colleague on the Rewards Innovation Team may get unique opportunities to work for Aon clients and worldwide leaders, and contribute towards Aon strategy. At the same time, utilizing one's own skill set, one can touch multiple lives across the globe, bring impactful solutions in multiple domains, and transform business. It is a unique opportunity to be part of a small team and gain visibility amongst Aon leadership, both onshore and offshore. The opportunity will also open doors to work on advanced analytics (such as NLP/text analytics, machine learning, etc.). Role overview: A colleague must be adept at acquiring new skills within a short time period based on business needs. One must have an analytical mindset to perform data analysis and data visualization on large data sets. Problem-solving and coding-based automation skills will help a colleague explore new opportunities and bring efficient, innovative solutions. The ability to present to senior leaders and create client-ready reports is needed. A colleague should have the knowledge to create SQL queries and perform other database operations (such as creating views and procedures). 
Skills & Knowledge Requirements: Basic knowledge of VBA, Tableau/Power BI, or Python is a must. Intermediate knowledge of Statistics, SQL, and MS applications is required. Knowledge of any programming language is preferred. Good communication skills, logical reasoning, analytical abilities, and data interpretation. Mandatory Skills: Needs to be comfortable working in an unstructured environment, since this role requires working on exploratory projects that do not have a clear-cut mode of operation or defined start and end objectives. Needs a never-give-up attitude, since the research will require experimenting with multiple failed approaches until the right, successful approach is discovered. High data acumen with a good eye for detail and a keen sense for seeking out patterns. Logical reasoning ability is required, with a drive for problem solving. The ability to understand complex requirements and relay complex thoughts across teams is a must. 2563945
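The SQL skills called out above (queries, views, and other database operations) can be sketched with Python's built-in sqlite3 module. The table, view, and column names here are hypothetical, chosen only to echo the Rewards domain:

```python
import sqlite3

# In-memory database for illustration; no external setup required.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rewards (employee TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO rewards VALUES (?, ?)",
    [("A", 100.0), ("B", 250.0), ("A", 50.0)],
)

# A view aggregating rewards per employee, the kind of database object the role mentions.
conn.execute("""
    CREATE VIEW rewards_by_employee AS
    SELECT employee, SUM(amount) AS total
    FROM rewards
    GROUP BY employee
""")
totals = dict(conn.execute("SELECT employee, total FROM rewards_by_employee"))
```

Stored procedures are not supported in SQLite, so on the team's actual databases those would be written in the server's own dialect.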
Posted 1 week ago
2.0 years
8 - 10 Lacs
Gurgaon
On-site
About this role: Join our dynamic and forward-thinking team within Gartner's Global Strategy and Operations (GSO) division, where innovation blends with impactful results. Our Service Analytics & Productivity team is at the forefront of developing cutting-edge automated data solutions for Gartner’s Global Services & Delivery team. We leverage data to uncover transformative insights and strategies, enhancing productivity and boosting client retention. This is your chance to be part of a high-impact analytics team, dedicated to driving automation, solving complex problems, and managing key stakeholder relationships, ultimately delivering significant and measurable business outcomes. What you’ll do: Business Intelligence Development: Design, develop, and maintain robust business intelligence solutions using Power BI. Ensure these solutions are scalable and stable to support the evolving needs of the business. Problem-Solving: Independently tackle complex data challenges to create innovative solutions that push the boundaries of conventional thinking. Stakeholder Relationships: Build and nurture strong relationships with stakeholders by understanding and rationalizing automation requirements. Define success clearly and deliver high-quality automated solutions that meet stakeholder expectations. Communication: Effectively communicate project status and challenges to leaders and stakeholders, simplifying complex technical concepts for easy understanding. Data Quality: Uphold the highest standards of data quality and protection. Ensure adherence to control structures that guarantee data accuracy and quality across all data channels, fostering trust and reliability in our data-driven decisions. Ethical Standards & Teamwork: Maintain the highest ethical standards while fostering a culture of teamwork and collaboration, contributing to a positive and productive work environment. 
What you’ll need: Educational Background: Possess 2+ years of professional experience with a degree in Engineering, Math, Statistics, or related fields. Your academic foundation will be complemented by a passion for data analytics and innovation. SQL Proficiency: Demonstrate proficient SQL skills for data extraction and manipulation, enabling the creation of innovative data solutions. Data Visualization: Experience with data visualization techniques and tools for impactful storytelling through dashboards, with a primary focus on Power BI. Python: Preferred experience in Python and essential libraries such as NumPy and Pandas, with a track record of creating efficient ETL processes, preferably in Databricks. Problem-Solving Skills: Possess a knack for creative problem-solving, with sharp qualitative and quantitative abilities and a keen eye for detail and accuracy. Communication Skills: Exhibit strong written and verbal communication skills, with the ability to convey technical concepts to a non-technical audience effectively. What you’ll get: In addition to an outstanding work environment with rapid advancement potential, Gartner associates enjoy exceptional compensation and benefits, including: Competitive base salary Flexible work environment A great work culture #L1-AV2 Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. 
What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. 
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com. Job Requisition ID: 100643 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
Posted 1 week ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Title: Business Analytics Associate Advisor About Evernorth Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Position Summary The job profile for this position is Business Analytics Associate Advisor, which is a Band 3 Contributor Career Track role. The Cigna Enterprise Operations Analytics organization offers solutions that provide actionable insights to internal and external business partners and customers that help improve customer experience, reduce cost, measure and forecast business performance, and improve processes and procedures. The Business Analytics Associate Advisor will be responsible for creating detailed business analyses, outlining problems, opportunities, and solutions for their respective client group. The candidate should be proficient at assembling data to tell a story and presenting the findings in a creative and insightful way to top leadership/management. This candidate will have the ability to provide thought leadership and technical expertise across multiple disciplines. Job Description & Responsibilities Reporting on productivity and project progress and ensuring that they are compliant with quality standards. Using Databricks and SQL for reporting and analytics, writing queries to answer questions and performing ETL tasks to create datasets. Maintaining both internal and external channels of communication. Utilizing Python libraries (scikit-learn, pandas, numpy) to conduct statistical analyses. Gathering details regarding the business of the operations area using a variety of methods (interviews, shadowing, surveys, reading reports, etc.). Applying statistical techniques such as k-means clustering, OLS and multiple linear regressions, and logistic regressions. 
Working with stakeholders to scope and plan projects and analysis topics. Providing findings and data-driven recommendations to leadership. Writing tests and logging for data pipelines and automation. Experience Required 8+ years of relevant analytics experience Experience Desired Experience as a Business Analytics Associate Advisor is a plus. Expertise in health insurance contact center operations. Experience with Business Intelligence software (Tableau, Power BI, Looker, etc.). Additional Skills, Education and Training Required: Excellent verbal, written, and interpersonal communication skills are a must. Problem-solving, consulting, teamwork, leadership, and creativity skills are a must. An analytical mind with an outstanding ability to collect and analyze data. Action Oriented; Business Insight; Instills Trust; Manages Complexity; Nimble Learning; Persuades; Plans and Aligns. Join us in driving growth and improving lives.
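The OLS regressions mentioned in the responsibilities above reduce, in the single-predictor case, to a closed-form fit. A stdlib-only sketch on toy data (the numbers are invented, not from the posting):

```python
def ols_fit(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x), in un-normalized (sum) form.
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Perfectly linear toy data generated from y = 2x + 1.
slope, intercept = ols_fit([1, 2, 3, 4], [3, 5, 7, 9])
```

In practice the role would reach for scikit-learn's LinearRegression or statsmodels, which generalize this to multiple predictors and add diagnostics.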
Posted 1 week ago
0 years
4 - 8 Lacs
Calcutta
On-site
Ready to build the future with AI? At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Assistant Manager, Senior Data Engineer – Agentic AI! In this role, you'll be part of Genpact's transformation under GenpactNext, as we lead the shift to Agentic AI Solutions—domain-specific, autonomous systems that redefine how we deliver value to clients. You'll help drive the adoption of innovations like the Genpact AP Suite in finance and accounting, with more Agentic AI products set to expand across service lines. 
Responsibilities Responsible for development based on client prioritization and approval • Support client technical architects in conducting PoCs, including cost evaluations, of various AWS services and other tools • Develop data ingestion frameworks using the AWS tech stack, such as AWS Glue, Athena, Databricks, Python, PySpark, and SQL • Design and develop workflow orchestration using Airflow and Step Functions • Manage releases • Oversee testing • Ensure data pipelines meet intraday and daily SLAs, as per documented SLA definitions • Ensure data quality by building data quality frameworks • Oversee incident management & request fulfillment • Ensure SLAs are met, as per documented SLA definitions • Manage escalations and approvals Qualifications we seek in you! Minimum qualifications • Bachelor’s degree in business information systems (IS), computer science, or a related field, or equivalent related IT experience • AWS certified Cloud Data Engineer • Databricks certified Data Engineer • Practical experience in Python, PySpark, and SQL Preferred Qualifications/Skills • Good communication skills • Able to interact with clients independently Why join Genpact? 
Lead AI-first transformation – Build and scale AI solutions that redefine industries Make an impact – Drive change for global enterprises and solve business challenges that matter Accelerate your career —Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills Grow with the best – Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace Committed to ethical AI – Work in an environment where governance, transparency, and security are at the core of everything we build Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up . Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color , religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Assistant Manager Primary Location India-Kolkata Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jul 17, 2025, 11:59:36 PM Unposting Date Ongoing Master Skills List Digital Job Category Full Time
Posted 1 week ago
8.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Position Overview This role is responsible for defining and delivering ZURU’s next-generation data architecture—built for global scalability, real-time analytics, and AI enablement. You will lead the unification of fragmented data systems into a cohesive, cloud-native platform that supports advanced business intelligence and decision-making. Sitting at the intersection of data strategy, engineering, and commercial enablement, this role demands both deep technical acumen and strong cross-functional influence. You will drive the vision and implementation of robust data infrastructure, champion governance standards, and embed a culture of data excellence across the organisation. Position Impact In the first six months, the Head of Data Architecture will gain deep understanding of ZURU’s operating model, technology stack, and data fragmentation challenges. You’ll conduct a comprehensive review of current architecture, identifying performance gaps, security concerns, and integration challenges across systems like SAP, Odoo, POS, and marketing platforms. By month twelve, you’ll have delivered a fully aligned architecture roadmap—implementing cloud-native infrastructure, data governance standards, and scalable models and pipelines to support AI and analytics. You will have stood up a Centre of Excellence for Data, formalised global data team structures, and established yourself as a trusted partner to senior leadership. What are you Going to do? Lead Global Data Architecture: Own the design, evolution, and delivery of ZURU’s enterprise data architecture across cloud and hybrid environments. Consolidate Core Systems: Unify data sources across SAP, Odoo, POS, IoT, and media into a single analytical platform optimised for business value. Build Scalable Infrastructure: Architect cloud-native solutions that support both batch and streaming data workflows using tools like Databricks, Kafka, and Snowflake. 
Implement Governance Frameworks: Define and enforce enterprise-wide data standards for access control, privacy, quality, security, and lineage. Enable Metadata & Cataloguing: Deploy metadata management and cataloguing tools to enhance data discoverability and self-service analytics. Operationalise AI/ML Pipelines: Lead data architecture that supports AI/ML initiatives, including demand forecasting, pricing models, and personalisation. Partner Across Functions: Translate business needs into data architecture solutions by collaborating with leaders in Marketing, Finance, Supply Chain, R&D, and Technology. Optimise Cloud Cost & Performance: Roll out compute and storage systems that balance cost efficiency, performance, and observability across platforms. Establish Data Leadership: Build and mentor a high-performing data team across India and NZ, and drive alignment across engineering, analytics, and governance. Vendor and Tool Strategy: Evaluate external tools and partners to ensure the data ecosystem is future-ready, scalable, and cost-effective. What are we Looking for? 8+ years of experience in data architecture, with 3+ years in a senior or leadership role across cloud or hybrid environments. Proven ability to design and scale large data platforms supporting analytics, real-time reporting, and AI/ML use cases. Hands-on expertise with ingestion, transformation, and orchestration pipelines (e.g. Kafka, Airflow, DBT, Fivetran). Strong knowledge of ERP data models, especially SAP and Odoo. Experience with data governance, compliance (GDPR/CCPA), metadata cataloguing, and security practices. Familiarity with distributed systems and streaming frameworks like Spark or Flink. Strong stakeholder management and communication skills, with the ability to influence both technical and business teams. Experience building and leading cross-regional data teams. Tools & Technologies Cloud Platforms: AWS (S3, EMR, Kinesis, Glue), Azure (Synapse, ADLS), GCP Big Data: Hadoop, Apache Spark, Apache Flink Streaming: Kafka, Kinesis, Pub/Sub Orchestration: Airflow, Prefect, Dagster, DBT Warehousing: Snowflake, Redshift, BigQuery, Databricks Delta NoSQL: Cassandra, DynamoDB, HBase, Redis Query Engines: Presto/Trino, Athena IaC & CI/CD: Terraform, GitLab Actions Monitoring: Prometheus, Grafana, ELK, OpenTelemetry Security/Governance: IAM, TLS, KMS, Amundsen, DataHub, Collibra, DBT for lineage What do we Offer? 💰 Competitive compensation 💰 Annual Performance Bonus ⌛️ 5 Working Days with Flexible Working Hours 🚑 Medical Insurance for self & family 🚩 Training & skill development programs 🤘🏼 Work with the Global team, Make the most of the diverse knowledge 🍕 Several discussions over Multiple Pizza Parties A lot more! Come and discover us!
Posted 1 week ago