0 years
0 Lacs
Bengaluru
On-site
Work Schedule Standard (Mon-Fri) Environmental Conditions Office Job Description About the Company Thermo Fisher Scientific Inc. is the world leader in serving science, with revenues of more than $40 billion and approximately 120,000 employees globally. Our Mission is to enable our customers to make the world healthier, cleaner, and safer. We help our customers accelerate life sciences research, solve sophisticated analytical challenges, improve patient diagnostics, deliver medicines to market, and increase laboratory efficiency. About the Team The AAD team supports businesses across Thermo Fisher Scientific with AI, Automation, and Data solutions that drive efficiency and insight. What You Will Do: Develop and implement secure, scalable data solutions using AWS and Databricks, ensuring seamless integration across enterprise data layers. Establish data governance and security best practices, lead adoption of cloud-based architectures, and collaborate with teams to drive innovation in data integration and self-service analytics. How You Will Get Here: BS/MS in Computer Science, Information Systems, or equivalent experience. 9 or more years of industry experience in data architecture, data engineering, or cloud-based data solutions. 4 or more years of hands-on experience building data solutions with AWS S3, Redshift, Glue, Lambda, EMR, Kinesis, Athena, PySpark, Python, and SQL for data processing and ETL/ELT transformation. Good experience in Databricks, building scalable data pipelines, optimizing Delta Lake storage, and implementing security controls. Expertise in AWS services, cloud security best practices, IAM roles and policies, encryption strategies, and compliance frameworks (e.g., GDPR, HIPAA). Experience working on AI and data initiatives. Good knowledge of data governance frameworks, including AWS Lake Formation, Unity Catalog, and data lineage tracking. Proficiency in CI/CD practices for data solutions using GitHub, GitHub Actions (GHA), and Terraform. Experience implementing real-time and batch data processing solutions using Kafka, Kinesis, or Spark Streaming. Strong problem-solving skills with experience in debugging, performance tuning, and cost optimization in cloud environments. Non-Technical Qualifications: Strong leadership skills with the ability to influence technical decisions and guide teams toward scalable data solutions. Outstanding interpersonal skills to translate sophisticated technical concepts into business-aligned solutions. Proven ability to work cross-functionally with collaborators across IT, business, and security teams. Experience in writing technical documentation, architecture diagrams, and best-practice guides. Passion for continuous learning, innovation, and staying ahead of evolving cloud and data technologies. Our Mission is to enable our customers to make the world healthier, cleaner, and safer. Thermo Fisher Scientific is an EEO/Affirmative Action Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability, or any other legally protected status.
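To make the AWS-plus-Databricks stack above concrete, here is a minimal PySpark sketch of the kind of S3-to-Delta ETL the role describes. It is illustrative only: the bucket names, columns, and paths are hypothetical, and it assumes a Delta-enabled Spark session (e.g. on Databricks) with S3 access already configured.

```python
# Hypothetical PySpark ETL: read raw order CSVs from S3, clean and type them,
# then write a partitioned Delta table. Buckets, columns, and paths are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

orders = (
    spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")
    .withColumn("order_ts", F.to_timestamp("order_ts"))      # string -> timestamp
    .withColumn("order_date", F.to_date("order_ts"))          # partition column
    .filter(F.col("order_id").isNotNull())                    # drop malformed rows
    .dropDuplicates(["order_id"])
)

(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("s3://example-curated-bucket/delta/orders")
)
```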
Posted 4 days ago
5.0 years
3 - 6 Lacs
Bengaluru
On-site
Location Bangalore, Karnataka, 560048 Category Engineering Job Type Full time Job Id 1189604 No Marketing Data Foundation - Engineer This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office. Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE. Job Description: Our HPE marketing teams are focused on making our brand easy to understand and easy to buy. We’re devising data-driven brand strategy, from architecture and naming to why we exist, what we do, who we are, and how we look, feel and sound. As a team, we are retaining and attracting new customers by accelerating our business strategy and making our brand stronger, more relevant, differentiated, and authentic. Responsibilities: Combine technical depth in big data and cloud technologies with business acumen and solution architect skills to design, implement, and operationalize data solutions that deliver business value. Act as a trusted advisor, bridging the gap between stakeholders and technical teams while fostering community engagement and continuous innovation. Education and Experience Required: Master's degree in Statistics, Operations Research, Computer Science, or equivalent preferred, or a Bachelor's degree in these areas. At least 5-8 years of relevant experience. Knowledge and Skills: Architect big data solutions that span data engineering, data science, machine learning, and SQL analytics workflows. Deep expertise in areas such as streaming, performance tuning, data lake technologies, or industry-specific data solutions. Designing scalable, secure, and optimized data solutions using the Databricks Lakehouse Platform. Deep understanding of the Databricks Lakehouse architecture, including Delta Lake, MLflow, and Databricks SQL for data management, machine learning, and analytics. Proficiency in deploying and managing Databricks solutions on Azure. Strong skills in building and optimizing data pipelines using ETL/ELT processes, data modeling, schema design, and handling large-scale data processing with Apache Spark.
Ability to design scalable and secure Databricks solutions tailored to business needs, including multi-hop data pipelines (Bronze, Silver, Gold layers). Implementing data governance using Unity Catalog, role-based access control (RBAC), and entity permissions, and ensuring compliance and data security. Additional Skills: Accountability, Action Planning, Active Learning, Active Listening, Agile Methodology, Agile Scrum Development, Analytical Thinking, Bias, Coaching, Creativity, Critical Thinking, Cross-Functional Teamwork, Data Analysis Management, Data Collection Management, Data Controls, Design, Design Thinking, Empathy, Follow-Through, Group Problem Solving, Growth Mindset, Intellectual Curiosity, Long Term Planning, Managing Ambiguity {+ 5 more} What We Can Offer You: Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing. Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #marketing Job: Engineering Job Level: TCP_05 HPE is an Equal Employment Opportunity/Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
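The multi-hop (Bronze, Silver, Gold) pattern mentioned above is easiest to see in code. The sketch below is illustrative only; the table names, storage paths, and columns are hypothetical and not drawn from the posting.

```python
# Minimal multi-hop (Bronze -> Silver -> Gold) sketch on a Databricks-style Lakehouse.
# Schemas, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw feed as-is so it stays replayable
raw = spark.read.format("json").load("/mnt/raw/marketing_events/")
raw.write.format("delta").mode("append").saveAsTable("bronze.marketing_events")

# Silver: deduplicated, typed, validated records
silver = (
    spark.table("bronze.marketing_events")
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.marketing_events")

# Gold: business-level aggregate ready for reporting
gold = (
    silver.groupBy("campaign_id", F.to_date("event_ts").alias("event_date"))
    .agg(F.countDistinct("user_id").alias("unique_users"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.campaign_daily_engagement")
```

Bronze keeps the raw feed intact, Silver applies cleaning and typing, and Gold serves the aggregated, report-ready view that BI tools query.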
Posted 4 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Analytical Wizards is part of the Definitive Healthcare family. We balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their challenges, achieving outcomes that benefit both business and society. Our people are our biggest asset; they drive our innovation advantage, and we strive to offer a flexible and collaborative workplace where they can thrive. We offer industry-leading benefits packages to promote a creative and inclusive culture. If driving real change gives you a sense of pride and you are passionate about powering social good, we’d love to hear from you. Role: Senior Data Analyst – Professional Services. Office Location: Bangalore. Job Description We are seeking a Senior Data Analyst to join our Professional Services team in Bangalore. This role involves developing custom reports, performing complex data analysis, and supporting client-specific data needs. The ideal candidate will have expertise in SQL, data manipulation, and healthcare analytics with strong problem-solving skills and the ability to work cross-functionally with internal teams and clients. Key Responsibilities Data Analysis & Reporting Develop and deliver custom data extracts and reports using SQL, Excel, and Python. Analyze large-scale healthcare datasets to provide actionable insights. Ensure data integrity, accuracy, and quality assurance in all deliverables. Client & Cross-functional Collaboration Work closely with Product, Customer Success, and Engineering teams to deliver client-specific solutions. Engage directly with clients via web conferences to discuss data requirements, methodologies, and insights. Support data integration projects, ensuring smooth implementation and validation. Technical Expertise & Innovation Optimize SQL queries and stored procedures for efficiency and scalability. Serve as a technical point of contact for client data-related questions and integrations. Train internal team members on SQL best practices and healthcare analytics methodologies. Thought Leadership & Training Present market trends and analytical use cases to internal teams and clients. Conduct knowledge-sharing sessions to enhance the team's expertise in data analysis and reporting. Contribute to the development of standardized reporting templates and methodologies. Required Qualifications & Experience Education: Bachelor’s degree in a quantitative or healthcare-related field (e.g., Computer Science, Healthcare Informatics, Data Analytics). Experience: 3+ years in data analysis, report building, or research in a professional setting. 3+ years of hands-on SQL experience, including developing queries, views, and stored procedures. Strong understanding of relational database principles and healthcare data structures. Experience working with Real World Evidence (RWE), medical claims, and EHR data. Skills: Strong analytical and problem-solving abilities. High attention to detail and a quality assurance mindset. Ability to communicate complex data findings to both technical and non-technical audiences. Self-starter with the ability to manage multiple priorities effectively. Preferred Skills (Good To Have) Experience with Databricks, Snowflake, or other cloud-based analytics platforms. Knowledge of Python or R for data manipulation and automation. Exposure to data visualization tools like Tableau or Power BI.
Prior experience in professional services, client-facing analytics, or data consulting roles. This is a hybrid role, requiring at least three days in the office per week to collaborate effectively with teams and clients.
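Purely as an illustration of the custom SQL-plus-Python extract work this role describes, here is a small sketch. The connection string, tables, columns, and client ID are hypothetical, and the Excel export assumes openpyxl is installed alongside pandas and SQLAlchemy.

```python
# Illustrative client extract: parameterized SQL pull, a basic quality check,
# and an Excel deliverable. All names and values are hypothetical.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://analyst:password@warehouse:5432/claims")

query = text("""
    SELECT patient_id, diagnosis_code, service_date, paid_amount
    FROM medical_claims
    WHERE client_id = :client_id
      AND service_date >= :start_date
""")

extract = pd.read_sql(query, engine, params={"client_id": 1042, "start_date": "2024-01-01"})

# Basic quality-assurance check before delivery
assert extract["patient_id"].notna().all(), "Null patient_id found"

summary = extract.groupby("diagnosis_code")["paid_amount"].agg(["count", "sum"])

with pd.ExcelWriter("client_1042_claims_report.xlsx") as writer:
    extract.to_excel(writer, sheet_name="detail", index=False)
    summary.to_excel(writer, sheet_name="summary")
```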
Posted 4 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you. 
What you will do: Build and manage high-performance data pipelines for batch and near real-time use cases. Write optimized, complex SQL queries and stored procedures for analytics and reporting. Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy. Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency. Build versioned, testable data models using DBT. Orchestrate multi-step workflows with Apache Airflow. Collaborate across teams to convert data needs into robust technical solutions. Mandatory Skill Sets (‘must have’ knowledge, skills and experiences): 5+ years of hands-on experience in Data Engineering. Strong command of SQL and Python, especially for transformation and automation. Deep experience with DBT and Airflow in production environments. Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning. Strong analytical thinking and debugging skills. Preferred Skill Sets (‘good to have’ knowledge, skills and experiences): Experience with Teradata and Starburst (Presto/Trino). Familiarity with cloud platforms (Azure/GCP/Snowflake). Exposure to on-prem to cloud data migrations. Knowledge of Git-based workflows and CI/CD pipelines. Years of Experience Required: 5-8 years. Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above). Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering. Required Skills: Structured Query Language (SQL). Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
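Since the role centres on DBT models orchestrated by Airflow, here is a minimal, hedged sketch of that pattern. It assumes a recent Airflow 2.x deployment; the DAG id, file paths, and dbt project location are hypothetical.

```python
# Sketch of an Airflow DAG that stages a file with pandas and then runs a dbt build.
# Assumes Airflow 2.4+; all ids and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_files():
    """Placeholder Pandas step: clean an incoming file and stage it as Parquet."""
    import pandas as pd

    df = pd.read_csv("/data/incoming/events.csv")           # hypothetical path
    df.dropna(subset=["event_id"]).to_parquet("/data/staged/events.parquet")


with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_files", python_callable=extract_files)
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/analytics && dbt build --target prod",
    )
    extract >> dbt_build   # run the dbt models only after staging succeeds
```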
Posted 4 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview We are seeking a skilled Associate Manager - AIOps & MLOps Operations to support and enhance the automation, scalability, and reliability of AI/ML operations across the enterprise. This role requires a solid understanding of AI-driven observability, machine learning pipeline automation, cloud-based AI/ML platforms, and operational excellence. The ideal candidate will assist in deploying AI/ML models, ensuring continuous monitoring, and implementing self-healing automation to improve system performance, minimize downtime, and enhance decision-making with real-time AI-driven insights. Support and maintain AIOps and MLOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in implementing real-time data observability, monitoring, and automation frameworks to enhance data reliability, quality, and operational efficiency. Contribute to developing governance models and execution roadmaps to drive efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments. Ensure seamless integration of CI/CD pipelines, data pipeline automation, and self-healing capabilities across the enterprise. Collaborate with cross-functional teams to support the development and enhancement of next-generation Data & Analytics (D&A) platforms. Assist in managing the people, processes, and technology involved in sustaining Data & Analytics platforms, driving operational excellence and continuous improvement. Support Data & Analytics Technology Transformations by ensuring proactive issue identification and the automation of self-healing capabilities across the PepsiCo Data Estate. Responsibilities Support the implementation of AIOps strategies for automating IT operations using Azure Monitor, Azure Log Analytics, and AI-driven alerting. Assist in deploying Azure-based observability solutions (Azure Monitor, Application Insights, Azure Synapse for log analytics, and Azure Data Explorer) to enhance real-time system performance monitoring. Enable AI-driven anomaly detection and root cause analysis (RCA) by collaborating with data science teams using Azure Machine Learning (Azure ML) and AI-powered log analytics. Contribute to developing self-healing and auto-remediation mechanisms using Azure Logic Apps, Azure Functions, and Power Automate to proactively resolve system issues. Support ML lifecycle automation using Azure ML, Azure DevOps, and Azure Pipelines for CI/CD of ML models. Assist in deploying scalable ML models with Azure Kubernetes Service (AKS), Azure Machine Learning Compute, and Azure Container Instances. Automate feature engineering, model versioning, and drift detection using Azure ML Pipelines and MLflow. Optimize ML workflows with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for data preparation and ETL/ELT automation. Implement basic monitoring and explainability for ML models using Azure Responsible AI Dashboard and InterpretML. Collaborate with Data Science, DevOps, CloudOps, and SRE teams to align AIOps/MLOps strategies with enterprise IT goals. Work closely with business stakeholders and IT leadership to implement AI-driven insights and automation to enhance operational decision-making. Track and report AI/ML operational KPIs, such as model accuracy, latency, and infrastructure efficiency. Assist in coordinating with cross-functional teams to maintain system performance and ensure operational resilience. 
Support the implementation of AI ethics, bias mitigation, and responsible AI practices using Azure Responsible AI Toolkits. Ensure adherence to Azure Information Protection (AIP), Role-Based Access Control (RBAC), and data security policies. Assist in developing risk management strategies for AI-driven operational automation in Azure environments. Prepare and present program updates, risk assessments, and AIOps/MLOps maturity progress to stakeholders as needed. Support efforts to attract and build a diverse, high-performing team to meet current and future business objectives. Help remove barriers to agility and enable the team to adapt quickly to shifting priorities without losing productivity. Contribute to developing the appropriate organizational structure, resource plans, and culture to support business goals. Leverage technical and operational expertise in cloud and high-performance computing to understand business requirements and earn trust with stakeholders. Qualifications 5+ years of technology work experience in a global organization, preferably in CPG or a similar industry. 5+ years of experience in the Data & Analytics field, with exposure to AI/ML operations and cloud-based platforms. 5+ years of experience working within cross-functional IT or data operations teams. 2+ years of experience in a leadership or team coordination role within an operational or support environment. Experience in AI/ML pipeline operations, observability, and automation across platforms such as Azure, AWS, and GCP. Excellent Communication: Ability to convey technical concepts to diverse audiences and empathize with stakeholders while maintaining confidence. Customer-Centric Approach: Strong focus on delivering the right customer experience by advocating for customer needs and ensuring issue resolution. Problem Ownership & Accountability: Proactive mindset to take ownership, drive outcomes, and ensure customer satisfaction. Growth Mindset: Willingness and ability to adapt and learn new technologies and methodologies in a fast-paced, evolving environment. Operational Excellence: Experience in managing and improving large-scale operational services with a focus on scalability and reliability. Site Reliability & Automation: Understanding of SRE principles, automated remediation, and operational efficiencies. Cross-Functional Collaboration: Ability to build strong relationships with internal and external stakeholders through trust and collaboration. Familiarity with CI/CD processes, data pipeline management, and self-healing automation frameworks. Strong understanding of data acquisition, data catalogs, data standards, and data management tools. Knowledge of master data management concepts, data governance, and analytics.
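The MLflow-based model versioning this posting refers to commonly looks like the sketch below. It is illustrative only, trained on synthetic data, and the experiment name and registered model name are hypothetical rather than anything enterprise-specific.

```python
# Hedged sketch: log a training run and register a model version with MLflow.
# Synthetic data; experiment and model names are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-ops")   # hypothetical experiment name

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("accuracy", acc)
    # Registering creates a new model version that downstream CI/CD can promote.
    mlflow.sklearn.log_model(model, "model", registered_model_name="demand_forecast_rf")
```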
Posted 4 days ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
P-1346 At Databricks, we are passionate about enabling data teams to solve the world's toughest problems — from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI infrastructure platform so our customers can use deep data insights to improve their business. Founded by engineers — and customer obsessed — we leap at every opportunity to solve technical challenges, from designing next-gen UI/UX for interfacing with data to scaling our services and infrastructure across millions of virtual machines. Databricks Mosaic AI offers a unique data-centric approach to building enterprise-quality Machine Learning and Generative AI solutions, enabling organizations to securely and cost-effectively own and host ML and Generative AI models, augmented or trained with their enterprise data. And we're only getting started in Bengaluru, India - we are currently in the process of setting up 10 new teams from scratch! The Money team's mission at Databricks is to maximize the value that our customers derive from their investments in data projects. We accomplish this through innovative commercialization strategies, timely & accurate billing, cost optimization tools, intelligent resource usage controls, and cutting-edge engineering. We provide a seamless and consistent experience and set of platforms for all Databricks products to reach customers. As one of the first engineers for Money at Databricks India, you will be key to building a base for one of Databricks’ most central engineering teams. You will own critical components that form the backbone of our products, starting with Databricks’ resource admission control and usage governance infrastructure. Your role is crucial in helping bring diverse business needs together, including abuse prevention, product commercialization motions, and reliable product availability at scale. You will work closely with infrastructure as well as product teams in bringing critical governance functionality to Databricks customers. The Impact You Will Have Own Money systems and services that govern usage of all Databricks products and offerings. Enhance engineering and infrastructure efficiency, reliability, accuracy, and response times, including CI/CD processes, test frameworks, data quality assurance, end-to-end reconciliation, and anomaly detection. Collaborate with platform and product teams to develop and implement innovative infrastructure that scales to meet evolving needs. Contribute to long-term vision and requirements development for Databricks products, in partnership with our engineering teams. What We Look For BS (or higher) in Computer Science, or a related field. 7+ years of production-level experience in one of: Java, Scala, C++, or a similar language. Comfortable working towards a multi-quarter vision with incremental deliverables. Proven track record in architecting, developing, deploying, and operating components of large-scale distributed systems. Experience with software security and systems that handle sensitive data. Demonstrated ability to lead engineering projects across functional and organizational boundaries. A proactive approach and a passion for delivering high-quality solutions. Experience with cloud technologies, e.g. AWS, Azure, GCP, Docker, Kubernetes. About Databricks Databricks is the data and AI company.
More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks. Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 4 days ago
0.0 - 2.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Data Engineer (SQL & Python Expertise Required) 📅 Experience: 0-2 Years 🏢 Company: Cephei Infotech 💼 Job Type: Full-Time About Us Cephei Infotech is a leading Data & Analytics-focused IT company , delivering scalable data engineering, business intelligence, and AI-driven solutions . We help businesses unlock the power of data through robust pipelines, real-time analytics, and AI/ML innovations. Role Overview We are looking for a Data Engineer with 0-2 years of experience , proficient in SQL and Python , to help build and optimize data pipelines, ETL workflows, and analytical solutions . Exposure to GenAI, OpenCV, or Machine Learning is a great plus ! Key Responsibilities ✅ Design, develop, and optimize ETL pipelines for structured and unstructured data ✅ Write efficient SQL queries for data extraction, transformation, and reporting ✅ Work with Python to automate data processing and build scalable solutions ✅ Collaborate with Data Scientists & Analysts to support AI/ML model deployment ✅ Integrate data from multiple sources (databases, APIs, cloud storage) ✅ Ensure data quality, integrity, and security across all pipelines ✅ Explore and experiment with GenAI, OpenCV, or ML frameworks (if applicable) Required Skills & Qualifications 🔹 0-2 years of experience in Data Engineering 🔹 Strong hands-on experience with SQL (MySQL, PostgreSQL, or similar) 🔹 Proficiency in Python for data processing and automation 🔹 Understanding of ETL concepts, data warehousing, and cloud platforms 🔹 Exposure to GenAI, OpenCV, or Machine Learning is a plus 🔹 Knowledge of Big Data tools (Spark, Snowflake, Databricks, etc.) is a bonus 🔹 Strong problem-solving skills and ability to work in a fast-paced environment Why Join Us? 🚀 Work in a fast-growing IT company specializing in Data & AI 📊 Exposure to cutting-edge analytics and AI-driven projects 🌍 Collaborate with global clients & enterprise-scale data systems 📈 Career growth opportunities in Data Engineering, AI, and ML
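Purely to illustrate the SQL-plus-Python pipeline work described above, here is a small, hedged ETL sketch. The CSV path, table name, and Postgres connection string are hypothetical, and loading via to_sql assumes SQLAlchemy and a database driver are installed.

```python
# Illustrative extract-transform-load in Python and SQL; all names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")

# Extract: read the raw file with parsed dates
orders = pd.read_csv("data/raw_orders.csv", parse_dates=["order_date"])

# Transform: basic cleaning plus a derived reporting column
orders = orders.dropna(subset=["order_id", "customer_id"])
orders["order_month"] = orders["order_date"].dt.to_period("M").astype(str)

# Load: replace the staging table with the cleaned data
orders.to_sql("stg_orders", engine, if_exists="replace", index=False)

# Report: a simple aggregation query against the loaded table
monthly = pd.read_sql(
    "SELECT order_month, COUNT(*) AS orders, SUM(amount) AS revenue "
    "FROM stg_orders GROUP BY order_month ORDER BY order_month",
    engine,
)
print(monthly.head())
```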
Posted 4 days ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role As a BI Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights for business users that drive strategic decisions and innovation. Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a BI Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation. Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset—a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made – and your lifecycle management expertise will ensure our data remains fresh and impactful. Key Responsibilities Dashboard and Report Creation: Developing interactive, visually appealing, and insightful dashboards and reports using BI tools like Power BI, Tableau, and Qlik. Self-Service Analytics Enablement: Empowering business users with self-service analytics capabilities by creating curated datasets and intuitive dashboards in BI, often sourcing data from Fabric, Cloudera and Databricks. ETL/ELT Tools: Proficiency with tools that automate data integration, such as Airflow, NiFi, Azure Data Factory. Data Modelling & Warehousing: Designing and implementing datasets, stored procedures, and views in data warehouses within MS Fabric, Synapse, Cloudera, etc., to support efficient and scalable reporting. BI Platform Modernization: Leading the migration of reporting and analytics workloads from legacy systems. Data Quality and Governance: Implementing processes to ensure data is accurate, consistent, and reliable. This includes data validation, cleaning, and documenting data lineage. Performance Optimization: Monitoring and tuning the performance of queries, data pipelines, and dashboards to ensure they run efficiently. User Training & Adoption: Driving the adoption of new BI tools and platforms through user training, documentation, and evangelism within the organization.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. Required Technical and Professional Expertise 4-6 years of experience as a Business Intelligence (BI) Engineer. SQL: Expert-level proficiency in SQL. BI & Visualization Tools: Deep experience with at least one major platform like Microsoft Power BI, Tableau, or Qlik. Data Warehousing: Hands-on experience with modern cloud data warehouses such as Cloudera, MS Fabric, Datawarehouse. ETL/ELT Tools: Proficiency with tools that automate data integration, such as dbt (Data Build Tool), NiFi, Airflow, or Azure Data Factory. Data Modelling: Understanding of data modelling techniques (e.g., Kimball, Inmon) and concepts like dimensional modelling. Programming (Python): Knowledge of Python. Preferred Technical And Professional Experience Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology. Being You Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way. What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed. Get Referred! If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
Posted 4 days ago
7.0 - 12.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Who We Are Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips – the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world – like AI and IoT. If you want to work beyond the cutting-edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future. What We Offer Location: Bangalore, IND At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We’re committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You’ll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible—while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities Supports the design and development of program methods, processes, and systems to consolidate and analyze structured and unstructured, diverse "big data" sources. Interfaces with internal customers for requirements analysis and compiles data for scheduled or special reports and analysis. Supports project teams to develop analytical models, algorithms and automated processes, applying SQL and Python programming skills, to cleanse, integrate and evaluate large datasets. Supports the timely development of products for manufacturing and process information by applying sophisticated data analytics. Able to quickly understand requirements and translate them into executive-level presentation slides. Participates in the design, development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. used to drive key business decisions. Strong business & financial (P&L) acumen; able to understand key themes, financial terms and data points to create appropriate summaries. Works with the business intelligence manager and other staff to assess various reporting needs. Analyzes reporting needs and requirements, assesses current reporting in the context of strategic goals and devises plans for delivering the most appropriate reporting solutions to users. Qualification Bachelor's/Master’s degree with 7-12 years of relevant experience as a data analyst. Required technical skills: SQL, Azure, Python, Databricks; Tableau is good to have. PowerPoint and Excel expertise. Experience in the Supply Chain domain. Functional Knowledge Demonstrates conceptual and practical expertise in own discipline and basic knowledge of related disciplines. Business Expertise Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate them in the market.
Leadership Acts as a resource for colleagues with less experience; may lead small projects with manageable risks and resource requirements. Problem Solving Solves complex problems; takes a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information. Impact Impacts a range of customer, operational, project or service activities within own team and other related teams; works within broad guidelines and policies. Interpersonal Skills Explains difficult or sensitive information; works to build consensus. Additional Information Time Type: Full time Employee Type Assignee / Regular Travel Yes, 20% of the Time Relocation Eligible: Yes Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 4 days ago
0.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Role: Assistant Manager Experience: 5 to 8 years Location: Chennai Job Description: We are looking for a highly skilled and self-driven Senior Consultant – Power BI who can independently manage and deliver end-to-end Power BI solutions. The ideal candidate will be responsible for understanding business requirements, designing scalable data models, building insightful dashboards and reports, and ensuring seamless integration with various data sources. This role demands strong expertise in data visualization, storytelling with data, and the ability to work collaboratively with cross-functional teams while driving business value through actionable insights. Job Responsibilities: Here's a more detailed breakdown of the Power BI Senior Consultant responsibilities: 1. Design and Implementation: Data Modeling: Creating data models, establishing relationships between tables, and ensuring data accuracy and consistency. ETL (Extract, Transform, Load) Processes: Designing and implementing processes for extracting data from various sources, transforming it, and loading it into Power BI. Reporting and Visualization: Developing visual reports, dashboards, and KPI scorecards using Power BI Desktop. Security: Implementing data-level security and object-level security to protect sensitive data. Self-Service BI: Establishing models and configurations to support self-service BI for users to create their own reports and insights. 2. Development and Maintenance: DAX (Data Analysis Expressions): Creating DAX calculations and resolving issues related to loops and hierarchies. Performance Tuning: Optimizing Power BI reports and dashboards for performance and scalability. Troubleshooting: Resolving issues related to data loading, model integrity, and report functionality. Integration: Integrating Power BI reports into other applications using embedded analytics or API automation. Documentation: Documenting data models, ETL processes, and report designs for future reference. 3. Collaboration and Leadership: Stakeholder Management: Collaborating with stakeholders to understand their needs, gather requirements, and align solutions with business objectives. Team Leadership: Leading Power BI projects, managing timelines, and ensuring successful delivery. Knowledge Sharing: Mentoring junior team members and promoting knowledge sharing within the organization. Continuous Improvement: Staying updated on emerging technologies and trends in the data analytics and visualization space. 4. Technical Expertise: Power BI Tools: Proficiency in using Power BI Desktop, Power BI Service, and other related tools. SQL: Strong knowledge of SQL and other data manipulation languages. Skills Required: SQL, Python, Databricks, Power BI, and cloud services. Job Snapshot Updated Date 30-06-2025 Job ID J_3724 Location Chennai, Tamil Nadu, India Experience 7 - 10 Years Employee Type Permanent
Posted 4 days ago
5.0 years
0 Lacs
Greater Kolkata Area
Remote
Requirements 5+ years of experience in DevOps. Proficient with Azure (compute, storage, networking, AKS) and AWS services. Strong hands-on experience with Terraform, Ansible, and Kubernetes. Experience with Argo CD, Helm, Traefik, and Consul. Solid understanding of CI/CD using Azure DevOps, GitHub, or GitLab. Familiarity with Databricks integration and management. Monitoring and observability experience with Dynatrace. Strong scripting skills (e.g., Bash, Python, PowerShell). Roles And Responsibilities CI/CD Pipeline Development: Design, implement, and maintain scalable CI/CD pipelines using GitHub Actions, GitLab CI, or Azure DevOps to automate build, test, and deployment processes. Infrastructure as Code (IaC): Utilize Terraform or Python scripting to automate infrastructure provisioning and management on Azure Cloud. Containerization & Orchestration: Deploy and manage containerized applications using Kubernetes (AKS), ensuring high availability and scalability. GitOps Implementation: Implement and manage GitOps workflows using ArgoCD to automate application deployments and maintain configuration consistency. Monitoring & Alerting: Set up and maintain monitoring and alerting systems using tools like Dynatrace, Prometheus, Grafana, or Azure Monitor to ensure system reliability and performance. Incident Management: Participate in on-call rotations to respond to and resolve production incidents promptly. Collaboration: Work closely with development and operations teams to integrate DevOps best practices and ensure smooth application delivery. About Us We turn customer challenges into growth opportunities. Material is a global strategy partner to the world's most recognizable brands and innovative companies. Our people around the globe thrive by helping organizations design and deliver rewarding customer experiences. We use deep human insights, design innovation and data to create experiences powered by modern technology. Our approaches speed engagement and growth for the companies we work with and transform relationships between businesses and the people they serve. Srijan, a Material company, is a renowned global digital engineering firm with a reputation for solving complex technology problems using their deep technology expertise and leveraging strategic partnerships with top-tier technology partners. Be a part of an Awesome Tribe. Why work for Material? In addition to fulfilling, high-impact work, company culture and benefits are integral to determining if a job is the right fit for you. Here's a bit about who we are and highlights of what we offer. Who We Are & What We Care About Material is a global company and we work with best-of-class brands worldwide. We also create and launch new brands and products, putting innovation and value creation at the center of our practice. Our clients are at the top of their class, across industry sectors from technology to retail, transportation, finance and healthcare. Material employees join a peer group of exceptionally talented colleagues across the company, the country, and even the world. We develop capabilities, craft, and leading-edge market offerings across seven global practices, including strategy and insights, design, data & analytics, technology and tracking. Our engagement management team makes it all hum for clients. We prize inclusion and interconnectedness. We amplify our impact through the people, perspectives, and expertise we engage in our work.
Our commitment to deep human understanding combined with a science & systems approach uniquely equips us to bring a rich frame of reference to our work. A community focused on learning and making an impact: Material is an outcomes-focused company. We create experiences that matter, create new value and make a difference in people's lives. What We Offer Professional Development and Mentorship. Hybrid work mode with a remote-friendly workplace (6 times in a row Great Place To Work Certified). Health and Family Insurance. 40+ leaves per year along with maternity & paternity leaves. Wellness, meditation and counselling sessions. (ref:hirist.tech)
Posted 5 days ago
2.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities Design, develop, and optimize big data pipelines using Azure Databricks, PySpark, and Delta Lake. Collaborate with Azure services like Data Lake, Synapse, SQL Database, and Data Factory to implement robust data solutions. Develop and maintain ETL/ELT workflows for efficient data ingestion, transformation, and processing. Implement data governance, security, and compliance best practices in Azure environments. Optimize Databricks clusters, workflows, and cost efficiency in cloud environments. Work closely with data scientists, analysts, and business stakeholders to ensure high-quality data solutions. Implement CI/CD pipelines for data engineering workflows using Azure DevOps. Ensure data quality, lineage, and observability using tools like Great Expectations, Unity Catalog, and Databricks. Required Qualifications & Skills: Databricks Certified Data Engineer Associate (Preferred), Databricks Certified Data Engineer Professional. Skills: Azure Cloud Services: Azure Databricks, Azure Data Factory, Azure Data Lake, Azure Synapse, Azure Functions. Big Data & ETL: PySpark, SQL, Delta Lake, Kafka (Preferred). Programming: Python, SQL, Scala (Optional). Orchestration & Automation: Airflow, Azure DevOps, GitHub Actions. Data Governance & Security: Unity Catalog, RBAC, PII masking. Performance Optimization: Spark tuning, Databricks cluster configuration. Experience: 2-4 years of experience in data engineering with a focus on Azure and Databricks. Experience working in high-volume, real-time streaming environments. Strong understanding of data modeling, warehousing, and governance best practices (ref:hirist.tech)
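Because the posting highlights real-time streaming into Delta Lake on Azure Databricks, a hedged Structured Streaming sketch follows. It assumes a Databricks-style Spark session with the Kafka connector available; the broker address, topic, schema, and storage paths are hypothetical placeholders.

```python
# Illustrative Structured Streaming ingest: Kafka JSON messages -> Bronze Delta table.
# Broker, topic, schema, and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "telemetry")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")  # enables exactly-once restarts
    .outputMode("append")
    .start("/mnt/delta/bronze/telemetry")
)
```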
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills Advanced working knowledge and experience with relational and non-relational databases. Experience building and optimizing Big Data pipelines, architectures, and datasets. Strong analytic skills related to working with structured and unstructured datasets. Hands-on experience in Azure Databricks utilizing Spark to develop ETL pipelines. Strong proficiency in data analysis, manipulation, and statistical modeling using tools like Spark, Python, Scala, SQL, or similar languages. Strong experience in Azure Data Lake Storage Gen2, Azure Data Factory, Databricks, Event Hub, Azure Synapse. Familiarity with several of the following technologies: Event Hub, Docker, Azure Kubernetes Service, Azure DWH, API Azure, Azure Function, Power BI, Azure Cognitive Services. Azure DevOps experience to deploy the data pipelines through CI/CD. Qualifications And Experience Minimum 5-7 years of practical experience as a Data Engineer. Bachelor’s degree in computer science, software engineering, information technology, or a related field. Azure cloud stack in-production experience.
Posted 5 days ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Location: Gurgaon (Hybrid/On-site) Department: Data Engineering Reports To: Project Manager / Client Stakeholders Type: Full-Time About The Client Client is a leading data and AI/ML solutions provider, partnering with organizations across India and Australia to drive business transformation through data-driven insights. With a decade-long legacy and collaborations with technology leaders like AWS, Snowflake, Google Cloud Platform (GCP), and Databricks, BluePi delivers custom solutions that help enterprises achieve higher maturity and business outcomes. Role Overview As a Technical Lead – Data Engineer, you will play a pivotal role in designing, developing, and leading complex data projects on Google Cloud Platform and other modern data stacks. You will partner with cross-functional teams, drive architectural decisions, and ensure the delivery of scalable, high-performance data solutions aligned with business goals. Key Responsibilities Lead the design, development, and implementation of robust data pipelines, data warehouses, and cloud-based architectures. Collaborate with business and technical teams to identify problems, define methodologies, and deliver end-to-end data solutions. Own project modules, ensuring complete accountability for scope, design, and delivery. Develop technical roadmaps and architectural vision for data projects, making critical decisions on technology selection, design patterns, and implementation. Implement and optimize data governance frameworks on GCP. Integrate GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, GenAI) with platforms like Snowflake. Write efficient, production-grade code in Python, SQL, and ETL/orchestration tools. Utilize containerized solutions (Google Kubernetes Engine) for scalable deployments. Apply expertise in PySpark (batch and real-time), Kafka, and advanced data querying for high-volume, distributed data environments. Monitor, optimize, and troubleshoot system performance, ensuring parallelism, concurrency, and resilience. Reduce job run-times and resource utilization through architecture optimization. Develop and optimize data warehouses, including schema design and data modeling. Mentor team members, contribute as an individual contributor, and ensure successful project delivery. Required Skills & Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, or related field. Extensive hands-on experience with Google Cloud Platform data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, GenAI). Proven experience with Snowflake integration and data governance on GCP. Strong programming skills in Python, SQL, ETL, and orchestration tools. Proficiency in PySpark (batch and real-time), Kafka, and data querying tools. Experience with containerized solutions using Google Kubernetes Engine. Demonstrated ability to work with large, distributed datasets, optimizing for performance and scalability. Excellent communication skills for effective collaboration with internal teams and client stakeholders. Strong documentation skills, including the ability to articulate design and business objectives. Ability to balance short-term deliverables with long-term technical sustainability. Experience with AWS, Databricks, and other cloud data platforms. Prior leadership experience in data engineering teams. Exposure to AI/ML solution delivery in enterprise settings. 
Why Join Opportunity to lead high-impact data projects for a reputed client in a fast-growing data consulting environment. Work with cutting-edge technologies and global enterprise clients. Collaborative, innovative, and growth-oriented culture. Skills: cloud, dataflow, design, python, sql, snowflake, data, dataproc, google cloud, etl, orchestration tools, bigquery, cloud composer, pyspark, gcp, google kubernetes engine, genai, vertex ai studio, google cloud platform, kafka, data querying tools
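For a concrete feel of the BigQuery-from-Python work at the centre of this role, here is a minimal sketch using the google-cloud-bigquery client library. The project, dataset, and table names are hypothetical, and converting results to a DataFrame assumes the pandas extras for the library are installed.

```python
# Hedged sketch: run an analytical query in BigQuery and pull the result into pandas.
# Project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

sql = """
    SELECT campaign_id,
           DATE(event_ts) AS event_date,
           COUNT(DISTINCT user_id) AS unique_users
    FROM `example-analytics-project.marketing.events`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY campaign_id, event_date
"""

df = client.query(sql).to_dataframe()  # requires the BigQuery pandas extras
print(df.head())
```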
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics Purpose of the role Contributing to the Data Science efforts of AB InBev's global non-commercial analytics capability in Supply Chain Analytics. The candidate will be required to contribute and may also need to guide the DS team staffed on the area and assess the efforts required to scale and standardize the use of Data Science across multiple ABI markets. Key tasks & accountabilities Understand the business problem and translate that to an analytical problem; participate in the solution design process. Work with the Analytics Manager to create the project plan and design the analytics roadmap. Independently lead project delivery. End-to-end development of AI/ML models. Ability to communicate findings clearly to both technical and business stakeholders. Should be able to quantify the impact, and continuously implement improvements. Document every aspect of the project in standard ways. Qualifications, Experience, Skills Level of educational attainment required: B.Tech/BE, Master's in Statistics or Economics/Econometrics, or MBA. Previous work experience: Minimum 5 years of relevant experience. Preferred industry exposure: CPG or Consulting, with 5+ years (in the case of consulting, the typical profile would be a Lead Consultant with the relevant experience mentioned in the point below). Experience working in the Supply Chain Analytics domain preferred (assessed on the candidate's past companies), preferably in a CPG organization, with a demonstrated capability of successfully deploying analytics solutions and products for internal or external clients. Has interacted with senior internal or external stakeholders around project/service conceptualization and plan of delivery. Technical Skills Required Hands-on experience in data manipulation using Excel, Python, SQL. Expert-level proficiency in Python (knowledge of writing end-to-end ML or data pipelines in Python). Proficient in applying ML concepts and optimization techniques to solve end-to-end business problems. Familiarity with the Azure tech stack, Databricks, and MLflow in any cloud platform. Other Skills Required Demonstrated leadership skills. Passion for solving problems using data. Detail-oriented, analytical and inquisitive. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
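Illustrative only: an end-to-end scikit-learn pipeline in Python of the kind this role expects, trained on synthetic supply-chain-style data. Every column name and number here is made up for the sketch.

```python
# Hedged sketch: a small end-to-end ML pipeline (preprocessing + model) in scikit-learn
# on synthetic production data; all features and values are fabricated for illustration.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "plant": rng.choice(["BLR", "HYD", "PUN"], 500),
    "shift_hours": rng.uniform(6, 12, 500),
    "downtime_min": rng.uniform(0, 90, 500),
})
df["output_hl"] = 40 * df["shift_hours"] - 0.5 * df["downtime_min"] + rng.normal(0, 10, 500)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="output_hl"), df["output_hl"], random_state=0
)

model = Pipeline([
    ("prep", ColumnTransformer([
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["plant"]),
        ("num", StandardScaler(), ["shift_hours", "downtime_min"]),
    ])),
    ("reg", GradientBoostingRegressor(random_state=0)),
])

model.fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```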
Posted 5 days ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics Purpose of the role Contributing to the Data Science efforts of AB InBevʼs global commercial analytics capability of Sales & Distribution Analytics. Work with internal stakeholders at ABI to understand their business problems, translate those problems into statistical problems (such as survey design, experiment design, optimization, or forecasting) that can best address those business problems, and work with statistical experts to develop robust models and generate business insights. Key tasks & accountabilities Understand the business problem and translate it into an analytical problem; participate in the solution design process. Storyboarding and presenting the insights to stakeholders & senior leadership. Independently lead project delivery. Work with the Analytics Manager to create the project plan and design the analytics roadmap. End-to-end development and deployment of machine learning or deep learning models. Ability to communicate findings clearly to both technical and business stakeholders. Should be able to quantify the impact and continuously implement improvements. Document every aspect of the project in standard ways. Summarize insights and recommendations to be presented back to the business. Use innovative methods to continuously improve the quality of statistical models. Qualifications, Experience, Skills Level Of Educational Attainment Required Bachelor’s or Master’s degree in Engineering, Statistics, Applied Statistics, Economics, Econometrics, Operations Research, or any other quantitative discipline. Previous work experience required: 6+ years in a data science role, preferably in the CPG domain. Expert-level proficiency in Python (knowledge of classes and decorators; has written end-to-end ML, software, or data pipelines in Python). Experience working with SQL (knowledge of data warehouses, different databases, and fundamentals of RDBMS). Experience working with Azure (ADLS, Databricks, Azure SQL or Postgres, app services, and related services). Well versed in implementing machine learning and deep learning algorithms. Exposure to working with complex datasets and machine learning and DL libraries like scikit-learn, TensorFlow, Keras, PyTorch, etc. Capable of building insightful visualizations in Python. Good to have: category management, optimization techniques, knowledge of HTML, CSS, JS, YAML/Docker, and experience with Dash/Flask/Django or any other web application framework. Technical Skills Required Hands-on experience in data manipulation using Excel, Python, SQL.
Expert-level proficiency in Python (knowledge of writing end-to-end ML or data pipelines in Python). Proficient in the application of ML concepts and optimization techniques to solve end-to-end business problems. Familiarity with the Azure tech stack, Databricks, and MLflow on any cloud platform. Other Skills Required Demonstrated leadership skills. Passion for solving problems using data. Detail-oriented, analytical, and inquisitive. Ability to work independently and with others. Takes responsibility and makes effective decisions. Problem-solving. Planned and organized. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics Purpose of the role As a Senior Data Scientist , you will play a pivotal role in advancing AB InBev’s global Non-Commercial Analytics initiatives. You will lead data science efforts, guiding the team to scale and standardize AI-driven solutions across multiple ABI markets. This role requires a strong mix of hands-on technical expertise, stakeholder management, and a vision for driving AI innovation at scale. Key tasks & accountabilities Lead and execute end-to-end data science projects, from data collection and preprocessing to model development, deployment, and performance monitoring. Mentor and guide junior data scientists, fostering a culture of innovation and continuous learning. Work closely with cross-functional stakeholders to understand business needs, gather feedback, and align AI solutions with strategic objectives. Present complex insights to non-technical stakeholders, translating data-driven findings into actionable business strategies. Hands-on experience with AI/ML methodologies, including forecasting, clustering, regression, classification, optimization, deep learning, and Generative AI. Expertise in data manipulation using Python, SQL, and Excel. Strong understanding of Object-Oriented Programming (OOP), data structures, and algorithms for machine learning applications. Experience with OCR technologies (both open-source and enterprise-grade tools). Exposure to ML Ops and containerization tools like Docker is a plus. Familiarity with Azure Tech Stack, Databricks, and ML Flow. Drive the development of scalable AI/ML solutions that can be leveraged across multiple markets. Passion for building large-scale, AI-powered products, ensuring seamless integration with business workflows. Stay updated with emerging trends in AI and data science, continuously improving methodologies and tools. 3. Qualifications, Experience, Skills Level of educational attainment required (1 or more of the following) B.Tech/BE/ Masters in Statistics or Economics/ econometrics, MBA. Previous Work Experience Minimum 5 years of relevant experience. Technical Skills Required Strong programming skills in Python (ability to build end-to-end ML/data pipelines is a plus). Experience with Large Language Models (LLMs) and their application in solving business problems. Hands-on expertise in statistical analysis, ML concepts, and optimization techniques. Experience in building and deploying large-scale software applications. Proficiency in Git for version control. 
Strong problem-solving and critical thinking abilities. Passion for solving complex business challenges using data-driven insights. Detail-oriented, analytical, and highly inquisitive. Curious, fast learner, and a strong team player. Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Senior Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics Purpose of the role Contributing to the Data Science efforts of AB InBevʼs global non-commercial analytics capability of Supply Chain Analytics. The candidate will be required to contribute, and may also need to guide, the DS team staffed on the area and assess the effort required to scale and standardize the use of Data Science across multiple ABI markets. Key tasks & accountabilities Understand the business problem and translate it into an analytical problem; participate in the solution design process. Work with the Analytics Manager to create the project plan and design the analytics roadmap. Independently lead project delivery. End-to-end development of AI/ML models. Ability to communicate findings clearly to both technical and business stakeholders. Should be able to quantify the impact and continuously implement improvements. Document every aspect of the project in standard ways. Qualifications, Experience, Skills Level Of Educational Attainment Required B.Tech/BE, Master's in Statistics or Economics/Econometrics, or MBA. Previous Work Experience Minimum 5 years of relevant experience. Preferred industry exposure: CPG or Consulting with 5+ years (in the case of consulting, the typical profile would be a Lead Consultant with the relevant experience mentioned in the point below). Experience working in the Supply Chain Analytics domain preferred (assessment of the pillars to be made on the candidate's past companies), ideally in a CPG organization, with a demonstrated capability of successfully deploying analytics solutions and products for internal or external clients. Has interacted with senior internal or external stakeholders around project/service conceptualization and delivery planning. Technical Skills Required Hands-on experience in data manipulation using Excel, Python, SQL. Expert-level proficiency in Python (knowledge of writing end-to-end ML or data pipelines in Python). Proficient in the application of ML concepts and optimization techniques to solve end-to-end business problems. Familiarity with the Azure tech stack, Databricks, and MLflow on any cloud platform. Other Skills Required Demonstrated leadership skills. Passion for solving problems using data. Detail-oriented, analytical, and inquisitive. Ability to work independently and with others. And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 5 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Senior Manager Software Development Engineering – Safety ART – Tech Enablement Team What You Will Do Let’s do this. Let’s change the world. In this vital role you will become an influential strategic IT partner within the Global Patient Safety (GPS) organization. This role is accountable for end-to-end service delivery, leading and managing a team of engineers supporting multiple product teams for the Safety IT technology and its integrations. As we stride towards the future, this role will be pivotal in advancing the next generation of Safety platforms and supporting technologies. As Sr. Manager, you will lead a team delivering innovative solutions and shared technology services to support the Global Patient Safety (GPS) organization. This role combines technical leadership and strategic guidance to advance safety platforms, integrate innovative technologies, and drive efficiency and innovation across systems. Roles & Responsibilities: Oversee the delivery of shared services across the safety product teams, ensuring high-quality solutions and alignment with organizational goals. Act as a liaison between the team and product teams, ensuring technical solutions align with project requirements and timelines for the Global Patient Safety organization. Provide guidance on the design, development, and deployment of technical solutions. Ensure adherence to best practices in software engineering, including code quality, platform scalability, and system reliability. Stay updated on new technologies and trends to recommend innovative solutions. Take a leadership role in the management of the technical enablement team and contract workers (CWs) by educating, motivating, and guiding them in the delivery and maintenance of this service to enrich the business area strategy. Champion innovation to elevate Amgen's Safety systems, empowering the business to improve its processes, efficiency, and effectiveness. This involves integrating cognitive capabilities to thoughtfully automate traditional, manual processes. Promote the adoption of global technology capabilities and standards to address complex business challenges. Lead strategic and operational activities, including securing funding, managing RFP processes, assessing solution options, forecasting, resource and demand planning, vendor coordination, and overseeing run and build operations across product teams. Oversee, mentor, and lead a dedicated team of engineers. Improve procedures associated with Maintenance and Support. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master’s degree with 8 to 10 years of Computer Science, IT or related field experience OR Bachelor’s degree with 10 - 14 years in Computer Science, IT or related field experience OR Diploma and 14 to 18 years of experience in Computer Science and Engineering preferred; other Engineering field experience will be considered Preferred Qualifications: 5+ years of proven experience in the IT industry with work experience in Analysis, Design, Development, Testing and Maintenance of software applications in a SaaS environment or in a Health & Life Sciences industry. 5+ years of experience in a technical or engineering leadership role Experience working on and leading major programs/projects that involve multiple partners and external vendors, leading them end-to-end from initiation to project closure Knowledge of Artificial Intelligence (AI), Robotic Process Automation (RPA), Machine Learning (ML), Natural Language Processing (NLP) and Natural Language Generation (NLG) automation technologies, with experience building business requirements Strong communication skills in writing, speaking, and making pitches to various audiences in a clear and concise manner Experience in database programming and knowledge of concepts in SQL (e.g., Oracle or Postgres) Experienced with cloud computing technologies (e.g., AWS, Azure) and integration technologies (e.g., MuleSoft, Databricks). Experience in DevOps and the Scaled Agile Framework (SAFe), especially in a regulated setting, including the ability to lead the transformation of teams from a service-based to a product-based model Experience with Quality Control and Quality Assurance processes and systems Demonstrable experience in group facilitation – ability to guide teams to make decisions and achieve results within agreed parameters & timescales Strong background in conflict resolution and fostering teamwork between technical teams and customers In-depth experience in all aspects of the SDLC, from requirements, design, testing, data analysis and the Change Control process, combined with experience developing project charters, statements of work and project financials Experienced in leading vendor relationships, contract negotiations, and ensuring alignment with long-term technology solutions An ongoing commitment to learning and staying at the forefront of AI/ML advancements. Good-to-Have Skills: Demonstrated expertise in a scientific domain area and related technology needs Understanding of scientific data strategy, data governance, data infrastructure Experience with stakeholder management, leading a team of 20, ensuring seamless coordination across teams and driving the successful delivery of technical projects Familiarity with data analytics and visualization platforms such as Databricks, Spotfire, Tableau, Power BI, and Cognos, combined with strong programming skills in languages like SQL and Python for data processing and analysis. Experience creating impactful slide decks and presenting data Ability to drive projects/company initiatives using Agile methodology We understand that to successfully sustain and grow as a global enterprise and deliver for patients — we must ensure a diverse and inclusive work environment. Extensive experience in managing and delivering technology solutions in a GxP environment Knowledge of drug safety databases and tools such as Argus or ArisG, including an understanding of adverse event reporting requirements. Professional Certifications: SAFe for Teams certification (preferred) Advanced certifications in cloud technologies (e.g.,
AWS Solutions Architect) Soft Skills: Excellent analytical and troubleshooting skills Strong verbal and written communication skills Ability to work effectively with global teams High degree of initiative and self-motivation Ability to lead multiple priorities successfully Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills in translating technical insights into impactful narratives for senior executives Ability to deal with ambiguity and think on their feet Ability to influence and drive to an intended outcome Ability to hold team members accountable to commitments Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 5 days ago
8.0 - 13.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist IS Bus Sys Analyst, Neural Nexus What You Will Do Let’s do this. Let’s change the world. In this vital role you will support the delivery of emerging AI/ML capabilities within Amgen's Neural Nexus program. As part of the Commercial Technology Data & Analytics team, you will collaborate with product owners and cross-functional partners to help design, implement, and iterate on a layered ecosystem passionate about DIAL (Data, Insights, Action, and Learning). Collaborate with the Commercial Data & Analytics (CD&A) team to help realize business value through the application of commercial data and emerging AI/ML technologies. Support delivery activities within the Scaled Agile Framework (SAFe), partnering with Engineering and Product Management to shape roadmaps, prioritize releases, and maintain a refined product backlog. Contribute to backlog management by helping break down Epics into Features and Sprint-ready User Stories, ensuring clear articulation of requirements and well-defined Acceptance Criteria and Definitions of Done. Ensure non-functional requirements are represented and prioritized within the backlog to maintain performance, scalability, and compliance standards. Collaborate with UX to align technical requirements, business processes, and scenarios with user-centered design. Assist in the development and delivery of engaging product demonstrations for internal and external partners. Support documentation efforts to maintain accurate records of system configurations, processes, and enhancements. Contribute to the launch and growth of Neural Nexus product teams focused on data connectivity, predictive modeling, and fast-cycle value delivery for commercial teams. Provide input for governance discussions and help prepare materials to support executive alignment on technology strategy and investment. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Specialist IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders with these qualifications. 
Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree with 8 to 13 years of experience in Information Systems Experience with writing user requirements and acceptance criteria Affinity to work in a DevOps environment and an Agile mindset Ability to work in a team environment, effectively interacting with others Ability to meet deadlines and schedules and be accountable Must-Have Skills Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies Experience with Agile software development methodologies (Scrum) Superb communication skills and the ability to work with senior leadership with confidence and clarity Has experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA Experience in managing product features for PI planning and developing product roadmaps and user journeys Good-to-Have Skills: Demonstrated expertise in data and analytics and related technology concepts Understanding of data and analytics software systems strategy, governance, and infrastructure Familiarity with low-code, no-code test automation software Technical thought leadership Able to communicate technical or complex subject matters in business terms Jira Align experience Experience with DevOps, Continuous Integration, and Continuous Delivery methodology Soft Skills: Able to work under minimal supervision Excellent analytical and gap/fit assessment skills Strong verbal and written communication skills Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills Technical Skills: Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) and AWS (or similar cloud-based platforms) Experience with design patterns, data structures, test-driven development Knowledge of NLP techniques for text analysis and sentiment analysis What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 5 days ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Specialist IS Business Analyst, Conversational AI What You Will Do Let’s do this. Let’s change the world. In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations. Collaborate with Business SMEs, Data Engineers and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Conversational AI product team Become a domain authority in Conversational AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System Lead the voice of the customer assessment to define business processes and product needs Work with Product Managers and customers to define scope and value for new developments Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product backlog Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs Facilitate the breakdown of Epics into Features and Sprint-Sized User Stories and participate in backlog reviews with the development team Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog Ensure Acceptance Criteria and Definition of Done are well-defined Work closely with UX to align technical requirements, scenarios, and business process maps with User Experience designs Develop and implement effective product demonstrations for internal and external partners Maintain accurate documentation of configurations, processes, and changes Understand end-to-end data pipeline design and dataflow Apply knowledge of data structures to diagnose data issues for resolution by data engineering team Implement and supervise performance of Extract, Transform, and Load (ETL) jobs What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Specialist IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders with these qualifications. 
Basic Qualifications: Master’s degree with 4 - 6 years of experience in Information Systems OR Bachelor’s degree with 6 - 8 years of experience in Information Systems OR Diploma with 10 - 12 years of experience in Information Systems Experience with writing user requirements and acceptance criteria Affinity to work in a DevOps environment and an Agile mindset Ability to work in a team environment, effectively interacting with others Ability to meet deadlines and schedules and be accountable Preferred Qualifications: Must-Have Skills Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies Experience with Agile software development methodologies (Scrum) Superb communication skills and the ability to work with senior leadership with confidence and clarity Has experience with writing user requirements and acceptance criteria in agile project management systems such as JIRA Experience in managing product features for PI planning and developing product roadmaps and user journeys Good-to-Have Skills: Demonstrated expertise in data and analytics and related technology concepts Understanding of data and analytics software systems strategy, governance, and infrastructure Familiarity with low-code, no-code test automation software Technical thought leadership Able to communicate technical or complex subject matters in business terms Jira Align experience Experience with DevOps, Continuous Integration, and Continuous Delivery methodology Soft Skills: Able to work under minimal supervision Excellent analytical and gap/fit assessment skills Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation Ability to manage multiple priorities successfully Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills Technical Skills: Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) and AWS (or similar cloud-based platforms) Experience with design patterns, data structures, test-driven development Knowledge of NLP techniques for text analysis and sentiment analysis What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
Please contact us to request accommodation.
Posted 5 days ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting-edge of innovation, using technology and human genetic data to push beyond what’s known today. About The Role Role Description: We are seeking an experienced MDM Data Analyst with 5–8 years of experience in the development, implementation, and operations of Master Data Management (MDM) platforms, with hands-on experience in Informatica IDQ and Informatica MDM. This role will involve hands-on implementation of MDM solutions using IDQ and Informatica MDM. To succeed in this role, the candidate must have strong IDQ and Informatica MDM technical experience. Roles & Responsibilities: Develop and implement MDM solutions using Informatica IDQ and Informatica MDM platforms. Define enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows. Match/Merge and Survivorship strategy and implementation (a minimal illustrative sketch follows this posting). Design and delivery of MDM processes and data integrations using Unix, Python, and SQL. Collaborate with the backend data engineering team and the frontend custom UI team for robust integrations and a seamless, enhanced user experience, respectively. Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals. Establish data quality metrics and monitor compliance using automated profiling and validation tools. Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs). Ensure data integrity, lineage, and traceability across MDM pipelines and solutions. Basic Qualifications and Experience: Master’s degree with 4 - 6 years of experience in Business, Engineering, IT or related field OR Bachelor’s degree with 5 - 8 years of experience in Business, Engineering, IT or related field OR Diploma with 10 - 12 years of experience in Business, Engineering, IT or related field Functional Skills: Must-Have Skills: Deep knowledge of MDM tools (Informatica MDM) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines Very good understanding of reference data, hierarchies, and their integration with MDM Hands-on experience with custom workflows (AVOS, Eclipse, etc.) Strong experience with external data enrichment services such as AddressDoctor Strong experience with match/merge and survivorship rules strategy and implementation Strong experience with group fields, cross-reference data, and UUIDs Strong understanding of AWS cloud services and Databricks architecture. Proficiency in Python, SQL, and Unix for data processing and orchestration. Experience with data modeling, governance, and DCR lifecycle management (AVOS). Proven leadership and project management in large-scale MDM implementations. Able to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations Must have worked on at least 3 end-to-end MDM implementations Hands-on Unix and advanced SQL Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data.
Professional Certifications: Any MDM certification (e.g., Informatica) Any Data Analysis certification (SQL) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Role GCF: 04A
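Note for readers comparing MDM approaches: the match/merge and survivorship work described in the listing above is normally configured inside the Informatica MDM hub rather than hand-coded. Purely as an illustration of the underlying idea, the sketch below shows source-trust survivorship in plain Python; every source name, trust ranking, field, and sample record is a hypothetical placeholder and is not taken from the posting.

```python
# Minimal illustration of source-trust survivorship for a "golden record".
# All source names, trust rankings, and fields below are hypothetical;
# in practice this logic is configured declaratively in the MDM hub.

from datetime import date

# Lower number = more trusted source for an attribute (assumed ranking).
SOURCE_TRUST = {"CRM": 1, "ERP": 2, "WEB_FORM": 3}

def survive(records, fields):
    """Build a golden record by picking, per field, the non-null value
    from the most trusted source; ties fall back to the latest update."""
    golden = {}
    for field in fields:
        candidates = [r for r in records if r.get(field) not in (None, "")]
        if not candidates:
            golden[field] = None
            continue
        best = min(
            candidates,
            key=lambda r: (SOURCE_TRUST.get(r["source"], 99),
                           -r["updated"].toordinal()),
        )
        golden[field] = best[field]
    return golden

if __name__ == "__main__":
    matched_cluster = [
        {"source": "WEB_FORM", "name": "A. Kumar", "email": "a.kumar@example.com",
         "phone": None, "updated": date(2024, 5, 1)},
        {"source": "CRM", "name": "Arun Kumar", "email": None,
         "phone": "+91-98xxxxxx00", "updated": date(2024, 3, 15)},
    ]
    print(survive(matched_cluster, ["name", "email", "phone"]))
```

The same principle — the most trusted source wins per attribute, with recency as a tie-breaker — is what the hub's survivorship rules typically encode.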
Posted 5 days ago
8.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Principal Data Engineer What You Will Do Let’s do this. Let’s change the world. We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms, while mentoring and guiding a team of data engineers. Roles & Responsibilities: Possesses strong rapid prototyping skills and can quickly translate concepts into working code. Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies. Design, develop, and implement robust data architectures and platforms to support business objectives. Oversee the development and optimization of data pipelines and data integration solutions. Establish and maintain data governance policies and standards to ensure data quality, security, and compliance. Architect and manage cloud-based data solutions, using AWS or other preferred platforms. Lead and motivate an impactful data engineering team to deliver exceptional results. Identify, analyze, and resolve complex data-related challenges. Collaborate closely with business collaborators to understand data requirements and translate them into technical solutions. Stay abreast of emerging data technologies and explore opportunities for innovation. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree and 8 to 10 years of Computer Science and Engineering experience preferred (other Engineering fields considered) OR Bachelor’s degree and 10 to 14 years of Computer Science and Engineering experience preferred (other Engineering fields considered) OR Diploma and 14 to 18 years of Computer Science and Engineering experience preferred (other Engineering fields considered) Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions. Strong understanding of cloud architecture principles and cost optimization strategies. Proficient in Python, PySpark, and SQL. Hands-on experience with big data ETL performance tuning (a brief illustrative sketch follows this posting). Proven ability to lead and develop impactful data engineering teams. Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.
Preferred Qualifications: Experienced with data modeling and performance tuning for both OLAP and OLTP databases Experienced with Apache Spark, Apache Airflow Experienced with software engineering best-practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven etc.), automated unit testing, and Dev Ops Experienced with AWS, GCP or Azure cloud services Professional Certifications AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals Strong presentation and public speaking skills. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
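The big data ETL performance tuning experience this listing asks for tends to come down to a few recurring Spark patterns. As a rough, generic sketch (the paths and column names below are invented and not tied to any Amgen system), the snippet shows two of them: broadcasting a small dimension table to avoid a shuffle join, and partitioning output by a date column so downstream reads can prune partitions.

```python
# Generic PySpark sketch of two common ETL tuning patterns:
# a broadcast join for a small dimension table and partitioned Parquet output.
# Paths and column names are placeholders, not real systems.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-tuning-sketch").getOrCreate()

facts = spark.read.parquet("s3://example-bucket/raw/sales/")   # large fact data
sites = spark.read.parquet("s3://example-bucket/ref/sites/")   # small dimension

enriched = (
    facts
    .join(F.broadcast(sites), on="site_id", how="left")  # avoid shuffling the large side
    .withColumn("load_date", F.to_date("event_ts"))
)

(
    enriched
    .repartition("load_date")            # group tasks by output partition
    .write.mode("overwrite")
    .partitionBy("load_date")            # enable partition pruning at read time
    .parquet("s3://example-bucket/curated/sales/")
)
```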
Posted 5 days ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Sr Mgr Software Development Engineering What You Will Do Let’s do this. Let’s change the world. In this vital role you will provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions. Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source format, and current technologies. Develop, refactor, research and improve Weave cloud platform capabilities. Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide them the best service. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Build strong partnerships with partners. Build data products and service processes which perform data transformation, metadata extraction, workload management, and error processing management to ensure high-quality data. Provide clear documentation for delivered solutions and processes, integrating documentation. Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver to those needs. Work with multi-functional teams to design and document effective and efficient solutions. Develop organizational change strategies and assist in their implementation. Mentor junior data engineers on standard processes in the industry and in the Amgen data landscape. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of relevant experience Must-Have Skills: Superb communication and interpersonal skills, with the ability to work closely with multi-functional GTM, product, and engineering teams. Minimum of 10+ years of overall Software Engineer or Cloud Architect experience Minimum 3+ years in an architecture role using public cloud solutions such as AWS Experience with the AWS technology stack Good-to-Have Skills: Familiarity with big data technologies, AI platforms, and cloud-based data solutions. Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams. Passion for technology and customer success, particularly in driving innovative AI and data solutions.
Experience working with teams of data scientists, software engineers and business experts to drive insights Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway. Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) Solid understanding of relevant data standards and industry trends Ability to understand new business requirements and prioritize them for delivery Experience working in the biopharma/life sciences industry Proficient in one of the coding languages (Python, Java, Scala) Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.). Experience with schema design & dimensional data modeling. Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting Hands-on experience using Databricks/Jupyter or a similar notebook environment. Experience working with GxP systems Experience working in an agile environment (i.e., user stories, iterative development, etc.) Experience working with test-driven development and software test automation Experience working in a Product environment Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards. Soft Skills: Excellent analytical and problem-solving skills. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 5 days ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. What You Will Do In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Design, build, and support data ingestion, transformation, and delivery pipelines across structured and unstructured sources within the enterprise data engineering environment. Manage and monitor day-to-day operations of the data engineering environment, ensuring high availability, performance, and data integrity. Collaborate with data architects, data governance, platform engineering, and business teams to support data integration use cases across R&D, Clinical, Regulatory, and Commercial functions. Integrate data from laboratory systems, clinical platforms, regulatory systems, and third-party data sources into enterprise data repositories. Implement and maintain metadata capture, data lineage, and data quality checks across pipelines to meet governance and compliance requirements. Support real-time and batch data flows using technologies such as Databricks, Kafka, Delta Lake, or similar. Work within GxP-aligned environments, ensuring compliance with data privacy, audit, and quality control standards. Partner with data stewards and business analysts to support self-service data access, reporting, and analytics enablement. Maintain operational documentation, runbooks, and process automation scripts for continuous improvement of data fabric operations. Participate in incident resolution and root cause analysis, ensuring timely and effective remediation of data pipeline issues. Create documentation, playbooks, and best practices for metadata ingestion, data lineage, and catalog usage. Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value. Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle. Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions. Must-Have Skills: Build and maintain data pipelines to ingest and update metadata into enterprise data catalog platforms in biotech, life sciences, or pharma (see the illustrative sketch after this posting). Hands-on experience in data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies. Proficiency in workflow orchestration and performance tuning of big data processing. Experience in data engineering, data operations, or related roles, with at least 2+ years in life sciences, biotech, or pharmaceutical environments. Experience with cloud platforms (e.g., AWS, Azure, or GCP) for data pipeline and storage solutions. Understanding of data governance frameworks, metadata management, and data lineage tracking. Strong problem-solving skills, attention to detail, and ability to manage multiple priorities in a dynamic environment. Effective communication and collaboration skills to work across technical and business stakeholders. Strong problem-solving and analytical skills Excellent communication and teamwork skills Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices. Preferred Qualifications: Data engineering experience in the biotechnology or pharma industry Experience in writing APIs to make the data available to consumers Experienced with SQL/NoSQL databases and vector databases for large language models Experienced with data modeling and performance tuning for both OLAP and OLTP databases Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps Basic Qualifications: Master’s degree and 3 to 4+ years of Computer Science, IT or related field experience OR Bachelor’s degree and 5 to 8+ years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Scaled Agile SAFe certification preferred Soft Skills: Excellent verbal and written communication skills. High degree of professionalism and interpersonal skills. Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills What You Can Expect Of Us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients.
Together, we compete in the fight against serious disease. EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
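For context on the pipeline responsibilities described in the listing above, here is a minimal, purely illustrative sketch of a batch ingestion step that merges a daily extract into a Delta Lake table. It assumes a Spark cluster with the Delta Lake Python API available; all paths, table names, and key columns are hypothetical placeholders, not Amgen systems.

```python
# Illustrative batch ingestion into a Delta table with an idempotent upsert.
# Paths, table names, and key columns are placeholders, not real systems.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

TARGET_PATH = "s3://example-bucket/silver/lab_results/"

# 1. Read the day's raw extract and apply light standardisation.
incoming = (
    spark.read.json("s3://example-bucket/landing/lab_results/2024-06-01/")
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["sample_id"])
)

# 2. Merge into the curated Delta table so reruns stay idempotent.
if DeltaTable.isDeltaTable(spark, TARGET_PATH):
    target = DeltaTable.forPath(spark, TARGET_PATH)
    (
        target.alias("t")
        .merge(incoming.alias("s"), "t.sample_id = s.sample_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    incoming.write.format("delta").mode("overwrite").save(TARGET_PATH)
```

An idempotent merge like this lets the job be rerun after a failure without duplicating rows, which is the kind of operational robustness the posting's incident-resolution and GxP-compliance points imply.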
Posted 5 days ago