Jobs
Interviews

5786 Databricks Jobs - Page 43

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the job portal.

12.0 - 17.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What You Will Do

Let’s do this. Let’s change the world. In this vital role, you will lead the engagement model between Amgen's Technology organization and Global Commercial Operations.
- Collaborate with pharma/biotechnology operations (supply chain, manufacturing, quality) business SMEs, data engineers, data scientists, and product managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI product teams
- Become a pharma/biotechnology operations domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System
- Lead voice-of-the-customer assessments to define business processes and product needs
- Work with product managers and customers to define scope and value for new developments
- Collaborate with Engineering and Product Management to prioritize release scopes and refine the product backlog
- Ensure non-functional requirements are included and prioritized in the product and release backlogs
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories, and participate in backlog reviews with the development team
- Clearly express features in user stories/requirements so all team members and partners understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well defined
- Work closely with business SMEs, data scientists, and ML engineers to understand requirements around data products, KPIs, etc.
- Analyze source systems and create source-to-target mapping (STTM) documents
- Develop and deliver effective product demonstrations for internal and external partners
- Maintain accurate documentation of configurations, processes, and changes
- Understand end-to-end data pipeline design and dataflow
- Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team

What We Expect Of You

We are all different, yet we all use our unique contributions to serve patients.
We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders.

Basic Qualifications:
- 12 to 17 years of Information Systems experience in pharma/biotechnology operations (supply chain, manufacturing, quality)
- Mandatory experience acting as a business analyst for DWH, data product building, and BI & analytics applications
- Experience analyzing the requirements of BI, AI & analytics applications, and working with data source SMEs and data owners to identify data sources and data flows
- Experience writing user requirements and acceptance criteria
- Affinity for working in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable

Preferred Qualifications:

Must-Have Skills:
- Excellent problem-solving skills and a passion for solving complex challenges for AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
- Experience managing product features for PI planning and developing product roadmaps and user journeys

Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology

Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) and AWS or similar cloud platforms
- Experience with design patterns, data structures, and test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis

What You Can Expect Of Us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
Please contact us to request accommodation.

Posted 1 week ago

Apply

5.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

Sr. Data Analytics Engineer: Power mission-critical decisions with governed insights

Ajmera Infotech builds planet-scale software for NYSE-listed clients, driving decisions that can’t afford to fail. Our 120-engineer team specializes in highly regulated domains—HIPAA, FDA, SOC 2—and delivers production-grade systems that turn data into strategic advantage.

Why You’ll Love It
- End-to-end impact: build full-stack analytics from lakehouse pipelines to real-time dashboards.
- Fail-safe engineering: TDD, CI/CD, DAX optimization, Unity Catalog, cluster tuning.
- Modern stack: Databricks, PySpark, Delta Lake, Power BI, Airflow.
- Mentorship culture: lead code reviews, share best practices, grow as a domain expert.
- Mission-critical context: help enterprises migrate legacy analytics into cloud-native, governed platforms.
- Compliance-first mindset: work in HIPAA-aligned environments where precision matters.

Key Responsibilities
- Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
- Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
- Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
- Deliver robust Power BI solutions: dashboards, semantic layers, paginated reports, DAX.
- Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
- Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
- Document everything, from pipeline logic to RLS rules, in Git-controlled formats.
- Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
- Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills
- 5+ years in analytics engineering, with 3+ in production Databricks/Spark contexts.
- Advanced SQL (incl. windowing); expert PySpark, Delta Lake, Unity Catalog.
- Power BI mastery: DAX optimization, security rules, paginated reports.
- SSRS-to-Power BI migration experience (RDL logic replication).
- Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
- Communication skills to bridge technical and business audiences.

Nice-to-Have Skills
- Databricks Data Engineer Associate certification.
- Streaming pipeline experience (Kafka, Structured Streaming).
- dbt, Great Expectations, or similar data quality frameworks.
- BI diversity: experience with Tableau, Looker, or similar platforms.
- Cost governance familiarity (Power BI Premium capacity, Databricks chargeback).

Benefits & Call-to-Action
Ajmera offers competitive compensation, flexible schedules, and a deeply technical culture where engineers lead the narrative. If you’re driven by reliable, audit-ready data products and want to own systems from raw ingestion to KPI dashboards, apply now and engineer insights that matter.
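The "Advanced SQL (incl. windowing)" requirement above refers to queries like running totals and rankings computed with window functions. A minimal sketch using Python's built-in SQLite, with an invented sales table (the table and columns are illustrative, not from any Ajmera system):

```python
import sqlite3

# Illustrative window-function query: a per-region running total,
# the kind of pattern "Advanced SQL (incl. windowing)" refers to.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("N", "2024-01", 100.0), ("N", "2024-02", 150.0),
     ("S", "2024-01", 80.0), ("S", "2024-02", 60.0)],
)

rows = conn.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (
               PARTITION BY region ORDER BY month
           ) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for r in rows:
    print(r)
```

The `PARTITION BY region ORDER BY month` frame restarts the cumulative sum for each region, something a plain `GROUP BY` cannot express without a self-join.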

Posted 1 week ago

Apply

0 years

0 Lacs

Krishnagiri, Tamil Nadu, India

On-site

- Experience with cloud databases and data warehouses (AWS Aurora, RDS/PG, Redshift, DynamoDB, Neptune).
- Building and maintaining scalable real-time database systems using the AWS stack (Aurora, RDS/PG, Lambda) to enhance business decision-making capabilities.
- Providing valuable insights and contributing to the design, development, and architecture of data solutions.
- Experience using various design and coding techniques to improve query performance.
- Expertise in performance optimization, capacity management, and workload management.
- Working knowledge of relational database internals (locking, consistency, serialization, recovery paths).
- Awareness of customer workloads and use cases, including performance, availability, and scalability.
- Monitoring database health and promptly identifying and resolving issues.
- Maintaining comprehensive documentation for databases, business continuity plans, cost usage, and processes.
- Proficiency in using Terraform or Ansible for database provisioning and infrastructure management.
- Nice-to-have: expertise in Python, Databricks, Apache Airflow, Google Cloud Platform (GCP), and Microsoft Azure.
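One common "design and coding technique to improve query performance" mentioned above is adding an index and confirming the planner actually uses it. A hedged sketch with SQLite's `EXPLAIN QUERY PLAN` (a hypothetical orders table; the engines in the listing, such as Aurora or Redshift, expose analogous EXPLAIN facilities):

```python
import sqlite3

# Hypothetical example: compare the query plan before and after indexing.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute the statement;
    # the human-readable detail is the last column of each row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # index lookup

print(before)
print(after)
```

The "before" plan reports a table scan; after the index is created, the planner switches to an index search, which is the behavior change worth verifying rather than assuming.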

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Responsibilities:
- Be part of the DevOps platform team that owns the Data Reservoir platform code, tools, and processes.
- Participate in the design and architecture of big data solutions.
- Design, develop, and maintain data patterns.
- Design, develop, and maintain automation testing frameworks.
- Apply expertise in Git and CI/CD.
- Optimize and tune Spark code to ensure high performance and scalability.
- Automate processes and react quickly to continuously improve processes.
- Write clear and concise documentation for Python/Spark-developed code.

Requirements:
- Experience working in a DevOps setup.
- Experience working with Git repositories.
- Experience with Spark SQL and Spark Streaming.
- Experience with batch and streaming data processing using Spark.
- Experience building RESTful APIs is a plus.
- Experience using databases such as DB2, Oracle, Hadoop, Hive, and Postgres.
- High levels of ownership and accountability for undertaken tasks.
- Strong problem-solving and analytical skills.
- Excellent written and verbal communication skills.
- Ability to work independently as well as part of a team.
- Strong attention to detail and accuracy.
- Strong knowledge of Agile methodologies.
- Exposure to Cloudera and the Azure platform (Microsoft Fabric/Databricks/Data Factory/Synapse) is an advantage.

Experience: 5+ years
Qualification: BTech/BE
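The "automate processes and react quickly" responsibility above typically involves making pipeline steps self-healing before alerting a human. A hedged, generic sketch of retry-with-backoff around a flaky step (the failing step is simulated; real DevOps tooling such as the orchestrator's own retry policy would usually be preferred):

```python
import time

# Generic sketch: retry a flaky pipeline step with exponential backoff,
# re-raising the final failure so alerting can pick it up.
def run_with_retry(step, attempts=3, base_delay=0.01):
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == attempts:
                raise                       # surface for alerting
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated step that fails twice before succeeding.
calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retry(flaky_step)
print(result, "after", calls["n"], "attempts")
```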

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Team

Data is at the core of Outreach's strategy. It drives ourselves and our customers to the highest levels of success. We use it for everything from customer health scores and revenue dashboards to operational metrics of our AWS infrastructure, to helping increase product engagement and user productivity through natural language understanding, to predictive analytics and causal inference via experimentation. As our customer base continues to grow, we are looking toward new ways of leveraging our data to understand our customers’ needs more deeply and deliver new products and features that continuously improve their customer engagement workflows. The mission of the Data Science team is to enable such continuous optimization by reconstructing customer engagement workflows from data, developing metrics to measure the success and efficiency of these workflows, and providing tools to support their optimization. As a member of the team, you will work closely with other data scientists, machine learning engineers, and application engineers to define and implement our strategy for delivering on this mission.

Your Daily Adventures Will Include
- Design, implement, and improve machine learning systems.
- Contribute to machine learning applications end to end, i.e., from research to prototype to production.
- Work with product managers, designers, and customers to define the vision and strategy for a given product.

Our Vision Of You
- A hybrid data science engineer who can navigate both sides with little help from others.
- You understand the typical lifecycle of machine learning product development, from inception to production. Experience in Gen AI application/agent development is a plus.
- You have strong programming skills in at least one programming language (Python, Golang, etc.). Experience with frameworks like LangChain or the OpenAI Agents SDK is a plus.
- You have experience building microservices. Experience with Golang is a plus.
- You have substantial experience building and managing infrastructure for deploying and running ML models in production.
- You have experience working with distributed data processing frameworks such as Spark. Experience with Spark's MLlib, AWS, Databricks, and MLflow is a plus.
- You have knowledge of statistics and machine learning, and practical experience applying it to solve real-world problems.
- You are hands-on, able to quickly pick up new tools and languages, and excited about building things and experimenting. You go above and beyond to help your team.
- You are able to work alongside experienced engineers, designers, and product managers to help deliver new customer-facing features and products.
- You have a degree in Computer Science, Data Science, or a related field, and 4-6 years of industry or equivalent experience.

Posted 1 week ago

Apply

0 years

10 - 18 Lacs

Pune, Maharashtra, India

On-site

Hybrid work mode.

(Azure) EDW experience loading star schema data warehouses using framework architectures, including experience loading type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.

Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure cloud services knowledge, data analysis (SQL), data warehousing, documentation (BRD, FRD, user story creation).

Skills: Windows Azure, SQL Azure, SQL, Data Warehouse (DWH), Data Analytics, Python, Star Schema, and Data Warehousing
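"Loading type 2 dimensions," mentioned above, means that when a tracked attribute changes, the current dimension row is end-dated and a new current version is inserted, preserving history. A hedged sketch of that logic in plain Python (the customer dimension layout is invented; in the stack described this would typically be a Databricks MERGE):

```python
from datetime import date

# Illustrative SCD type 2 load: close out the current row on change,
# insert a new current version. Dimension layout is hypothetical.
def apply_scd2(dim_rows, incoming, today):
    """dim_rows: dicts with customer_id, city, valid_from, valid_to, is_current."""
    for new in incoming:
        current = next(
            (r for r in dim_rows
             if r["customer_id"] == new["customer_id"] and r["is_current"]),
            None,
        )
        if current is None:
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
        elif current["city"] != new["city"]:
            current["valid_to"] = today        # end-date the old version
            current["is_current"] = False
            dim_rows.append({**new, "valid_from": today,
                             "valid_to": None, "is_current": True})
        # unchanged incoming rows leave the dimension untouched

dim = [{"customer_id": 1, "city": "Pune",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
apply_scd2(dim, [{"customer_id": 1, "city": "Mumbai"}], date(2024, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["city"])
```

The key design point is that history is never updated in place: queries "as of" a date filter on `valid_from`/`valid_to` instead.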

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: ML Ops Support Engineer

*Job description:*
We are looking for a skilled MLOps Support Engineer to join our team. This role involves monitoring and managing ML model operational pipelines in AzureML and MLflow, with an emphasis on automation, integration validation, and CI/CD pipeline management. The ideal candidate will be technically sound in Python, Azure CLI, and MLOps tools, and capable of ensuring stability and reliability in model deployment lifecycles.

*Objectives of the role:*
- Support and monitor MLOps pipelines in AzureML and MLflow
- Manage CI/CD pipelines for model deployment and updates
- Handle model registry processes, ensuring best practices for versioning and tracking
- Perform testing & validation of integrated endpoints to ensure non-functional stability
- Automate monitoring and upkeep of ML pipelines to relieve the data science team
- Troubleshoot and resolve pipeline and integration-related issues

*Responsibilities:*
- Support production ML pipelines using AzureML and MLflow
- Configure and manage model versioning and registry lifecycle
- Automate alerts, monitoring tasks, and routine pipeline operations
- Validate REST API endpoints for ML models
- Implement CI/CD workflows for ML deployments
- Document and troubleshoot operational issues related to ML services
- Collaborate with data scientists and platform teams to ensure delivery continuity

*Required Skills & Qualifications:*
- Proficiency in AzureML, MLflow, and Databricks
- Strong command of Python
- Experience with Azure CLI and scripting
- Good understanding of CI/CD practices in MLOps
- Knowledge of model registry management and deployment validation
- 3–5 years of relevant experience in MLOps environments

Skills that are good to have, but not mandatory:
- Exposure to monitoring tools (e.g., Azure Monitor, Prometheus)
- Experience with REST API testing (e.g., Postman)
- Familiarity with Docker/Kubernetes in ML deployments
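The "validate REST API endpoints for ML models" duty above usually means checking that a scoring endpoint's JSON response keeps the shape its callers depend on. A hedged sketch of such a contract check; the response schema (`predictions`, `model_version`) is invented for illustration, and a real check would call the live scoring URI and be driven by the model's actual contract:

```python
# Hypothetical response-contract validator for an ML scoring endpoint.
def validate_prediction_response(payload):
    errors = []
    if not isinstance(payload.get("predictions"), list):
        errors.append("'predictions' must be a list")
    else:
        for i, p in enumerate(payload["predictions"]):
            if not isinstance(p, (int, float)):
                errors.append(f"prediction {i} is not numeric")
    if "model_version" not in payload:
        errors.append("missing 'model_version'")
    return errors

good = {"predictions": [0.12, 0.87], "model_version": "3"}
bad = {"predictions": ["yes"]}
print(validate_prediction_response(good))
print(validate_prediction_response(bad))
```

Running such checks after every deployment (e.g., as a CI/CD gate) is what turns endpoint validation into "non-functional stability" rather than a one-off smoke test.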

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are hiring for a Data Engineer role and are looking for professionals with strong expertise in Spark and SQL to join our dynamic team. This position offers the opportunity to work on modern data platforms with technologies like Azure Synapse, Databricks, and Apache Spark (PySpark).

Key Responsibilities:
- Designing and developing scalable data pipelines using Azure Synapse, Databricks, and PySpark
- Integrating data from various sources with a focus on quality and consistency
- Optimizing workflows for performance and cost-efficiency
- Collaborating with cross-functional teams to deliver reliable data solutions
- Monitoring pipelines and ensuring data integrity
- Documenting workflows and best practices

Skills We're Looking For:
- Technical: PySpark, SQL, Azure Synapse, Databricks, Delta Live Tables, ETL processes, cloud platforms (Azure, AWS, GCP)
- Soft skills: strong problem-solving, communication, and the ability to work in a fast-paced, collaborative environment
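"Ensuring data integrity," listed above, usually comes down to concrete batch checks: required fields present, keys unique. A hedged sketch in plain Python for brevity (in the stack described, the same checks would run on Spark DataFrames; the field names are invented):

```python
# Illustrative data-integrity check for one ingested batch.
def check_batch(rows, required_fields, key_field):
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for f in required_fields:
            if row.get(f) in (None, ""):
                issues.append(f"row {i}: missing {f}")
        k = row.get(key_field)
        if k in seen:
            issues.append(f"row {i}: duplicate key {k!r}")
        seen.add(k)
    return issues

batch = [
    {"id": 1, "amount": 10.5},
    {"id": 1, "amount": None},   # duplicate id and missing amount
]
problems = check_batch(batch, required_fields=["id", "amount"], key_field="id")
print(problems)
```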

Posted 1 week ago

Apply

7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Who You'll Work With Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won’t find anywhere else. When you join us, you will have: Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey. A voice that matters: From day one, we value your ideas and contributions. You’ll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes. Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm’s diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you’ll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences. World-class benefits: On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package to enable holistic well-being for you and your family. 
Your Impact As a product leader, you will be responsible for shaping and driving product plans, ensuring alignment with business strategy, user needs, and market trends. You will define and execute comprehensive product strategies that align with client goals across brands, markets, and products, while continually improving existing offerings. Leveraging insights from customer and competitor research, you will prioritize and guide the development of innovative solutions, translating business needs into functional requirements. Your role will involve close collaboration with technology, support, and client service teams to manage product updates and deployments, as well as defining and tracking KPIs to identify opportunities for growth and improvement. Additionally, you will lead workshops and training sessions to showcase product features, ensuring both internal teams and customer users are equipped to maximize value. In this role, you will play a pivotal part in steering cross-functional teams, including product owners and engineering delivery teams, to design and deliver impactful solutions that address business challenges. You will build and maintain key relationships across the organization to deliver prioritized product roadmaps and partner with marketing, sales, and partner organizations to develop effective go-to-market strategies. Accountability for the growth and success of the products will be central to your responsibilities, as will fostering innovation and creativity. Based in McKinsey’s Periscope team in Gurgaon or Bangalore, India, you will contribute to the technology backbone of McKinsey’s Growth, Marketing & Sales Practice. Periscope combines world-class intellectual property, prescriptive analytics, and cloud-based tools to drive revenue growth and sustain commercial transformation for businesses globally. 
The Growth, Marketing & Sales Practice strives to help clients in both consumer and business-to-business environments on a wide variety of marketing and sales topics. Our clients benefit from our experience in core areas of sales and marketing such as sales and channel management, branding, customer insights, marketing ROI, digital marketing, CLM, and pricing. Our Practice offers an exceptional opportunity to work at the intersection of sales, marketing, and consulting. Focusing on issues like redefining sales and marketing operations and commercial transformation, our people help clients build capabilities and transform how companies go to market, moving them to customer-centric organizations. Periscope leverages its world-leading IP (largely from McKinsey but also other partners) and best-in-class technology to enable transparency into Big Data, create actionable insights, and new ways of working that drive lasting performance improvement, and typically sustain a 2-7% increase in return on sales (ROS). With a truly global reach, the portfolio of solutions is comprised of: Marketing Solutions, Customer Experience Solutions, Category Solutions, B2C Pricing Solutions, B2B Pricing Solutions, and Sales Solutions. These are complemented by ongoing client service and custom capability building programs. Periscope has a presence in 27 locations across 16 countries with a team of 800+ IT and business professionals and a network of 300+ experts.
To learn more about how Periscope’s solutions and experts are helping businesses continually drive better performance, visit www.mckinsey.com/periscope

Your Qualifications and Skills
- Bachelor’s degree in computer science, engineering, or a related field
- 7+ years of total experience in technical product or project management within a data platform or enterprise technology environment
- Hands-on experience managing data platform initiatives and leading agile delivery teams
- Proficiency in tools like Jira and Confluence for project planning and execution
- Strong understanding of cloud-native data technologies (Azure, AWS, or GCP; Spark; Delta Lake; Databricks preferred)
- Proven ability to manage cross-team collaboration and deliver operational efficiency
- Good understanding of user/business needs, structuring backlogs, prioritization, measuring success, and communicating the value of a product
- Software engineering knowledge
- Understanding of factors influencing price, and basic margin/cost calculations

Posted 1 week ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Level: Specialist / Developer
Experience: 4 to 6 years
Location: Bangalore, Chennai, Hyderabad, Noida
Job Title: AI/ML - Data Science in Test - 17464
Primary Skill: Programming Language - Python

About Us

Qualitest, The World’s Leading AI-Powered Quality Engineering Company

Qualitest is the world’s leading managed services provider of AI-led quality engineering solutions. It helps brands transition through the digital assurance journey and make the move from conventional functional testing to adopt innovations such as automation, AI, blockchain, and XR. Qualitest’s core mission is to mitigate business risks associated with digital adoption. It fulfills this through customized quality engineering solutions that leverage Qualitest’s deep, industry-specific knowledge for various sectors, including technology, telecommunications, finance, healthcare, media, utilities, retail, manufacturing, and defense. These scalable solutions protect brands through end-to-end value demonstration with a focus on customer experience and release velocity. A pioneer and innovator in its industry, Qualitest has been recognized in the highest Leader position in Everest Group's Quality Engineering Services for Mid-market Enterprises PEAK Matrix® Assessment 2024 report and has also been recognized as a Leader in The Forrester Wave™: Continuous Automation and Testing Services, Q2 2024 report. Qualitest has offices in the United States, United Kingdom, Israel, Romania, India, Mexico, Portugal, Switzerland, and Argentina.

Role & Responsibilities (Must Have)
- 5+ years of experience in data science in test (AI/ML/testing)
- R language experience is a must
- Ability to learn new AI/ML tools and use them effectively
- Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Demonstrated experience with data retrieval from various SDLC systems and analytic reporting databases
- Familiarity with databases and database query languages such as MySQL, Access, and SQL
- Experience with Python and other programming languages
- Ability to interface effectively with clients and work constructively on a team
- Excellent verbal and written communication skills
- Experience with cloud computing environments and data science tools such as AWS, SageMaker, GCP, and Databricks
- Experience with NLP algorithms such as LSA, LDA, and QNLI
- Experience with JIRA, Confluence, and GitHub
- Knowledge of software engineering and structured software development
- Demonstrated ability to perform statistical data analysis

Must-have skills: AI/ML, R language, Python, LLM

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Years of relevant experience: 5-7

Job Description: Key Responsibilities
- Data Management: Collect, analyze, and interpret large datasets to provide actionable insights.
- Data Solutions: Develop and manage advanced analytics solutions, including BI dashboard design, reports, and digital solutions.
- Stakeholder Management: Collaborate with stakeholders to understand business needs and translate them into technical requirements.
- Project Management: Lead and manage data and advanced analytics projects, ensuring timely delivery and alignment with business goals.
- Documentation: Create and maintain documentation, including design specifications and user manuals.
- Continuous Improvement: Identify opportunities for process improvements and recommend digital solutions.

Experience (Must Have):
- Experience with Databricks for big data processing and analytics.
- Strong skills in SQL for database querying, data modeling, and DWH (must).
- Develop design documentation by translating business requirements into source-to-target mappings (must).
- Must have experience in Power BI and Qlik Sense; a development background is an advantage.
- Experience with Azure services for data storage, processing, and analytics.
- Knowledge of data fabric architecture and implementation.
- Azure Data Factory (ADF): expertise in data integration and orchestration using ADF (advantage).
- Power Platform: proficiency in using Power Apps, Power Automate, and Power Virtual Agents to create and manage digital solutions.
- AI Technologies: knowledge of AI tools and frameworks to develop predictive models and automate data analysis.

3 must-haves: AI 4/5, DWH 4/5, Data management 3/5
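A source-to-target mapping, as required above, specifies for each target column which source column feeds it and what transformation applies. A hedged sketch expressing such a mapping as data and applying it to a record (the field names and transforms are invented for illustration; real mappings would live in STTM design documents and drive ADF/Databricks jobs):

```python
# Hypothetical source-to-target mapping: target column -> (source column, transform).
MAPPING = {
    "customer_name": ("cust_nm", str.strip),
    "signup_year":   ("signup_dt", lambda v: int(v[:4])),
    "country_code":  ("country", str.upper),
}

def apply_mapping(source_row, mapping):
    # Build the target record by pulling each source field through its transform.
    return {target: transform(source_row[src])
            for target, (src, transform) in mapping.items()}

src = {"cust_nm": "  Asha  ", "signup_dt": "2021-07-15", "country": "in"}
result = apply_mapping(src, MAPPING)
print(result)
```

Keeping the mapping as data rather than code makes the design document and the implementation reviewable against each other, one row per target column.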

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Crunchyroll

Founded by fans, Crunchyroll delivers the art and culture of anime to a passionate community. We super-serve over 100 million anime and manga fans across 200+ countries and territories, and help them connect with the stories and characters they crave. Whether that experience is online or in-person, streaming video, theatrical, games, merchandise, events and more, it’s powered by the anime content we all love. Join our team, and help us shape the future of anime!

About The Role

Crunchyroll is growing and changing, presenting unique challenges and opportunities to support millions of anime fans around the world. The Data Engineering team provides seamless help to our internal stakeholders, ensuring an exceptional experience for all Crunchyroll fans. We are seeking a Data Analyst with strong SQL skills and experience in dashboard development, data governance, and analytics. The ideal candidate will work closely with stakeholders across the organization to turn raw data into actionable insights, ensuring data accuracy, consistency, and compliance. You will play a key role in optimizing reporting frameworks, building data visualizations, and supporting data-driven decision-making.

Key Responsibilities:
- Write efficient, optimized SQL queries to extract, transform, and analyze data from relational databases (e.g., RDS, Snowflake, Databricks).
- Develop, maintain, and enhance interactive dashboards and reports using BI tools such as Tableau and Looker.
- Ensure data quality, integrity, and governance by implementing best practices in data validation, lineage tracking, and documentation.
- Collaborate with cross-functional teams to define key metrics, create standardized reporting frameworks, and improve data accessibility.
- Support ad-hoc analysis to answer critical business questions and drive strategic decision-making.
- Partner with data engineering teams to improve ETL/ELT pipelines, ensuring clean and structured data for analysis.
Monitor and analyze cloud and database usage costs, identifying opportunities to optimize query performance, storage, and infrastructure costs.
Work with stakeholders to understand business needs and translate them into data-driven insights and reports.
Implement data governance best practices to ensure compliance with privacy, security, and regulatory requirements.

About You

We get excited about candidates, like you, because...
Bachelor’s degree in Data Science, Statistics, Computer Science, Business Analytics, or a related field.
5+ years of experience in data analysis, business intelligence, or analytics roles.
Strong SQL skills with the ability to write complex queries, optimize performance, and work with large datasets.
Experience with dashboarding and BI tools (Tableau, Looker, Power BI, or Mode).
Understanding of data governance, data cataloging, and compliance best practices.
Familiarity with ETL/ELT workflows and experience working with data engineers on pipeline improvements.
Experience with cloud cost monitoring tools and optimizing query performance and storage costs.
Strong problem-solving and analytical skills, with a keen eye for data accuracy and visualization best practices.
Excellent communication skills, with the ability to present insights to both technical and non-technical stakeholders.

Good to Have:
Experience with cloud data platforms such as AWS, Snowflake, or Databricks.
Familiarity with Python for data analysis.
Experience working with data cataloging and metadata management tools (Alation, Databricks Unity Catalog).

About The Team

The Data Services team is focused on building robust, scalable, and efficient data services, pipelines, lakes, tools, libraries, and software components that drive best practices and empower service teams to succeed. We design and implement data services, pipelines and data lakes for operational analysis, ensuring they are production-ready with automation and standardized access controls.
We also provide data insights and tools to our users so they can analyze the data. We lead and evangelize the principle of 100% automation. Additionally, we establish best practices, provide training, and document data architectures to ensure our systems are reliable, future-ready, and capable of continuous evolution and process improvement across the organization.

About Our Values

We want to be everything for someone rather than something for everyone, and we do this by living and modeling our values in all that we do. We value:
Courage. We believe that when we overcome fear, we enable our best selves.
Curiosity. We are curious, which is the gateway to empathy, inclusion, and understanding.
Kaizen. We have a growth mindset committed to constant forward progress.
Service. We serve our community with humility, enabling joy and belonging for others.

Our commitment to diversity and inclusion

Our mission of helping people belong reflects our commitment to diversity & inclusion. It's just the way we do business. We are an equal opportunity employer and value diversity at Crunchyroll. Pursuant to applicable law, we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Crunchyroll, LLC is an independently operated joint venture between US-based Sony Pictures Entertainment and Japan's Aniplex, a subsidiary of Sony Music Entertainment (Japan) Inc., both subsidiaries of Tokyo-based Sony Group Corporation.

Questions about Crunchyroll’s hiring process? Please check out our Hiring FAQs: https://help.crunchyroll.com/hc/en-us/articles/360040471712-Crunchyroll-Hiring-FAQs
Please refer to our Candidate Privacy Policy for more information about how we process your personal information, and your data protection rights: https://tbcdn.talentbrew.com/company/22978/v1_0/docs/spe-jobs-privacy-policy-update-for-crpa-dec-21-22.pdf
Please beware of recent scams targeting online job seekers.
Those applying to our job openings will only be contacted directly from a @crunchyroll.com email account.
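As a concrete sketch of the SQL-for-dashboards work described in this role, the aggregate below computes a dashboard-style metric. The schema, table, and values are hypothetical, and Python's built-in sqlite3 stands in for RDS/Snowflake/Databricks:

```python
import sqlite3

# Hypothetical schema: one row per viewing event. A dashboard tile
# usually wants a pre-aggregated metric computed in SQL rather than
# in the BI tool.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE watch_events (title TEXT, day TEXT, minutes INTEGER);
    INSERT INTO watch_events VALUES
        ('Title A', '2024-01-01', 120),
        ('Title A', '2024-01-01', 60),
        ('Title B', '2024-01-01', 90);
""")

# GROUP BY pushes the aggregation into the database, so only the
# summarized rows travel to the dashboard layer.
rows = conn.execute("""
    SELECT title, day, SUM(minutes) AS total_minutes
    FROM watch_events
    GROUP BY title, day
    ORDER BY total_minutes DESC
""").fetchall()
print(rows[0])  # ('Title A', '2024-01-01', 180)
```

The same pattern applies on Snowflake or Databricks SQL; only the connector and dialect details change.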

Posted 1 week ago

Apply

0 years

0 Lacs

Greater Nashik Area

On-site

Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description

Job Title: Data Scientist – GBS Commercial
Location: Bengaluru
Reporting to: Manager - GBS Commercial

Business Environment

Purpose of the role
Use statistical and mathematical techniques to measure the impact of Promotions, Trade Programs, or Activations through Tests.
Maintain the Python code for the model which runs the Test vs Control methodology.
Conduct exploratory studies to identify opportunities to increase ROI for future promotions.
Optimize the sales package; check spends.
Enhance and maintain the Python code for the model which runs the Test vs Control strategy.
Design and deploy mathematical models to create store-level target lists to implement optimal-ROI promotions in the future.
Build recommendation models; add features on top of the model to create a robust algorithm.
Communicate analysis results to business stakeholders through intuitive Power BI solutions.
Set the framework for analysis based on the business objective; thorough with business-level presentations.
Build new AI & DS products, projects, and effective business presentations, and contribute to business target achievements.

Key tasks & accountabilities
Expertise in Python to prepare and update algorithms.
Developing business-friendly presentations, transforming thoughts into key actions for the business and showing the model to justify recommendations.
Expertise in Power BI to produce dashboards.
Understanding of SQL to connect Power BI to the Datalake and create automated outputs/dashboards.
Understanding the business problem and translating it into an analytical problem.
Ability to solve problems quickly and effectively, applying logical thinking and a creative mindset.
Handling large data sets to connect data points and provide insights that drive business.
Be assertive and goal-oriented and drive results with minimal support.
Skillset to convert unstructured data into structured format, which optimizes time spent on data handling.
Hands-on experience in ML concepts like forecasting, clustering, regression, classification, optimization, and deep learning.
Product-building experience would be a plus.
Interact with and present to senior internal or external stakeholders around project/service conceptualization and plan of delivery.
Ability to work in collaboration with multiple teams within the GBS Commercial tower.
Develop a deeper understanding of different concepts of Category Management.
Plan and implement different projects with multiple teams in Category Management.
Delivering insights and provocations as per given timelines.
Following processes and adhering to documentation goals.
High-quality presentations.
Questions from business stakeholders answered satisfactorily within an agreeable time.
Coming up with provocations, proposing what-if or next-level ideas.

Challenges: Primary challenges include various sources of data and incomplete data, which can throw off the analysis. Outliers must be removed and data cleaned before analyzing and presenting to stakeholders.
Qualifications, Experience, Skills

Level of Educational Attainment Required
Master's graduate in Business & Marketing, Engineering/Solution, or other equivalent degree, or equivalent work experience. MBA/Engg. in a relevant technical field such as Marketing.

Previous Work Experience Required
2-3+ years' experience handling data science projects.
Prior experience in managing multiple files, data cleaning, and maintaining data in structured formats.

Technical Skills Required
Proficient in Python, Power BI, SQL, VBA, Advanced Excel, MS Office.
Expert-level proficiency in Python (knowledge of writing end-to-end ML or data pipelines in Python).
Proficient in the application of AI & ML concepts and optimization techniques to solve end-to-end business problems.
Familiarity with the Azure tech stack, Databricks, and MLflow in any cloud platform.
Good understanding of statistical and mathematical concepts.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
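The Test vs Control methodology named above can be sketched in a few lines of Python; the store groups and sales figures below are invented for illustration:

```python
from statistics import mean

def promo_lift(test_sales, control_sales):
    """Percent lift of mean test-store sales over matched control stores."""
    t, c = mean(test_sales), mean(control_sales)
    return round((t - c) / c * 100, 1)

# Hypothetical weekly sales during the promotion window.
test_stores = [120, 135, 128, 117]    # stores that ran the promotion
control_stores = [100, 105, 95, 100]  # matched stores that did not

lift = promo_lift(test_stores, control_stores)
print(f"Lift: {lift}%")  # Lift: 25.0%
```

A production version would add matched-store selection and significance testing, but the lift calculation above is the core of the read-out to stakeholders.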

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Your Team Responsibilities

We are seeking an outstanding Software Engineer (Python + Database + Cloud) to join our ESG Application Development team in the Pune office. As part of a global team, the candidate will need to develop productive working relationships.

Your Key Responsibilities
Deliver new functionality for the Operations platform and for Nextgen applications for the ESG business.
Monitor and optimize application performance.
Collaborate closely with Product Management, Quality Assurance, Data Operations and IT Infrastructure at all stages of the software development life cycle.
Very good hands-on working experience in Python and FastAPI/Flask/Django or similar REST API frameworks.
Should have experience with Databricks, Spark, dbt, and Airflow.
Should have worked on at least one end-to-end development project from scratch.
Should be familiar with operational aspects of Python, like managing virtual environments and conda environments.
Should be familiar with deploying Python code as a containerized application, e.g., in a Docker container.
Should be familiar with dependency management in Python.
The candidate should have excellent problem-solving and debugging skills.
The candidate is expected to have excellent communication and good leadership skills.
Working experience in Azure/GCP cloud.

Your Skills And Experience That Will Help You Excel
Bachelor’s degree in computer science or related disciplines preferred.
5+ years of enterprise software product development experience.
Good hands-on working experience in Python.
Proficient in RESTful web services using FastAPI, Django or Flask.
Good understanding of SQL/NoSQL databases like Oracle/PostgreSQL, Azure Cosmos DB/MongoDB, etc.
Big data technologies like Databricks/PySpark.
Good understanding of unit testing frameworks like pytest/unittest; should be clear on TDD and BDD approaches.
Good knowledge of Azure and Azure-native libraries.
Familiarity with some ORM (Object Relational Mapper) libraries like SQLAlchemy will be a plus.
Experience working with Agile and DevOps processes and toolsets, including JIRA and Git.
Delivering on time and with quality; clean code and best quality standards/practices.
Excellent knowledge of OOPS concepts, software design and algorithms.
Great interpersonal skills.

About MSCI

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community.
With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
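To illustrate the pytest/unittest and TDD expectations in the skills list above, here is a minimal pytest-style unit test. The slugify function and its behavior are hypothetical, not part of any MSCI codebase:

```python
# TDD-style sketch: write the test first, then the smallest code that
# passes it. pytest discovers test_* functions automatically; plain
# assert statements serve as the checks.

def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Spaced   Out ") == "spaced-out"

test_slugify()  # under pytest this explicit call is unnecessary
```

The same test runs unchanged under `pytest` or can be wrapped in a `unittest.TestCase` when that framework is preferred.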

Posted 1 week ago

Apply

15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary

Director

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview
Mastercard is looking for a talented Software Engineering Director to join the Foundry New Product development team in our Pune location. In this role you will be leading a highly agile team building exciting and innovative products delivered at scale to global markets. Our team is built on a foundation of exploration and development of new products, mining innovation internally, developing new product lines with emerging technology, and managing new products from inception to market validation to shape the future of commerce with and for our customers. At Mastercard you will help define the future of commerce globally.
This team will have a diverse focus, both in terms of geography and the variety of technology challenges, driving hard to bring innovative payment solutions to market.

Role
Manage a team of highly skilled engineers and deliver software development projects using agile methodologies.
Define requirements for new applications and customizations, adhering to standards, processes and best practices.
Research, design and document solutions that can scale globally.
Responsible for the project management, reporting and execution of software projects.
Lead and/or take part in external and internal events.
Strong passion for innovation and new technologies.
Assess and contribute towards technology architecture and design.
Formally supervises and coaches a large group/team, or several Leaders/Consultants, and is responsible for business execution of goals and objectives.
Ensures own team's compliance with the goal-setting and performance-appraisal process.
Helps identify and coach top talent within own teams (includes direct reports and second-line reports).
Provides strategic leadership related to specific applications and systems, or software-development methodologies.
Creates and sustains an environment of ingenuity and creativity and challenges the status quo to encourage innovation.
Oversees the management of the Software Engineering function and acts as an authority on high-level and complex decisions within the function.

Experience / Required Qualifications:
Bachelor’s degree in Information Technology, Computer Science, or a related field.
15+ years of software development experience with a strong track record of delivering enterprise-grade solutions.
IT experience with demonstrated thought-leadership and functional influence and partnership, demonstrated by a successful track record of enabling the business through technical decisions.
Considered a thought leader and expert in Software Engineering, with experience in related IT disciplines.
Strong knowledge of the card ecosystem desired, including the Loyalty, Clearing, Fraud, Disputes, Issuer, Acquirer & Merchant domains.
Deep understanding of software engineering concepts, methodologies, and Agile/SAFe Agile practices.
Proven expertise in software architecture, design, and application development.
Strong communication skills (verbal and written) and the ability to quickly learn and apply new technologies and frameworks.
High energy, detail-oriented, proactive, and capable of working under pressure to meet deadlines.
Strong collaboration and organizational skills with a high degree of initiative and self-motivation.
Ability to work effectively in a matrixed, geographically distributed team environment.
Excellent problem-solving skills and a proactive approach.
Solid understanding of high-performing secure applications with a strong grasp of architecture, performance, and security principles.
Proficiency in Java-based systems and services, cloud technologies (Azure/AWS), AI/ML, Gen AI and microservices architecture.
Expertise in Spring, RESTful services, API design principles, and best practices.
Familiarity with data modeling, database design, data warehousing, Oracle, Redis, and reporting technologies.
Strong knowledge of Mastercard privacy by design.
Knowledge of enterprise-level application frameworks and tools.

Big Data & Analytics Expertise:
Experience designing and architecting high-volume data systems using technologies such as Hadoop, Snowflake, Databricks, and other modern data platforms.
Strong understanding of data pipelines, ETL/ELT processes, and distributed data processing frameworks (e.g., Spark, Hive).
Ability to build scalable, fault-tolerant data architectures that support real-time and batch analytics.
Familiarity with data governance, data quality, and security best practices in large-scale data environments.
All About You The ideal candidate for this position should have: Loves creating innovative products and technology solutions in a collaborative fun environment Advanced knowledge and understanding of modern software engineering concepts and methodologies is required. Strong leadership and people management skills. Strong analytical and problem-solving skills. Ability to quickly learn and implement new technologies, frameworks, and tools. Experience in product development and partnering with business teams to build the best solutions for our customers. Ability to support multiple concurrent activities and to interface with external / internal resources, working as a member of a geographically distributed project team. Strong communication skills -- both verbal and written. Strong relationship, collaboration skills and organizational skills Be skilled at explaining technical problems succinctly and clearly. Corporate Security Responsibility All Activities Involving Access To Mastercard Assets, Information, And Networks Comes With An Inherent Risk To The Organization And, Therefore, It Is Expected That Every Person Working For, Or On Behalf Of, Mastercard Is Responsible For Information Security And Must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-252594

Posted 1 week ago

Apply

7.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Demonstrated expertise in working with Databricks for data processing and analysis.
Experience in implementing and optimizing data solutions using multiple programming languages in an Azure environment.
7-14 years' experience in data modelling; hands-on experience in developing data pipelines and managing data infrastructure within Azure.
Python and SQL experience required.
Worked on end-to-end data product deployment.
Hands-on data engineering, data modelling, ADF, ADL, Python, PySpark.
Digital logic: it is important to possess this skill to clean and organize an unstructured set of data.
Computer architecture and organization: a solid understanding of computer architecture and organization will enable you to maximize efficiency when working with data.
Data representation: this allows for easier gathering, manipulation, and analysis of data, which can save valuable time and money.
Memory architecture: the most important part of memory architecture is being able to find the method that best combines speed, durability, reliability, and cost-effectiveness while not compromising the integrity of the data.
Familiarity with the Erwin modeling tool; ability to adapt to new modeling methods.
The SQL language and its implementations.
Sufficient experience using database systems: Relational Database Management Systems (RDBMS) that possess big-data handling capabilities, such as the ability to quickly store and fetch data.
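The "clean and organize an unstructured set of data" skill above can be sketched as a small parser that types matching lines into structured records; the log format and fields here are hypothetical:

```python
import re

# Hypothetical semi-structured input: one free-text line per reading.
raw_lines = [
    "2024-01-31 plant=CHN unit=7 temp=41.5",
    "2024-02-01 plant=HYD unit=3 temp=39.0",
    "bad line with no fields",
]

PATTERN = re.compile(
    r"(?P<day>\d{4}-\d{2}-\d{2}) plant=(?P<plant>\w+) "
    r"unit=(?P<unit>\d+) temp=(?P<temp>[\d.]+)"
)

def parse(lines):
    """Keep only lines matching the schema, typed into dict records."""
    records = []
    for line in lines:
        m = PATTERN.match(line)
        if m:
            rec = m.groupdict()
            rec["unit"] = int(rec["unit"])
            rec["temp"] = float(rec["temp"])
            records.append(rec)
    return records

records = parse(raw_lines)
print(len(records))  # 2
```

In a pipeline such as ADF or PySpark the same idea scales out, but the core step is identical: match, type, and reject what does not fit the schema.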

Posted 1 week ago

Apply

12.0 years

1 - 3 Lacs

Hyderābād

Remote

Overview: As Data Modelling Assoc Manager, you will be the key technical expert overseeing data modeling and will drive a strong vision for how data modelling can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data modelers who create data models for deployment in the Data Foundation layer, ingesting data from various source systems, resting data on the PepsiCo Data Lake, and enabling exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data modelling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will independently analyze project data needs, identify data storage and integration needs/issues, and drive opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will be a key technical expert performing all aspects of Data Modelling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. You will provide technical guidance to junior members of the team as and when needed. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse.
Responsibilities:
Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse or other cloud data warehousing technologies.
Governs data design/modeling: documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools.
Advocates existing Enterprise Data Design standards; assists in establishing and documenting new standards.
Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development.
Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
Partner with IT, data engineering and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy by design principles (PII management), all linked across fundamental identity foundations.
Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
Assist with data planning, sourcing, collection, profiling, and transformation.
Create Source To Target Mappings for ETL and BI developers.
Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit.
Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
Partner with the data science team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
Support data lineage and mapping of source system data to canonical data stores for research, analysis and productization.

Qualifications:
12+ years of overall technology experience that includes at least 6+ years of data modelling and systems architecture.
6+ years of experience with Data Lake Infrastructure, Data Warehousing, and Data Analytics tools.
6+ years of experience developing enterprise data models.
6+ years of cloud data engineering experience in at least one cloud (Azure, AWS, GCP).
6+ years of experience building solutions in the retail or supply chain space.
Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models).
Fluent with Azure cloud services; Azure certification is a plus.
Experience scaling and managing a team of 5+ data modelers.
Experience with integration of multi-cloud services with on-premises technologies.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
Experience with version control systems like GitHub and deployment & CI tools.
Experience with Azure Data Factory, Databricks and Azure Machine Learning is a plus.
Experience with metadata management, data lineage, and data glossaries is a plus.
Working knowledge of agile development, including DevOps and DataOps concepts.
Familiarity with business intelligence tools (such as Power BI).
Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
Proven track record of leading, mentoring, hiring and scaling data teams.
Strong change manager; comfortable with change, especially that which arises through company growth.
Ability to understand and translate business requirements into data and technical requirements.
High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
Foster a team culture of accountability, communication, and self-management.
Proactively drives impact and engagement while bringing others along.
Consistently attain/exceed individual and team goals.
Ability to lead others without direct authority in a matrixed environment.

Differentiating Competencies Required
Ability to work with virtual teams (remote work locations); lead a team of technical resources (employees and contractors) based in multiple locations across geographies.
Lead technical discussions, driving clarity of complex issues/requirements to build robust solutions.
Strong communication skills to meet with the business, understand sometimes-ambiguous needs, and translate them into clear, aligned requirements.
Able to work independently with business partners to understand requirements quickly, perform analysis and lead design review sessions.
Highly influential, with the ability to educate challenging stakeholders on the role of data and its purpose in the business.
Places the user in the centre of decision making.
Teams up and collaborates for speed, agility, and innovation.
Experience with, and embraces, agile methodologies.
Strong negotiation and decision-making skills.
Experience managing and working with globally distributed teams
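The Source To Target Mapping (STTM) deliverable named in the responsibilities above can be sketched as a mapping table that drives the rename-and-transform step of a load; the field names and transforms below are hypothetical:

```python
# Each source column maps to a target column plus a transform callable.
STTM = {
    "cust_nm": ("customer_name", str.strip),
    "ord_amt": ("order_amount", float),
    "ord_dt":  ("order_date", str),  # pass-through
}

def apply_sttm(source_row):
    """Project a source record onto the target schema defined by STTM."""
    return {
        target: transform(source_row[source])
        for source, (target, transform) in STTM.items()
    }

row = {"cust_nm": "  Acme Corp ", "ord_amt": "19.99", "ord_dt": "2024-01-31"}
print(apply_sttm(row))
# {'customer_name': 'Acme Corp', 'order_amount': 19.99, 'order_date': '2024-01-31'}
```

Keeping the mapping as data rather than code is what lets the same STTM document serve both the data modeler and the ETL/BI developers it is handed to.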

Posted 1 week ago

Apply

3.0 years

7 - 9 Lacs

Hyderābād

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Job Description & Summary: Design and implement data solutions and data models using Azure Data Lake to support Data Warehouse, Data Lake, and Lakehouse architectures, ensuring seamless integration with Azure Fabric services. Develop and manage data ingestion pipelines for both batch and streaming data using Azure Data Fabric to ensure efficient and reliable data flow.
Responsibilities:
- Experience with Apache Spark / PySpark for data processing and optimization within Azure Fabric environments.
- Apply data governance best practices using Azure Purview, including metadata management, data cataloging, and lineage tracking, to ensure compliance and effective data management within Azure Fabric ecosystems.
- Utilize Azure Fabric's Toolbox and Metadata-Driven Ingestion & Processing accelerators to enhance data processing workflows and improve efficiency.
Mandatory skill sets:
- Perform data migration from legacy databases or other cloud platforms to Azure Fabric, leveraging Azure Migrate and other Azure-native migration tools.
- Collaborate with source system owners to integrate data from multiple source databases, using Azure Fabric's data integration capabilities to ensure seamless data consolidation.
Preferred skill sets:
- 3+ years of experience in data engineering, with hands-on experience in Azure Fabric.
- Good understanding of Lakehouse architecture, OneLake, and Microsoft Fabric components.
- Strong expertise in Spark/PySpark, Azure SQL-based solutions, Azure Data Factory, and Azure Databricks.
- Strong experience in data migration strategies involving legacy or cloud-native data sources.
Years of experience required: 5 to 10 years of experience.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Fabric Design Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
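The PwC posting above emphasizes metadata-driven ingestion and processing accelerators. As a rough sketch of that pattern in plain Python (the Fabric accelerators themselves are proprietary, so the source names, paths, and loader functions below are all hypothetical placeholders, not real Fabric APIs):

```python
# Illustrative sketch of a metadata-driven ingestion dispatcher: a metadata
# table describes each source, and a dispatcher routes it to the right loader.
# All names, modes, and paths here are invented for the example.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SourceConfig:
    name: str   # logical source name
    mode: str   # "batch" or "streaming"
    path: str   # input location, e.g. a lake folder

def load_batch(cfg: SourceConfig) -> str:
    # Placeholder for a real batch read/write (e.g. a Spark read -> Delta write)
    return f"batch-loaded:{cfg.name}"

def load_streaming(cfg: SourceConfig) -> str:
    # Placeholder for starting a real streaming read on event data
    return f"stream-started:{cfg.name}"

LOADERS: Dict[str, Callable[[SourceConfig], str]] = {
    "batch": load_batch,
    "streaming": load_streaming,
}

def run_pipeline(metadata: List[SourceConfig]) -> List[str]:
    """Dispatch each configured source to the loader its metadata names."""
    return [LOADERS[cfg.mode](cfg) for cfg in metadata]

metadata = [
    SourceConfig("orders", "batch", "/lake/raw/orders"),
    SourceConfig("clicks", "streaming", "/lake/raw/clicks"),
]
results = run_pipeline(metadata)
```

The point of the pattern is that adding a new source means adding a metadata row, not writing a new pipeline.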

Posted 1 week ago

Apply

8.0 years

8 - 10 Lacs

Hyderābād

On-site

FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients’ needs and exceeding their expectations.
Lead Software Engineer - Full Stack (Python & VueJS/ReactJS)
Group Description
The Data Solutions - Platforms and Environments department manages an industry-leading content delivery platform. Clients seamlessly access organized and connected content that is easily discoverable, explorable, and procured via the FactSet Marketplace. Data is delivered via a variety of technologies and formats that meet the needs of our clients’ workflows. By enabling our clients to utilize their preferred choice of industry-standard databases, programming languages, and data visualization tools, we empower them to focus on the core competencies needed to drive their business. The Data Solutions - Platforms and Environments solutions portfolio includes Standard DataFeed, Data Exploration, OnDemand (API), Views, Cornerstone, Exchange DataFeed, Benchmark Feeds, the Open:FactSet Marketplace, DataDictionary, Navigator, and other non-workstation initiatives.
Job Description
The Data Solutions - Platforms and Environments team is looking for a talented, highly motivated Lead Software Engineer (Full Stack) to join our Platforms and Environments Development team, an important part of one of FactSet’s highest-profile and most strategic areas of investment and development.
As the Full Stack Lead Software Engineer, you will design and develop applications including UI, API, database frameworks, and data engineering pipelines; help implement improvements to existing pipelines and infrastructure; and provide production support. You will collaborate closely with Product Developers/Business Analysts to capture technical requirements. FactSet is happy to set up an information session with an engineer working on this product to talk about the product, the team, and the interview process.
What You'll Do
- Architect new components and application features for client-facing applications as a Full Stack Developer
- Maintain and resolve bugs in existing components
- Contribute new features, fixes, and refactors to the existing code
- Perform code reviews and coach engineers on best practices
- Work with other engineers following the test-driven methodology in an agile environment
- Collaborate with other engineers and Product Developers in a Scrum Agile environment using Jira and Confluence
- Work as part of a geographically diverse team
- Create and review documentation and test plans
- Estimate task sizes and regularly communicate progress in daily standups and biweekly Scrum meetings
- Coordinate with other teams across offices and departments
What We're Looking For
- Master's or bachelor’s degree in engineering or a relevant field required
- 8+ years of relevant experience
- Experience architecting distributed engineering applications and data pipelines
- Expert-level proficiency in writing and optimizing code in Python
- Proficiency in frontend technologies such as Vue.js (preferred) or ReactJS, with experience in JavaScript, CSS, and HTML
- Expert knowledge of REST API development, preferably Python Flask and OpenAPI
- Working knowledge of relational databases, preferably MSSQL or Postgres
- Knowledge of Generative AI and vector databases is a huge plus
- Good understanding of general database design and architecture principles
- High-level knowledge of data lakehouses such as Snowflake and Databricks is a plus
- A realistic, pragmatic approach; can deliver functional prototypes that can be enhanced and optimized in later phases
- Strong written and verbal communication skills
- Working experience with AWS services: Lambda, EC2, S3, AWS Glue, etc.
- Strong working experience with any container/PaaS technology (Docker or Heroku)
- ETL and data pipeline experience a plus
- Working experience with Apache Spark, Apache Airflow, or GraphQL is a plus
- Experience developing event-driven distributed serverless infrastructure (AWS Lambda, SNS/SQS) is a plus
- Must be a voracious learner
What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.
Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.
Company Overview:
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.
At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.

Posted 1 week ago

Apply

8.0 - 10.0 years

6 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Principal IS Bus Sys Analyst, Neural Nexus
What you will do
Let’s do this. Let’s change the world. In this vital role you will support the delivery of emerging AI/ML capabilities within the Commercial organization as a leader in Amgen's Neural Nexus program. We seek a technology leader with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders. Are you interested in building a team that consistently delivers business value in an agile model using technologies such as AWS, Databricks, Airflow, and Tableau? Come join our team!
Roles & Responsibilities:
- Establish an effective engagement model to collaborate with the Commercial Data & Analytics (CD&A) team to help realize business value through the application of commercial data and emerging AI/ML technologies.
- Serve as the technology product owner for the launch and growth of the Neural Nexus product teams focused on data connectivity, predictive modeling, and fast-cycle value delivery for commercial teams.
- Lead and mentor junior team members to deliver on the needs of the business
- Interact with business clients and technology management to create technology roadmaps, build business cases, and drive DevOps to achieve the roadmaps
- Help to mature Agile operating principles through deployment of creative and consistent practices for user story development, robust testing and quality oversight, and a focus on user experience
- Become the subject matter expert in emerging technology capabilities by researching and implementing new tools and features, and internal and external methodologies
- Build expertise and domain expertise in a wide variety of Commercial data domains
- Provide input for governance discussions and help prepare materials to support executive alignment on technology strategy and investment
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master’s degree with 8 - 10 years of experience in Information Systems OR
- Bachelor’s degree with 10 - 14 years of experience in Information Systems OR
- Diploma with 14 - 18 years of experience in Information Systems
- Excellent problem-solving skills and a passion for tackling complex challenges in data and analytics with technology
- Experience leading data and analytics teams in a Scaled Agile Framework (SAFe)
- Good interpersonal skills, good attention to detail, and ability to influence based on data and business value
- Ability to build compelling business cases with accurate cost and effort estimations
- Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
- Ability to explain sophisticated technical concepts to non-technical clients
- Good understanding of sales and incentive compensation value streams
Technical Skills:
- ETL tools: experience with Databricks, Redshift, or an equivalent cloud-based database
- Big Data, analytics, reporting, data lake, and data integration technologies
- S3 or an equivalent storage system
- AWS (or similar cloud-based platforms)
- BI tools (Tableau and Power BI preferred)
Preferred Qualifications:
- Jira Align & Confluence experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
- Understanding of software systems strategy, governance, and infrastructure
- Experience in managing product features for PI planning and developing product roadmaps and user journeys
- Familiarity with low-code/no-code test automation software
- Technical thought leadership
Soft Skills:
- Able to work effectively across multiple geographies (primarily India, Portugal, and the United States) under minimal supervision
- Demonstrated proficiency in written and verbal communication in English
- Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively
- Intellectual curiosity and the ability to question partners across functions
- Ability to prioritize successfully based on business value
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully across virtual teams
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.
careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

0 years

6 - 10 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
Let’s do this. Let’s change the world. In this vital role you will play a key role in successfully leading the engagement model between Amgen's Technology organization and Global Commercial Operations.
- Collaborate with G&A (Finance, HR, Legal, IT, etc.) business SMEs, Data Engineers, Data Scientists, and Product Managers to lead business analysis activities, ensuring alignment with engineering and product goals on the Data & AI Product Teams
- Become a G&A (Finance, HR, Legal, IT, etc.) domain authority in Data & AI technology capabilities by researching, deploying, and sustaining features built according to Amgen’s Quality System
- Lead the voice-of-the-customer assessment to define business processes and product needs
- Work with Product Managers and customers to define scope and value for new developments
- Collaborate with Engineering and Product Management to prioritize release scopes and refine the Product backlog
- Ensure non-functional requirements are included and prioritized in the Product and Release Backlogs
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories, and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and partners understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well defined
- Work closely with business SMEs, Data Scientists, and ML Engineers to understand data product requirements, KPIs, etc.
- Analyze the source systems and create the STTM (source-to-target mapping) documents
- Develop and implement effective product demonstrations for internal and external partners
- Maintain accurate documentation of configurations, processes, and changes
- Understand end-to-end data pipeline design and dataflow
- Apply knowledge of data structures to diagnose data issues for resolution by the data engineering team
What we expect of you
We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Principal IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders, with these qualifications.
Basic Qualifications:
- 12 to 17 years of experience in G&A (Finance, HR, Legal, IT, etc.) Information Systems
- Mandatory work experience as a business analyst in DWH, data product building, and BI & analytics applications
- Experience analyzing the requirements of BI, AI & analytics applications, and working with data source SMEs and Data Owners to identify data sources and data flows
- Experience with writing user requirements and acceptance criteria
- Affinity for working in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable
Preferred Qualifications:
Must-Have Skills:
- Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
- Experience in managing product features for PI planning and developing product roadmaps and user journeys
Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership
- Able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets)
- AWS (or similar cloud-based platforms)
- Experience with design patterns, data structures, and test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.
careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

3.0 years

8 - 10 Lacs

Hyderābād

On-site

FactSet creates flexible, open data and software solutions for over 200,000 investment professionals worldwide, providing instant access to financial data and analytics that investors use to make crucial decisions. At FactSet, our values are the foundation of everything we do. They express how we act and operate, serve as a compass in our decision-making, and play a big role in how we treat each other, our clients, and our communities. We believe that the best ideas can come from anyone, anywhere, at any time, and that curiosity is the key to anticipating our clients’ needs and exceeding their expectations.
Your Team Impact
FactSet’s Data Solutions organization is seeking a motivated Software Engineer III to join the Usage and Consumption Pricing initiative, an exciting and growing area at the company. This team is responsible for building vital tools to understand client usage, instrumentation of critical applications and infrastructure to source usage data, and pricing calculators that create quotes based on metered usage. This individual must be adaptable to learning and working with a wide breadth of technologies, ranging from Big Data ETL and analytics to traditional on-premise Linux server programming.
What You'll Do
- Collaborate with internal engineering groups to collect usage
- Collaborate with internal sales and strategy groups to report usage
- Apply multiple styles of testing techniques to deliver reliable software and data to our clients and stakeholders
- Follow best practices for runtime, on-call support, and deployment procedures
- Engage with cross-functional peers on a Scrum/Agile team and communicate with stakeholders regarding demos of delivered projects, status updates, challenges, and obstacles
What We're Looking For
- Bachelor’s degree or equivalent in Computer Science or a related field
- At least 3 years of experience as a Software Engineer
- Solid understanding of systems design, data structures, and algorithms
- A realistic, pragmatic approach, encouraging prototyping and iterative development
- Relational database experience such as MSSQL or PostgreSQL
- Experience with AWS technologies such as S3, Lambda functions, ECS Fargate, EC2, etc.
- Experience with Python, Java, and REST APIs
- Strong written and verbal communication skills
Desired Skills
- Familiarity with infrastructure as code (IaC), especially Terraform
- Experience with Big Data products and technologies such as Databricks, Snowflake, Athena, ETL pipelines, etc.
- Experience working with a Scrum/Agile team
- Experience with C++ and Linux
What's In It For You
At FactSet, our people are our greatest asset, and our culture is our biggest competitive advantage. Being a FactSetter means:
- The opportunity to join an S&P 500 company with over 45 years of sustainable growth powered by the entrepreneurial spirit of a start-up.
- Support for your total well-being. This includes health, life, and disability insurance, as well as retirement savings plans and a discounted employee stock purchase program, plus paid time off for holidays, family leave, and company-wide wellness days.
- Flexible work accommodations. We value work/life harmony and offer our employees a range of accommodations to help them achieve success both at work and in their personal lives.
- A global community dedicated to volunteerism and sustainability, where collaboration is always encouraged, and individuality drives solutions.
- Career progression planning with dedicated time each month for learning and development.
- Business Resource Groups open to all employees that serve as a catalyst for connection, growth, and belonging.
Salary is just one component of our compensation package and is based on several factors including but not limited to education, work experience, and certifications.
Company Overview:
FactSet (NYSE:FDS | NASDAQ:FDS) helps the financial community to see more, think bigger, and work better. Our digital platform and enterprise solutions deliver financial data, analytics, and open technology to more than 8,200 global clients, including over 200,000 individual users. Clients across the buy-side and sell-side, as well as wealth managers, private equity firms, and corporations, achieve more every day with our comprehensive and connected content, flexible next-generation workflow solutions, and client-centric specialized support. As a member of the S&P 500, we are committed to sustainable growth and have been recognized among the Best Places to Work in 2023 by Glassdoor as a Glassdoor Employees’ Choice Award winner. Learn more at www.factset.com and follow us on X and LinkedIn.
At FactSet, we celebrate difference of thought, experience, and perspective. Qualified applicants will be considered for employment without regard to characteristics protected by law.
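The Usage and Consumption Pricing team described above builds pricing calculators that create quotes from metered usage. A minimal sketch of how tiered metered pricing typically works (the tier boundaries and rates below are invented for the example, not FactSet's pricing):

```python
# Illustrative tiered usage-pricing quote. Tiers and rates are hypothetical;
# real pricing logic would come from the business, not this sketch.
from typing import List, Tuple

# (units covered by the tier, price per unit in that tier)
TIERS: List[Tuple[float, float]] = [
    (1_000, 0.10),   # first 1,000 units at $0.10 each
    (9_000, 0.05),   # next 9,000 units at $0.05 each
]
OVERAGE_RATE = 0.01  # every unit beyond the defined tiers

def quote(units: float) -> float:
    """Walk the tiers, charging each band of usage at its own rate."""
    total, remaining = 0.0, units
    for tier_units, rate in TIERS:
        used = min(remaining, tier_units)
        total += used * rate
        remaining -= used
        if remaining <= 0:
            break
    total += max(remaining, 0) * OVERAGE_RATE  # overage beyond all tiers
    return round(total, 2)
```

For example, 5,000 units price as 1,000 × $0.10 plus 4,000 × $0.05, i.e. $300.00; only usage beyond 10,000 units falls to the overage rate.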

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Data Engineer
What you will do
Let’s do this. Let’s change the world. In this vital role you will be a key contributor to the Clinical Trial Data & Analytics (CTDA) Team, driving the development of robust data pipelines and platforms to enable advanced analytics and decision-making. Operating within a SAFe Agile product team, this role ensures system performance, minimizes downtime through automation, and supports the creation of actionable insights from clinical trial data. Collaborating with product owners, architects, and engineers, the Data Engineer will implement and enhance analytics capabilities. Ideal candidates are diligent professionals with strong technical skills, a problem-solving approach, and a passion for advancing clinical operations through data engineering and analytics.
Roles & Responsibilities:
- Proficiency in developing interactive dashboards and visualizations using Spotfire, Power BI, and Tableau to provide actionable insights.
- Expertise in creating dynamic reports and visualizations that support data-driven decision-making and meet collaborator requirements.
- Ability to analyze complex datasets and translate them into meaningful KPIs, metrics, and trends.
- Strong knowledge of data visualization standard methodologies, including user-centric design, accessibility, and responsiveness.
- Experience in integrating data from multiple sources (databases, APIs, data warehouses) into visualizations.
- Skilled in performance tuning of dashboards and reports to optimize responsiveness and usability.
- Ability to work with end users to define reporting requirements, develop prototypes, and implement final solutions.
- Familiarity with integrating real-time and predictive analytics within dashboards to enhance forecasting capabilities.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree / Bachelor's degree and 5 to 9 years of experience in Computer Science/IT or a related field
Must-Have Skills:
- Proven hands-on experience with cloud platforms such as AWS, Azure, and GCP.
- Proficiency in Python, PySpark, and SQL, with practical experience in ETL performance tuning.
- Development knowledge in Databricks.
- Strong analytical and problem-solving skills to tackle complex data challenges, with expertise in analytical tools like Spotfire, Power BI, and Tableau.
Preferred Qualifications:

Good-to-Have Skills:

- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
- Familiarity with SQL/NoSQL databases and vector databases for large language models
- Familiarity with prompt engineering and model fine-tuning

Professional Certifications:

- AWS Certified Data Engineer (preferred)
- Databricks Certification (preferred)
- Any SAFe Agile certification (preferred)

Soft Skills:

- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Work Mode: Hybrid
Work Location: Chennai / Hyderabad / Bangalore / Pune / Mumbai / Gurgaon
Work Timing: 2 PM to 11 PM
Primary Skill: Data Engineer

AWS Data Engineer - AWS Glue, Amazon Redshift, S3, ETL processes, SQL, Databricks

Job Description:

- Examine business needs to determine the testing technique for automation testing.
- Maintain present regression suites and test scripts.
- Attend agile meetings for backlog refinement, sprint planning, and daily scrum meetings.
- Execute regression suites for better results.
- Provide results to developers, project managers, stakeholders, and manual testers.

Responsibilities (AWS Data Engineer):

- Design and build scalable data pipelines using AWS services like AWS Glue, Amazon Redshift, and S3.
- Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes.
- Create and manage applications using Python, SQL, Databricks, and various AWS technologies.
- Automate repetitive tasks and build reusable frameworks to improve efficiency.
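As a loose sketch of the "reusable frameworks" idea in the responsibilities above, an ETL job can be expressed as composable extract/transform/load stages so the same runner is shared across pipelines. All names and data here are invented for illustration; a real version would run inside an AWS Glue job and read from S3 or Redshift rather than in-memory lists.

```python
from typing import Callable, Iterable

# A tiny, generic pipeline runner: each stage is a plain function,
# so the same runner can be reused across many jobs.
def run_pipeline(extract: Callable[[], Iterable[dict]],
                 transforms: list,
                 load: Callable[[list], None]) -> None:
    rows = []
    for row in extract():
        for transform in transforms:
            row = transform(row)
        rows.append(row)
    load(rows)

# Example stages (invented data; in Glue these would read/write S3 or Redshift).
def extract() -> Iterable[dict]:
    yield {"order_id": 1, "amount": "19.99"}
    yield {"order_id": 2, "amount": "5.00"}

def cast_amount(row: dict) -> dict:
    # Transform step: cast the string amount to a float.
    return {**row, "amount": float(row["amount"])}

sink: list = []
run_pipeline(extract, [cast_amount], sink.extend)
print(sink)  # two rows with amount cast to float
```

Keeping each stage a plain function makes the individual steps unit-testable in isolation, which matters given the testing responsibilities listed in this role.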

Posted 1 week ago

Apply

0 years

3 - 9 Lacs

Hyderābād

On-site

We are seeking a skilled Agentic AI Developer to design and implement intelligent agent systems powered by Large Language Models (LLMs). This role involves developing LLM-based pipelines that can ingest transcripts, documents, or business narratives and generate structured artifacts such as workflows, decision trees, action plans, or contextual recommendations. You will collaborate with cross-functional teams to deploy autonomous AI agents capable of reasoning, planning, memory, and tool usage in enterprise environments, primarily within the Microsoft ecosystem (Azure, Power Platform, Copilot, and M365 integrations).

Key Responsibilities:

- Build and deploy autonomous agent systems using frameworks such as LangChain, AutoGen, CrewAI, or Semantic Kernel.
- Develop pipelines to process natural language input and generate structured outputs tailored to business needs.
- Implement agentic features such as task orchestration, memory storage, tool integration, and feedback loops.
- Fine-tune LLMs or apply prompt engineering to optimize accuracy, explainability, and responsiveness.
- Integrate agents with Microsoft 365 services (Teams, Outlook, SharePoint) and Power Platform components (Dataverse, Power Automate).
- Collaborate with business and product teams to define use cases, test scenarios, and performance benchmarks.
- Participate in scenario-based UAT testing, risk evaluation, and continuous optimization.

Must-Have Skills:

- Proficiency in Python and hands-on experience with ML/AI libraries and frameworks (Transformers, PyTorch, LangChain).
- Strong understanding of LLMs (e.g., GPT, Claude, LLaMA, Mistral) and prompt engineering principles.
- Experience developing agent workflows using ReAct, AutoGen, CrewAI, or OpenAI function calling.
- Familiarity with vector databases (FAISS, Pinecone, Qdrant) and RAG-based architectures.
- Skills in Natural Language Processing (NLP): summarization, entity recognition, intent classification.
- Integration experience with APIs, SDKs, and enterprise tools (preferably the Microsoft stack).

Preferred Certifications (candidates with the following certifications will have a strong advantage):

- Microsoft Certified: Azure AI Engineer Associate (AI-102)
- Microsoft Certified: Power Platform App Maker (PL-100)
- Microsoft 365 Certified: Developer Associate (MS-600)
- OpenAI Developer Certifications or Prompt Engineering Badge
- Google Cloud Certified: Professional Machine Learning Engineer
- NVIDIA Deep Learning Institute Certifications
- Databricks Generative AI Pathway (optional)
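The "agent workflows using ReAct" skill mentioned in this posting can be sketched framework-free as a minimal reason-act loop: the model proposes an action, the runtime executes the matching tool, and the observation is fed back until the model emits a final answer. Here the model is a canned stub and the tool name and question are invented for illustration; a real system would call an LLM and register genuine tools.

```python
# Minimal ReAct-style loop with a stubbed "model" (no LLM is called here).
def stub_model(history: list) -> str:
    # Canned policy: look up data first, then answer once an observation exists.
    if not any(line.startswith("Observation:") for line in history):
        return "Action: lookup_sites"
    return "Final Answer: 2 active sites"

TOOLS = {"lookup_sites": lambda: "2 sites are active"}  # invented tool

def run_agent(model, tools, max_steps: int = 5) -> str:
    history = ["Question: how many sites are active?"]
    for _ in range(max_steps):
        step = model(history)
        if step.startswith("Final Answer:"):
            return step.removeprefix("Final Answer:").strip()
        tool_name = step.removeprefix("Action:").strip()
        history.append(step)
        history.append(f"Observation: {tools[tool_name]()}")
    raise RuntimeError("agent did not finish")

print(run_agent(stub_model, TOOLS))  # → 2 active sites
```

Frameworks such as LangChain or AutoGen wrap exactly this kind of loop, adding prompt templates, memory, and tool schemas on top.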

Posted 1 week ago

Apply