7.0 - 10.0 years
20 - 25 Lacs
Kolkata
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Requirement gathering and analysis.
- Design of the data architecture and data model to ingest data.
- Experience with different databases such as Synapse, SQL DB, and Snowflake.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse.
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases.
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage.
- Implement data security and governance measures.
- Monitor and optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data engineering issues.
- Hands-on experience with Azure Functions and related components such as real-time streaming.
- Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations.
- Provide optimized solutions for any problem related to data engineering.
- Ability to work with a variety of sources such as relational databases, APIs, file systems, real-time streams, and CDC.
- Strong knowledge of Databricks and Delta tables.

Mandatory skill sets: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling
Preferred skill sets: PySpark, Databricks
Years of experience required: 7-10 years
Education qualification: B.Tech/MCA and MBA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
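For a concrete flavour of the pipeline work this posting describes, here is a minimal PySpark sketch of one ingest step on the Azure stack: raw files land in ADLS, get a light transform, and are written out as a Delta table. The storage paths and column names are hypothetical placeholders, and a real pipeline would add schema enforcement, incremental loads, and monitoring.

```python
# Minimal sketch of a Databricks-style ingest step (illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Read raw CSVs from a hypothetical ADLS container.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Light cleanup: typed columns, deduplicated business keys.
cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"]))

# Persist as a Delta table for downstream Synapse / Power BI consumers.
(cleaned.write
 .format("delta")
 .mode("overwrite")
 .save("abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"))
```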
Posted 1 week ago
6.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Get to Know the Team: At Grabber Technology Solutions (GTS), we revolutionise the technology experience for every Grabber. Our mission is to empower our team with seamless solutions that enhance their daily work. We are a diverse group of forward-thinkers committed to creating personalised IT experiences. If you're passionate about customer-centric innovation and technology at Grab, come join us and help shape the future of technology!

Get to Know the Role: We are looking for an experienced Senior Specialist, Configuration Management to drive the accuracy, integrity, and strategic value of our Configuration Management Database (CMDB). This important individual contributor role will be the primary owner and performer of CMDB operations, ensuring it serves as the definitive source of truth for our IT landscape. You understand configuration management mechanics, including the seamless integration of hardware and software assets within the CMDB framework. You will report to Manager II, IT Service Transition. This role is based in Bangalore.

The Critical Tasks You Will Perform:
- Own and maintain the Configuration Management Database (CMDB), ensuring accuracy and completeness by collaborating with cross-functional teams on Configuration Item (CI) identification, documentation, and lifecycle management.
- Lead and evolve Software Asset Management (SAM) processes, defining inclusive policies, tools, and procedures for licence tracking, compliance, usage, and optimisation.
- Identify and implement opportunities to streamline and automate Configuration Management processes within the ITSM platform, ensuring seamless integration with core ITSM functions like Change, Incident, Problem, and Release Management.
- Generate regular reports and KPIs, conduct configuration audits, and support risk assessments to address discrepancies and ensure compliance.
- Provide expert support for Change Management processes, contributing to accurate and collaborative impact assessments for changes affecting configurations.
- Stay current with industry trends and emerging technologies, recommending strategic process and tool improvements to enhance Configuration and Asset Management practices.

What Essential Skills You Will Need:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6 to 9 years of hands-on experience in IT Operations, Service Management, or Configuration Management roles.
- Deep, hands-on expertise in configuration management principles and practices, including CMDB data modelling, CI lifecycle, relationships, and data quality.
- Track record in defining and implementing Hardware Asset Management (HAM) and Software Asset Management (SAM) processes, policies, and tools.
- Hands-on experience with automated discovery and reconciliation tools and integrating data from multiple IT systems.
- Demonstrated experience defining and generating reports on KPIs and building data visualisations.

Good to have:
- ITIL Expert (v3/v4) certification
- COBIT 5 Foundation certification
- Lean/Six Sigma certification

About Grab and Our Workplace: Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. At Grab, purpose gives us joy and habits build excellence, while we harness the power of technology and AI to deliver our mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility.
Life at Grab: We care about your well-being at Grab. Here are some of the global benefits we offer:
- We have your back with Term Life Insurance and comprehensive Medical Insurance.
- With GrabFlex, create a benefits package that suits your needs and aspirations.
- Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave.
- We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges.

What We Stand For at Grab: We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
Posted 1 week ago
8.0 - 12.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Title: Lead, Analytics & Operations Strategy

About Skyhigh Security: Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company. Skyhigh Security is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program, to our Blast Talks learning series, and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self. Follow us on LinkedIn and Twitter @SkyhighSecurity.

Role Overview: Are you a data expert who sees beyond the numbers to the story they tell? Do you thrive on transforming complex data into strategic insights that drive business decisions? We are looking for an Analytics & Operations Strategy Lead to join our team and become a pivotal voice in shaping our company's direction. You will be instrumental in driving our data-driven decision-making and operational excellence. You'll be responsible for unifying our analytics and operations efforts, fostering cross-functional collaboration, and developing scalable solutions that impact the entire organization.

What You'll Do:
- Tell Stories with Data: Transform complex data into clear, compelling narratives that inform business strategy and drive action. Develop and present insightful reports, dashboards, and presentations to leadership and various teams.
- Automate and Scale Analytics & Operations: Design, build, and maintain robust and scalable analytics solutions. You will champion the automation of processes, implement scalable solutions, and empower stakeholders with self-service access to critical data.
- Drive Strategic Alignment: Act as a critical thought partner to cross-functional teams, including Product, Marketing, Sales, and Engineering. You will use your analytical expertise to understand their challenges, identify opportunities, and build consensus on strategic initiatives.
- Mentor and Lead Junior Team Members: Provide guidance, mentorship, and support to junior analysts and operations specialists. Foster a culture of continuous learning, professional development, and high performance within the team.
- Build Trust in Our Data: Take ownership of our data quality and integrity. You will be a key player in developing and implementing data governance best practices, ensuring our datasets are accurate, reliable, and trusted as the single source of truth.
- Deep Dive Analysis: Conduct sophisticated exploratory analysis to identify key business trends, challenges, and opportunities. Your work will form the foundation of our strategic planning and decision-making processes.
Qualifications:
- 8 to 12 years of experience in data analytics, business intelligence, and operations roles, with a proven track record of driving impact.
- Bachelor's degree in a quantitative field (e.g., Business Analytics, Computer Science, Statistics, Economics, Engineering) or equivalent practical experience. Master's degree preferred.
- Strong proficiency in data visualization tools (e.g., Tableau, Power BI, Looker) and advanced Excel.
- Proven experience in process automation and building scalable solutions.
- Excellent communication, presentation, and interpersonal skills, with the ability to influence and collaborate effectively across all levels of the organization.
- Demonstrated leadership abilities, including mentoring and developing team members.
- Strong strategic thinking and problem-solving skills, with the ability to prioritize and manage multiple initiatives simultaneously.

Preferred Qualifications:
- Familiarity with project management methodologies (e.g., Agile, Scrum).
- Familiarity with advanced statistical techniques and their business applications.
- Experience in Cybersecurity and/or SaaS.

Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees:
- Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement

We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.
Posted 1 week ago
7.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Experience in data warehousing, solution design, and data analytics.
- Experience in data modelling exercises such as dimensional modelling and data vault modelling.
- Understand, interpret, and clarify functional as well as technical requirements.
- Understand the overall system landscape, including upstream and downstream systems.
- Understand ETL technical specifications and develop code efficiently.
- Ability to leverage Informatica Cloud features/functions to achieve the best results.
- Hands-on experience in performance tuning and pushdown optimization in IICS.
- Provide mentorship on debugging and problem-solving.
- Review and optimize ETL technical specifications and code developed by the team.
- Ensure alignment with the overall system architecture and data flow.
Mandatory skill sets: Data modelling, IICS or any leading ETL tool, SQL
Preferred skill sets: Python
Years of experience required: 7-10 years
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Required Skills: ETL Tools
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
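As a toy illustration of the dimensional-modelling skill set this posting names: splitting a flat extract into a dimension table and a fact table keyed by a surrogate key. This is a generic pandas sketch with made-up column names, not anything specific to IICS.

```python
# Illustrative star-schema split in pandas (made-up data and columns).
import pandas as pd

extract = pd.DataFrame({
    "customer_name": ["Acme", "Beta Corp", "Acme"],
    "order_date": ["2024-01-05", "2024-01-06", "2024-02-01"],
    "amount": [120.0, 80.5, 60.0],
})

# Dimension: one row per distinct customer, with a surrogate key.
dim_customer = (extract[["customer_name"]]
                .drop_duplicates()
                .reset_index(drop=True))
dim_customer["customer_sk"] = dim_customer.index + 1

# Fact: measures plus the foreign key back to the dimension.
fact_orders = (extract
               .merge(dim_customer, on="customer_name")
               [["customer_sk", "order_date", "amount"]])
print(fact_orders)
```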
Posted 1 week ago
6.0 - 11.0 years
18 - 20 Lacs
Bengaluru
Work from Office
Senior Software Test Engineer - GenAI Testing (Kongsberg Digital)

About the Role: We are seeking a passionate and forward-thinking Senior QA Engineer to lead quality assurance for Generative AI (GenAI) solutions embedded within our Digital Twin platform. This is a high-impact role that goes beyond traditional QA, focusing on the nuanced evaluation, reliability, and guardrails of AI-powered systems in production. You will be responsible not just for testing, but also for establishing evaluation frameworks, defining AI quality benchmarks, and upskilling other QA engineers in GenAI testing methods. The ideal candidate brings a mix of structured QA discipline, hands-on familiarity with GenAI systems (LLMs, RAG, agents), and a strong sense of ownership.

Key Responsibilities:
- Design and implement end-to-end QA strategies for applications using Node.js, integrated with LLMs, retrieval-augmented generation (RAG), and agentic AI workflows.
- Establish comprehensive benchmarks and quality metrics for GenAI components, including accuracy, coherence, relevance, stability, and safety.
- Develop structured evaluation datasheets for LLM behaviour validation: test prompts, expected responses, classification criteria, and scoring rubrics.
- Perform data quality testing for RAG databases and ensure relevant, high-quality retrieval to minimize hallucinations and improve grounding.
- Conduct A/B testing across model versions, prompt designs, and system configurations to measure and compare output quality.
- Define methodologies for, and simulate, non-deterministic behaviours using agentic AI testing techniques.
- Collaborate closely with developers, product owners, and AI engineers to test prompt engineering pipelines, function-calling interfaces, and fallback logic.
- Build QA automation where applicable and integrate GenAI evaluations into CI/CD pipelines.
- Lead internal capability development by mentoring QA peers on GenAI testing practices and helping evolve the organization's AI quality maturity.

Required Skills and Qualifications:
- 6+ years of experience in software quality assurance, with at least 3 years working in or around GenAI or LLM-based systems.
- Deep understanding of GenAI quality dimensions: response grounding, factual correctness, context awareness, and hallucination minimization.
- Experience creating and maintaining LLM evaluation datasets and designing test cases for dynamic prompt behaviour.
- Hands-on experience with tools and techniques for testing retrieval pipelines, embedding quality, and vector similarity results in RAG architectures.
- Familiarity with non-deterministic testing strategies, agent loop evaluation, and multi-step LLM task validation.
- Comfortable working with APIs, logs, test scripts, and tracing tools to validate both system and AI behaviour.
- Strong analytical thinking and a methodical approach to identifying bugs, regressions, and inconsistencies in AI outputs.
- Bachelor's or Master's degree in Engineering.

Preferred Skills:
- Experience with GenAI tools/platforms like OpenAI, LangChain, Semantic Kernel, Hugging Face, Pinecone, or Weaviate.
- Exposure to evaluating LLMs in production settings, including safety nets, guardrails, and red-teaming approaches.
- Familiarity with prompt tuning, few-shot learning, and function/tool calling in LLMs.
- Basic scripting knowledge (Python, JavaScript, or TypeScript) for building test harnesses or validation utilities.

OUR POWER IS CURIOSITY, CREATION AND INNOVATION: We believe you love to experiment, challenge the established, co-create, develop and cultivate. Together we can explore new answers to today's challenges and future opportunities, and talk about how industrial digitalisation can be part of the solution for a better tomorrow. We believe that different perspectives are crucial for developing game-changing technology for a better tomorrow. Join us in taking on this challenge!
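To make the "evaluation datasheet" idea concrete, here is a minimal sketch of a rubric-driven LLM test-case runner. The scoring rule and the call_model stub are hypothetical placeholders; a real harness would plug in an actual LLM client and a far richer rubric.

```python
# Minimal sketch of an LLM evaluation datasheet runner (illustrative).
from dataclasses import dataclass, field

@dataclass
class EvalCase:
    prompt: str
    must_contain: list = field(default_factory=list)      # grounding keywords
    must_not_contain: list = field(default_factory=list)  # hallucination markers

def call_model(prompt: str) -> str:
    # Hypothetical stub standing in for a real LLM API call.
    raise NotImplementedError("replace with your LLM client call")

def score(case: EvalCase, response: str) -> float:
    hits = sum(k.lower() in response.lower() for k in case.must_contain)
    misses = sum(k.lower() in response.lower() for k in case.must_not_contain)
    denom = max(len(case.must_contain), 1)
    return max(hits / denom - misses, 0.0)  # crude 0..1 rubric

cases = [EvalCase("Summarise the Q3 maintenance log for turbine T-17.",
                  must_contain=["T-17"], must_not_contain=["T-18"])]

# Because LLM output is non-deterministic, run each case several times
# and track the score distribution rather than a single pass/fail.
```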
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Data Scientist

Responsibilities:

Data Exploration and Insights:
- Conduct continuous data exploration and analysis to identify opportunities for enhancing data matching logic, including fuzzy logic, and improving overall data quality within the SCI solution.
- This includes working with large datasets from various sources, including Excel files and databases.

Data Quality Improvement:
- Perform various analyses specifically aimed at improving data quality within the SCI system.
- This will involve identifying data quality issues, proposing solutions, and implementing improvements.

Weekly Playback and Collaboration:
- Participate in weekly playback sessions, using Jupyter Notebook to demonstrate data insights and analysis.
- Incorporate new explorations and analyses based on feedback from the working group and prioritized tasks.

Project Scaling and Support:
- Contribute to the scaling of the SCI project by supporting data acquisition, cleansing, and validation processes for new markets.
- This includes prerequisites for batch ingestion as well as post-batch-ingestion analysis and validation of SCI records.

Data Analysis and Validation:
- Perform thorough data analysis and validation of SCI records after batch ingestion.
- Proactively identify insights and implement solutions to improve data quality.

Stakeholder Collaboration:
- Coordinate with business stakeholders to facilitate the manual validation of records flagged for manual intervention.
- Communicate findings and recommendations clearly and effectively.

Technical Requirements:
- 5+ years of experience as a Data Scientist.
- Strong proficiency in Python and SQL.
- Extensive experience using Jupyter Notebook for data analysis and visualization.
- Working knowledge of data matching techniques, including fuzzy logic.
- Experience working with large datasets from various sources (Excel, databases, etc.).
- Solid understanding of data quality principles and methodologies.

Skills:
- SQL
- Machine Learning
- Data Analysis
- Jupyter Notebook
- Data Cleansing
- Fuzzy Logic
- Python
- Data Quality Improvement
- Data Validation
- Data Acquisition
- Communication and Collaboration
- Problem-solving and Analytical skills

Preferred Qualifications:
- Experience with specific data quality tools and techniques.
- Familiarity with cloud computing platforms (e.g., AWS, Azure, GCP).
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of statistical modeling and machine learning algorithms.
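As a small illustration of the fuzzy-matching work this role involves, here is a sketch using only the Python standard library. The threshold and record names are made up, and production matching would typically use a dedicated library plus blocking strategies for large datasets.

```python
# Minimal sketch of fuzzy record matching with the standard library.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Normalise case and whitespace before comparing.
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_customer(candidate: str, master: list, threshold: float = 0.85):
    scored = [(name, similarity(candidate, name)) for name in master]
    best = max(scored, key=lambda t: t[1])
    # Below the threshold, return None so the record is routed to manual review.
    return best if best[1] >= threshold else None

master_records = ["Acme Industries Pvt Ltd", "Acme Industrial Supplies"]
print(match_customer("ACME INDUSTRIES PVT. LTD", master_records))
```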
Posted 1 week ago
7.0 - 10.0 years
9 - 12 Lacs
Pune
Work from Office
About the Job: We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
Posted 1 week ago
7.0 - 10.0 years
5 - 8 Lacs
Pune
Remote
Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
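For illustration, here is a sketch of one such validation query: a window function that surfaces duplicate business keys, run through the Snowflake Python connector. Credentials, table, and column names are placeholders (in practice they would come from a secrets manager), and QUALIFY is Snowflake-specific syntax.

```python
# Sketch of a duplicate-key check in Snowflake (illustrative placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="my_user",           # placeholder
    password="***",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="PUBLIC",
)

# Any row with rn > 1 is an extra copy of the same business key.
sql = """
SELECT customer_id, load_ts,
       ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY load_ts DESC) AS rn
FROM customers
QUALIFY rn > 1
"""
cur = conn.cursor()
try:
    for customer_id, load_ts, rn in cur.execute(sql):
        print(f"duplicate row for {customer_id} (copy #{rn}, loaded {load_ts})")
finally:
    cur.close()
    conn.close()
```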
Posted 1 week ago
5.0 - 10.0 years
11 - 16 Lacs
Ahmedabad
Work from Office
Data Scientist

Responsibilities:

Data Exploration and Insights:
- Conduct continuous data exploration and analysis to identify opportunities for enhancing data matching logic, including fuzzy logic, and improving overall data quality within the SCI solution.
- This includes working with large datasets from various sources, including Excel files and databases.

Data Quality Improvement:
- Perform various analyses specifically aimed at improving data quality within the SCI system.
- This will involve identifying data quality issues, proposing solutions, and implementing improvements.

Weekly Playback and Collaboration:
- Participate in weekly playback sessions, using Jupyter Notebook to demonstrate data insights and analysis.
- Incorporate new explorations and analyses based on feedback from the working group and prioritized tasks.

Project Scaling and Support:
- Contribute to the scaling of the SCI project by supporting data acquisition, cleansing, and validation processes for new markets.
- This includes prerequisites for batch ingestion as well as post-batch-ingestion analysis and validation of SCI records.

Data Analysis and Validation:
- Perform thorough data analysis and validation of SCI records after batch ingestion.
- Proactively identify insights and implement solutions to improve data quality.

Stakeholder Collaboration:
- Coordinate with business stakeholders to facilitate the manual validation of records flagged for manual intervention.
- Communicate findings and recommendations clearly and effectively.

Technical Requirements:
- 5+ years of experience as a Data Scientist.
- Strong proficiency in Python and SQL.
- Extensive experience using Jupyter Notebook for data analysis and visualization.
- Working knowledge of data matching techniques, including fuzzy logic.
- Experience working with large datasets from various sources (Excel, databases, etc.).
- Solid understanding of data quality principles and methodologies.

Skills:
- SQL
- Machine Learning
- Data Analysis
- Jupyter Notebook
- Data Cleansing
- Fuzzy Logic
- Python
- Data Quality Improvement
- Data Validation
- Data Acquisition
- Communication and Collaboration
- Problem-solving and Analytical skills

Preferred Qualifications:
- Experience with specific data quality tools and techniques.
- Familiarity with cloud computing platforms (e.g., AWS, Azure, GCP).
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of statistical modeling and machine learning algorithms.
Posted 1 week ago
5.0 - 7.0 years
10 - 14 Lacs
Ahmedabad
Work from Office
Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
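As a tiny, self-contained illustration of the semantic-web stack this role names (RDF plus SPARQL), here is a sketch using the rdflib library. The namespace and terms are toy stand-ins, not real BFO/CCO IRIs.

```python
# Minimal knowledge-graph sketch with rdflib (toy ontology terms).
from rdflib import Graph, Namespace, Literal, RDF, RDFS

EX = Namespace("http://example.org/ontology#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Declare a class and one instance with a human-readable label.
g.add((EX.Pump, RDF.type, RDFS.Class))
g.add((EX.pump_42, RDF.type, EX.Pump))
g.add((EX.pump_42, RDFS.label, Literal("Coolant pump 42")))

# Query the graph with SPARQL.
results = g.query(
    """
    SELECT ?item ?label WHERE {
        ?item a ex:Pump ; rdfs:label ?label .
    }
    """,
    initNs={"ex": EX, "rdfs": RDFS},
)

for item, label in results:
    print(item, label)
```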
Posted 1 week ago
4.0 - 9.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Cigna, a leading health services company, is looking for an exceptional front-end API engineer in our Data Analytics Engineering organization. The Full Stack Engineer is responsible for the delivery of a business need, from understanding the requirements through deploying the software into production. This role requires you to be fluent in some of the critical technologies, proficient in others, and to have a hunger to learn on the job and add value to the business. Critical attributes of a Full Stack Engineer, among others, are ownership, eagerness to learn, and an open mindset. In addition to delivery, the Full Stack Engineer should have an automation-first and continuous-improvement mindset, drive the adoption of CI/CD tools, and support the improvement of the toolsets and processes.

Responsibilities:
- Articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. Take ownership over work tasks, embrace interacting with all levels of the team, and raise challenges when necessary.
- Participate in design, definition, planning, development, and implementation of projects, and evaluate conformance to software development best practices.
- Perform peer code reviews. Be responsible for the deliverable.
- Ask smart questions, take risks, and champion new ideas.
- Be business oriented and able to communicate at all levels.
- Ensure adherence to existing strategic direction and architectural strategies.
- Embrace the agile delivery process by releasing business value incrementally into production.
- Transfer key knowledge and code ownership to the team.
- Mentor talent and cultivate new team members.
- Foster an environment where the business is involved in the project and aware of key decisions, issues, and functionality.

Experience Required:
- 4+ years of React experience
- 4+ years of JavaScript/TypeScript (Node.js) experience
- 4+ years of experience with SQL
- Strong React / TypeScript / SQL
- 3+ years of experience with cloud technologies
- 3+ years being part of Agile (Scrum) teams

Experience Desired:
- Experience with version management tools; Git preferred.
- Experience with BDD and TDD development methodologies.
- Experience working in agile CI/CD environments; Jenkins experience preferred.
- Knowledge and/or experience with healthcare information domains preferred.

Education and Training Required:
- Bachelor's degree (or equivalent) required.

Primary Skills: React, JavaScript, TypeScript, SQL, Node.js, GraphQL

Additional Skills: AWS, Git, Terraform, Lambda; design and architect solutions independently; take ownership and accountability; write referenceable, modular code; be fluent in particular areas and proficient in many; have a passion to learn; have a quality mindset, not just for code quality but also for ongoing application/data quality, monitoring data to identify problems before they have business impact; take risks and champion new ideas.
Posted 1 week ago
10.0 - 15.0 years
50 - 55 Lacs
Hyderabad
Work from Office
As a Director of Software Engineering at JPMorgan Chase within the Chief Data & Analytics Office's Data Governance Engineering team, you will play a pivotal role in supporting the firm in delivering services to clients and advancing the firm-wide agenda for Data & Analytics. You will lead firm-wide initiatives through a unified Data Analytics Platform, in alignment with the firm's Data & AI strategy. Collaborating with engineering teams, you will be responsible for creating designs, establishing best practices, and developing guidelines along with scalable frameworks to effectively manage large volumes of data, ensuring interoperability, compliance with data classification requirements, and maintaining data integrity and accessibility. You will work closely with Product & Engineering teams to promote unified engineering execution across multiple initiatives, strategically designing and building applications that address real-world use cases. Your expertise in software, applications, technical processes, and product management will be essential in promoting complex projects and initiatives, serving as a primary decision-maker and a champion of innovation and solution delivery. As part of the Product Delivery team, you will design and build scalable cloud-native foundational data governance products and services that support the Data Risk Pillars, providing a unified experience through the CDAO platform.

Job responsibilities:
- Collaborate with product and engineering teams to deliver robust firm-wide data governance solutions that drive enhanced customer experiences.
- Provide critical day-to-day leadership and strategic thinking, working with a team of engineers and architects to align cross-functional initiatives and ensure they are feasible both fiscally and technically.
- Make decisions that influence team resources, budget, tactical operations, and the execution and implementation of processes and procedures.
- Champion the firm's culture of diversity, equity, inclusion, and respect.
- Lead the consolidation and convergence of Data Governance capabilities under the unified CDAO Platform, along with other priority firm-wide initiatives related to BCBS 239, data lineage, controls, and data quality.
- Enable engineering teams to develop, enhance, and maintain established standards and best practices; drive self-service; and deliver on a strategy built on broad use of Amazon's utility computing web services (e.g., AWS EC2, AWS S3, AWS RDS, AWS CloudFront, CloudWatch, EKS).
- Identify opportunities to improve resilient, available, secure, high-performing platforms in the public cloud using JPMC best practices.
- Implement continuous process improvement, including but not limited to policy, procedures, and production monitoring, and reduce time to resolve.
- Identify, coordinate, and implement initiatives/projects and activities that create efficiencies and optimize technical processing.
- Measure and optimize system performance, with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve.
- Utilize programming languages like Java, Python, SQL, Node, Go, and Scala; graph and open-source RDBMS databases; container orchestration services including Kubernetes; and a variety of AWS tools and services.

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 10+ years of applied experience.
- In addition, 5+ years of experience leading technologists to manage, anticipate, and solve complex technical items within your domain of expertise.
- Experience in building or supporting environments on AWS using Terraform, including working with services like EKS, ELB, RDS, and S3.
- Strong understanding of business technology drivers and their impact on architecture design, performance, and monitoring best practices.
- Dynamic individual with excellent communication skills, capable of adapting verbiage and style to the audience at hand and delivering critical information in a clear and concise manner.
- Strong experience in managing stakeholders at all levels.
- Strong analytical thinker with business acumen and the ability to assimilate information quickly, with a solution-based focus on incident and problem management.
- Hands-on experience with one or more cloud computing platform providers.
- Experience in architecting for private and public cloud environments and in re-engineering and migrating on-premises data solutions to the cloud.
- Proficiency in building on emerging cloud serverless managed services to minimize or eliminate physical and virtual server footprints.
- Experience with high-volume, mission-critical applications and their interdependencies with other applications and databases.
- Proven work experience with container platforms such as Kubernetes.
- Strong understanding of architecture, design, and business processes.
- Keen understanding of financial and budget management, control, and optimization of public cloud expenses.
- Experience working in large, collaborative teams to achieve organizational goals.
- Passionate about building an innovative culture.

Preferred qualifications, capabilities, and skills:
- Bachelor's/Master's degree in Computer Science or another technical or scientific discipline.
- Experience implementing multi-cloud architectures and a deep understanding of cloud infrastructure design, architecture, and cloud migration strategies.
- Demonstrated proficiency in technical solutions and in implementing firm-wide solutions; data governance vendor product knowledge is a plus.
- Certifications in target areas (AWS Cloud, Kubernetes, etc.).
- Experience leading Data Governance and Data Risk Reporting platforms is preferred.

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you.
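As one hedged illustration of the data-classification compliance mentioned above, the sketch below tags an S3 object so downstream governance tooling can key access policies off the tag. The bucket, key, and tag names are placeholders, not the firm's actual scheme.

```python
# Illustrative boto3 snippet: tag an S3 object with a classification label.
import boto3

s3 = boto3.client("s3")

s3.put_object_tagging(
    Bucket="example-governed-bucket",          # placeholder bucket
    Key="lineage/reports/2024-q1.parquet",     # placeholder object key
    Tagging={"TagSet": [
        {"Key": "data-classification", "Value": "confidential"},
        {"Key": "retention-days", "Value": "365"},
    ]},
)
```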
Posted 1 week ago
5.0 - 9.0 years
25 - 30 Lacs
Pune
Work from Office
Ethoca, a Mastercard company, is seeking a Senior Data Engineer to join our team in Pune, India to drive data enablement and explore big data solutions within our technology landscape. The role is visible and critical as part of a high-performing team; it will appeal to you if you have an effective combination of domain knowledge, relevant experience, and the ability to execute on the details. You will bring cutting-edge software and full stack development skills, with advanced knowledge of cloud and data lakes, while working with massive data volumes. You will own this: our teams are small, agile, and focused on the needs of the high-growth fintech marketplace. You will be working across functional teams within Ethoca and Mastercard to deliver on cloud strategy. We are committed to making our systems resilient and responsive yet easily maintainable on cloud.

Key Responsibilities:
- Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark.
- Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design.
- Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows.
- Deploy and manage Snowflake objects using Schema Change, ensuring controlled, auditable, and repeatable releases across environments.
- Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability.
- Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle.
- Build scalable and reusable data models that support business analytics and dashboarding in Power BI.
- Develop and support real-time data streaming pipelines (e.g., Kafka, Spark Structured Streaming) for near-instant data availability.
- Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform.
- Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans.
- Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions.
- Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team.
- Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
- Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams.

Required Qualifications:
- Tenured in the fields of Computer Science/Engineering or Software Engineering, with a Bachelor's degree in Computer Science or a related technical field including programming.
- Deep hands-on experience with Snowflake (including administration), Snowpark, and Python.
- Strong background in PySpark and distributed data processing.
- Proven track record using dbt for building robust, testable data transformation workflows following TDD.
- Familiarity with Schema Change for Snowflake object deployment and version control.
- Good to have: familiarity with Java JDK 8 or greater and exposure to the Spring and Spring Boot frameworks.
- Good to have: understanding and knowledge of Databricks.
- Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps.
- Experience with real-time data processing and streaming pipelines.
- Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP).
- Skilled in developing insightful dashboards and scalable data models using Power BI.
- Expert in SQL development and performance optimization.
- Demonstrated success in building and maintaining data observability tools and frameworks.
- Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations.
- Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders.
- Ideally, you have experience in banking, e-commerce, credit cards, or payment processing, and exposure to both SaaS and premises-based architectures. In addition, you have a post-secondary degree in computer science, mathematics, or quantitative science.
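For a concrete flavour of the real-time pipelines this posting mentions, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands it as files. The broker, topic, and paths are placeholders, and the job assumes the Spark-Kafka connector package is available on the cluster.

```python
# Sketch of a streaming ingest: Kafka topic -> files (illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-events").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "payment-events")             # placeholder topic
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/streams/payment-events/")      # placeholder
         .option("checkpointLocation", "/data/chk/payments/")  # required for recovery
         .start())

query.awaitTermination()
```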
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Purpose: To assist with and execute NDT data analysis, and to drive improvements in NDT scanning and analysis towards securing zero defect escapement to customers and eliminating over-processing.

Competency Development & Resource Adequacy:
- Develop training materials and samples to facilitate classroom and practical trainings.
- Plan, execute, and monitor classroom and practical trainings for NDT Level 1 and 2 certifications in the plant.
- Make sure plants are equipped with the needed certified personnel to perform inspections.
- Maintain the skill matrix of quality staff in plants as per the LMWP competency model.
- Track the certification database and ensure only certified resources perform inspection in the plants.
- Improve the internal or alternate certification process to make it leaner and more effective.
- Consistently focus on improving the competence level of NDT technicians, enabling a highly flexible workforce in the plants.

Measurement and Inspection Methods:
- Eliminate or reduce the variation between NDT reviewers through periodic training and assessments.
- Lead and/or support new development of NDT methods and tooling.
- Consistently improve or update NDT processes and methods.
- Focus on development and implementation of poka-yoke solutions in inspection and measurement processes.
- Execute and implement new inspection methods, geometrical verification methods, new technologies, and new AC after the completion of development by Engineering.
- Drive improvements in existing NDT methods and processes to make them lean and effective.
- Lead tactical projects to meet the functional strategic plan.

Quality Compliance:
- Take a proactive approach to assuring process compliance before failure occurs.
- Plan and execute NDT process audits, NDT personnel reviews, and periodic data reviews as per the defined frequency.
- Monitor inspection effectiveness; support RCA and CAPA closure with stakeholders.
- Drive improvements in the audit process to make it lean and effective, and follow up on closure of audit findings.
- Bring improvement to BMS processes and tools related to process audits.
- Perform RCA for recurring defects in the process and improve the quality of products.

Operational Process and Support:
- Perform NDT data analysis accurately to secure zero defect escapement and zero over-processing.
- Provide on-time analysis support and feedback to the plants for smooth operations.
- Drive improvements towards better scanning methods and better data quality.
- Support plant NDT resources towards eliminating rescanning (first-time-right scans).
- Govern and monitor KPIs defined within the function/team and ensure the performance level is sustained.
- Keep operational quality records up to date.
- Set the baseline for all NDT activities and analysis, and drive improvements in tools and equipment that make the NDT process more effective.
- Support technology projects, new product launches, and quality-issue projects from a technical standpoint.
- Track, monitor, and improve the performance of gauge R&R in plants.
- Assure effective implementation of the calibration process in relevant inspection methods.
- Provide training on, and implementation of, new AC.
- Monitor process stability and optimize the inspection frequency based on SPC analysis.
- Provide on-time support to the manufacturing plants on daily operational challenges.
Required Qualifications:
- A bachelor's degree in engineering or equivalent, with experience in the quality domain; minimum 3+ years of experience.
- Certified UT Level 2 in conventional and advanced Phased Array (PAUT) methods as per ASNT SNT-TC-1A. UT Level 3 certification will be an added strength.
- Certified in the IR inspection method.
- Minimum of 5 years' work experience in manufacturing, preferably in blade manufacturing with UT inspections.
- International experience and cultural awareness covering the Americas, Europe, India, and China.
- Knowledge of blade manufacturing is preferable, combined with explicit knowledge of quality tools, systems, and processes: audits, Six Sigma, PFMEA, control plans, PPAP.
- Strong English language skills (verbal and written).
- Preferably an ISO 9001 Lead Auditor certification and relevant audit experience.
- Flexible to travel across LM business units for executing training and operational support.
- Flexible to work in shift patterns to support manufacturing plants across the globe.

Desired Characteristics:
- A quality mindset independent of plant-level responsibility and reporting.
- Self-motivated, and encourages others to take responsibility.
- Communication: effectively communicates beyond own area at all levels; initiates or improves ways to communicate, facilitate, and negotiate, resulting in increased impact and commitment; targets important areas for innovation and evaluates multiple solutions beyond own areas.
- Decision making: sets goals and regularly follows up on them; takes decisions and monitors results.
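To ground the SPC reference above, here is a minimal X-bar control-limit calculation using the standard A2 factor for subgroups of five. The readings are made-up placeholders, not real inspection data.

```python
# Minimal SPC sketch: X-bar chart limits from subgroup means (illustrative).
from statistics import mean

A2 = 0.577  # Shewhart constant for subgroup size n = 5

subgroups = [  # e.g., five thickness readings per inspected blade section
    [4.98, 5.01, 5.00, 4.99, 5.02],
    [5.03, 5.00, 4.97, 5.01, 5.00],
    [4.99, 5.02, 5.01, 5.00, 4.98],
]

xbars = [mean(s) for s in subgroups]          # subgroup means
ranges = [max(s) - min(s) for s in subgroups]  # subgroup ranges
xbarbar, rbar = mean(xbars), mean(ranges)

ucl = xbarbar + A2 * rbar  # upper control limit
lcl = xbarbar - A2 * rbar  # lower control limit
print(f"centre={xbarbar:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")

for i, xb in enumerate(xbars):
    if not (lcl <= xb <= ucl):
        print(f"subgroup {i} out of control: {xb:.3f}")
```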
Posted 1 week ago
3.0 - 8.0 years
20 - 25 Lacs
Ahmedabad
Work from Office
As an HR Talent Acquisition Applicant Tracking Platform Owner at Infineon, you hold the key to unlocking the full potential of digital technologies in enhancing HR processes and elevating the candidate and employee experience. Join us on this journey and, together, let's align Infineon's people objectives with cutting-edge digital solutions - a customer-centric HR system landscape that redefines the future of HR.
In your new role you will:
Be globally responsible for the design, implementation and continuous improvement of our Talent Acquisition (TA) Applicant Tracking platform within our Global HR Platforms team. You will focus on managing platform demands and ensuring high HR data quality, GDPR compliance and audit readiness. You will also collaborate closely with HR, IT, Labor Relations and Business Continuity counterparts in global HR projects and beyond.
Interface with other Talent Acquisition platform and module owners, ensuring alignment with relevant stakeholders and managing change requests to the TA applicant tracking platform.
Coordinate demand management together with other TA platform and module owners, and prioritize demands with key stakeholders in alignment with IT counterpart(s).
Set policies and guidelines for the platform to ensure that it operates smoothly and is General Data Protection Regulation (GDPR) compliant, e.g. manage and monitor data deletion and access concepts.
Together with other TA platform and module owners, proactively drive decision making on direction and focus topics for the artificial-intelligence-driven platforms.
Define and drive actions to improve TA data quality together with the HR Data Quality Owner.
Drive automation and digitalization of related processes via the TA applicant tracking platform, in close collaboration with the Global Service Designer and IT.
Support and consult in global HR projects related to our TA applicant tracking platform.
Enable platform stakeholders on platform usage, changes, issues and dependencies.
Ensure that all platform releases are thoroughly tested and validated before deployment.
You are best equipped for this task if you have:
Customer centricity and an effective HR system landscape at the heart of your thoughts and actions; you demonstrate excellent communication skills and know how to establish sustainable relations. You are willing to take responsibility while generating value with your ideas and solutions. Moreover, you enjoy working in interdisciplinary teams with multicultural backgrounds.
A degree in Human Resources Management, Information Technology, Business Administration, or related fields.
3+ years of relevant working experience in a multinational environment in a similar role.
Strong communication skills: you master conveying the benefits of technical adjustments to a non-technical audience and are able to translate business (HR) demands into technical requirements.
Strong stakeholder and expectation management skills.
Experience working in and managing HR (recruiting) systems, such as the Umantis applicant tracking system, SuccessFactors, Eightfold, or similar.
An innovative, customer-centric and problem-solving mindset, combined with a hands-on spirit and great planning capabilities.
Team spirit and knowledge of change management in larger, globally operating organizations.
Excellent English skills.
Posted 1 week ago
3.0 - 6.0 years
9 - 13 Lacs
Hyderabad
Work from Office
In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.
BW on HANA/BW4HANA implementation in key SAP modules.
Requirements analysis, conception and implementation/development of solutions as per requirement; create HLDs and then convert them to LLDs.
Data extraction from SAP and non-SAP systems, data modelling and reports delivery.
Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW4HANA solutions.
Understand the integration and consumption of BW data models/reports with other tools.
Responsibilities
Hands-on experience in SAP BW/4HANA or SAP BW on HANA, with a strong understanding of objects like Composite Providers, Open ODS views, advanced DSOs and Transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization.
Experience in integration of BW with various SAP and non-SAP backend systems/sources of data, and good knowledge of the different data acquisition techniques in BW/4HANA.
Knowledge of the available SAP BW/4HANA tools and their usage, like the BW/4HANA Web Cockpit.
Mandatory skill sets
Full life cycle implementation experience in SAP BW4HANA or SAP BW on HANA.
Hands-on experience in data extraction using standard or generic data sources.
Good knowledge of data source enhancement.
Strong experience in writing ABAP/AMDP code for exits and Transformations.
Strong understanding of CKF, RKF, Formulas, Selections, Variables and other components used for reporting.
Preferred skill sets
Understanding of LSA/LSA++ architecture and its development standards.
Good understanding of BW4 application and database security concepts.
Functional knowledge of various modules like SD, MM, FI.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Education Degrees/Field of Study required: Bachelor of Technology, Master of Engineering Degrees/Field of Study preferred:
Required Skills: SAP Business Warehouse, Microsoft Azure
Posted 1 week ago
3.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
BW on HANA/BW4HANA implementation in key SAP modules.
Requirements analysis, conception and implementation/development of solutions as per requirement; create HLDs and then convert them to LLDs.
Data extraction from SAP and non-SAP systems, data modelling and reports delivery.
Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW4HANA solutions.
Understand the integration and consumption of BW data models/reports with other tools.
Responsibilities
Hands-on experience in SAP BW/4HANA or SAP BW on HANA, with a strong understanding of objects like Composite Providers, Open ODS views, advanced DSOs and Transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization.
Experience in integration of BW with various SAP and non-SAP backend systems/sources of data, and good knowledge of the different data acquisition techniques in BW/4HANA.
Knowledge of the available SAP BW/4HANA tools and their usage, like the BW/4HANA Web Cockpit.
Mandatory skill sets
Full life cycle implementation experience in SAP BW4HANA or SAP BW on HANA.
Hands-on experience in data extraction using standard or generic data sources.
Good knowledge of data source enhancement.
Strong experience in writing ABAP/AMDP code for exits and Transformations.
Strong understanding of CKF, RKF, Formulas, Selections, Variables and other components used for reporting.
Preferred skill sets
Understanding of LSA/LSA++ architecture and its development standards.
Good understanding of BW4 application and database security concepts.
Functional knowledge of various modules like SD, MM, FI.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Education Degrees/Field of Study required: Bachelor of Technology, Master of Engineering Degrees/Field of Study preferred:
Required Skills: SAP Business Warehouse
Posted 1 week ago
0.0 - 3.0 years
20 - 25 Lacs
Hyderabad
Work from Office
YOUR IMPACT
Are you passionate about developing mission-critical, high-quality software solutions, using cutting-edge technology, in a dynamic environment? We are Compliance Engineering, a global team of more than 300 engineers and scientists who work on the most complex, mission-critical problems. We:
build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm.
have access to the latest technology and to massive amounts of structured and unstructured data.
leverage modern frameworks to build responsive and intuitive UX/UI and Big Data applications.
The firm is making a significant investment to uplift and rebuild the Compliance application portfolio in 2025. To achieve that, we are hiring experienced software development engineers.
HOW YOU WILL FULFILL YOUR POTENTIAL
As a member of our team, you will:
partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions,
learn from experts,
leverage various technologies including: Java, SpringBoot, Hibernate, BPMN workflows, Rules Engine, JavaScript, TypeScript, React-Redux, REST APIs, GraphQL, Elastic Search, Kafka, Kubernetes, Machine Learning,
be able to innovate and incubate new ideas,
have an opportunity to work on a broad range of problems, including negotiating data contracts, capturing data quality metrics, processing large-scale data, and building surveillance detection models,
be involved in the full life cycle: defining, designing, implementing, testing, deploying, and maintaining software systems across our products.
QUALIFICATIONS
A successful candidate will possess the following attributes:
A Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study.
Expertise in Java, as well as proficiency with databases and data manipulation.
Experience in end-to-end solutions, automated testing and SDLC concepts.
The ability (and tenacity) to clearly express ideas and arguments in meetings and on paper.
Experience in some of the following is desired and can set you apart from other candidates: knowledge of the financial industry and compliance or risk functions, and the ability to influence stakeholders.
Posted 1 week ago
4.0 - 6.0 years
17 - 22 Lacs
Hyderabad
Work from Office
Overview
Act as an expert on consumer insights with a focus on social listening, answering business questions in a compelling and engaging way. This expertise will include understanding the PepsiCo trends framework, leveraging the available technology stack, and providing insights based on business partner requests by connecting relevant data sources. Available tools to analyze consumer trends from market manifestations based on Big Data:
Trendscope, to identify and analyze Food and Beverage trends
ai, to produce inspiring Springboards about territories and platforms based on digital conversations
Social Listening: Sprinklr
ADA: innovation and creative evaluation
Responsibilities
Execution of research projects with quality and depth of deliverables, with low/no support from external vendors, ensuring the story is told in a compelling way by bringing together all the BIG data (what is happening) and THICK data (human motivations and drivers) tools at our disposal. The analyst will be responsible for producing complete analyses and a one-page summary for all projects conducted, and will present his/her work to the local PepsiCo business teams who requested it.
Key tasks: end-to-end delivery covering alignment on the brief, proposal coordination, execution and delivery of results.
Lead social listening projects from the brief to the delivery of outputs; translate market and business challenges into a social listening brief.
Ensure the highest level of data quality and validation.
Qualifications
Social listening expertise with a heavy focus on insights vs reporting.
4-6 years of experience, preferably at an FMCG company/client, making an impact in market research/insights/analytics, marketing, competitive intelligence, or another similar function, with demonstrated ability to execute projects in a complex environment with multiple constituencies.
Very comfortable running in-depth consumer research analyses, with the ability to turn findings into compelling and insightful stories and present them to business teams.
Understanding of the brand and innovation strategy process and the critical role of insights at each stage.
Experience working on trends and foresight projects, e.g. pre- and post-COVID impact, consumer trend changes, etc.
Experience in projects involving flavor innovation, trending ingredients, health benefits and consumer behavior.
Demonstrated written communication skills, especially in PowerPoint and email.
Strong verbal and written English.
Project management: highly analytical, motivated and decisive, with excellent project management skills.
Organized: capable of juggling multiple projects, priorities and stakeholders, ensuring delivery while proactively managing trade-offs.
Demonstrated ability to manage projects and overcome challenges.
Ability to influence local insights partners in their ways of working.
Ability to run consumer research analyses alone by leveraging the various available data sources.
Posted 1 week ago
10.0 - 15.0 years
27 - 32 Lacs
Hyderabad
Work from Office
Overview
The DQ Expert will act as an individual contributor enforcing a strict Enterprise Data Management strategy through globally defined data standards and governance, in order to successfully deliver business transformation within SAP S/4HANA projects. The Data Quality Expert will be responsible for delivery of an internal data quality application to support data readiness and conversion activities for project and new market deployments, assuring that Global Template data standards are followed. This role involves active engagement in requirements gathering, testing, data cleansing, issue analysis and resolution, data conversion, and mock/cutover conversion activities. The position holder must work directly with multiple project function specialists (e.g. OTC, P2P, MTD) as part of the extended Data Conversion team on a day-to-day basis, as well as engaging the market business process/data owners.
Responsibilities
Partner with the Data Validation Team to ensure the quality of migrated data.
Ensure global lift-and-shift opportunities are deployed across the sectors.
Manage questions and clear the path for developers to complete build/test of data validation.
Work with Global on design updates/new patterns.
Manage the overall tracker for conversion stats and provide direction and guidance to the conversion team.
Qualifications
Minimum Bachelor's degree is required; Computer Science or Information Systems is preferred.
Minimum 10 years in IT on ERP transformation programs in the Data Management area.
Experience in at least 3 end-to-end implementations of SAP ERP/ECC with responsibility for data quality and master data standards.
Experience working with/manipulating big data sets (or systems built on significant datasets).
Knowledge of SAP master data models.
Data readiness, conversion, migration and cutover experience from a functional standpoint.
Understanding of data quality/data cleansing practices.
Demonstrated documentation acumen and presentation of data standards materials for reporting to projects and stakeholder alignment.
Posted 1 week ago
6.0 - 11.0 years
25 - 27 Lacs
Hyderabad
Work from Office
Overview
We are seeking a highly skilled and experienced Azure Data Engineer to join our dynamic team. In this critical role, you will be responsible for designing, developing, and maintaining robust and scalable data solutions on the Microsoft Azure platform. You will work closely with data scientists, analysts, and business stakeholders to translate business requirements into effective data pipelines and data models.
Responsibilities
Design, develop, and implement data pipelines and ETL/ELT processes using Azure Data Factory, Azure Databricks, and other relevant Azure services.
Develop and maintain data lakes and data warehouses on Azure, including Azure Data Lake Storage Gen2 and Azure Synapse Analytics.
Build and optimize data models for data warehousing, data marts, and data lakes.
Develop and implement data quality checks and data governance processes.
Troubleshoot and resolve data-related issues.
Collaborate with data scientists and analysts to support data exploration and analysis.
Stay current with the latest advancements in cloud computing and data engineering technologies.
Participate in all phases of the software development lifecycle, from requirements gathering to deployment and maintenance.
Qualifications
6+ years of experience in data engineering, with at least 3 years of experience working with Azure data services.
Strong proficiency in SQL, Python, and other relevant programming languages.
Experience with data warehousing and data lake architectures.
Experience with ETL/ELT tools and technologies, such as Azure Data Factory, Azure Databricks, and Apache Spark.
Experience with data modeling and data warehousing concepts.
Experience with data quality and data governance best practices.
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.
Experience with Agile development methodologies.
Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
Relevant Azure certifications (e.g., Azure Data Engineer Associate) are a plus.
Posted 1 week ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview
We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centres. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise.
Key Responsibilities
Governance Leadership: Define and implement the enterprise BI governance strategy, policies, and operating model. Drive consistent governance processes across sectors and global capability centres. Set standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls.
Stakeholder Management: Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities. Lead governance councils, working groups, and decision forums to drive adoption and compliance.
Policy and Compliance: Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control. Implement approval and exception processes for BI development outside the COE.
Demand and Intake Governance: Lead the governance of BI demand intake and prioritization processes. Ensure transparency and traceability of BI requests and outcomes across business units.
Metrics and Continuous Improvement: Define KPIs and dashboards to monitor BI governance maturity and compliance. Identify areas for process optimization and lead continuous improvement efforts.
Qualifications
Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity.
Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks.
Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals.
Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility.
Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.
Posted 1 week ago
10.0 - 15.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Overview Data Engineering Assoc Manager (L09). Responsibilities Enhance and maintain data pipelines on EDF Requirement analysis, data analysis Work on application migration from Teradata to EDF Lead a team of data engineers and testers Qualifications Data engineer with 10+ years of experience
Posted 1 week ago
7.0 - 12.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Overview
PepsiCo is on a significant initiative of digitalization and standardization of the FP&A solution across all its markets, in alignment with the Planning 2025 vision to make the finance organization more capable, more agile, and more efficient. The Mosaic program is a key enabler of that vision; it is the FP&A solution of PepsiCo.
Responsibilities
The NA Mosaic Sustain Developer is responsible for sustaining a high-quality solution for the MOSAIC North America program, specific to the management of financial planning. The role will work directly on the design, development and maintenance of the solution and will have to work closely with the various detailed design and development teams. This role requires a strong background in financial planning and its sub-streams (Topline, COGS, Opex), data quality/data flow, and development.
Qualifications
University education (BE/BTech/B.Sc) or equivalent work experience.
Minimum of 7+ years of information technology or business experience, including:
5+ years' experience in TM1 Planning Analytics by IBM development
3+ years' experience in TM1 Planning Analytics by IBM support
Strong understanding of the financial planning process, revenue management principles and sales finance forecasting.
Mandatory tech skills:
Knowledge of the IBM Planning Analytics (TM1) solution.
Ability to understand and debug complex TM1 code (processes and rules).
Ability to write complex TM1 code (processes and rules).
Sound understanding and implementation of TM1 parallel processing.
Experience in building PAW-based reports.
Functional knowledge of FP&A (Financial Planning and Analysis).
Soft skills:
Data flow and integration as a critical component.
Self-motivation and the ability to stay focused.
Ability to drive complex business discussions to design the best solution.
Knowledge of FMCG and FP&A related data objects.
Ability to search for new solutions to meet challenges together with the team.
Good communication skills.
Ability to leverage relationships to understand, document and communicate processes and change implications.
Ability to handle complexity and execute with excellence under pressure.
Conceptual Selling, Deployment Planning and Execution, Relationship Management and Service, Technology Innovation, Process Design and Architecture.
Posted 1 week ago
0.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Overview
The Data Analyst will play a critical role in the success of Mosaic (the global planning tool). Mosaic is transforming the way FP&A teams work across PepsiCo markets and the level of financial information available to senior leadership teams. The Data Analyst will be responsible for ongoing live market support, focusing mostly on the resolution of data issues related to the staging/ETL area of the data, providing guidance on data sources and connectivity, system issues and data transformation logic, root cause analysis, and coordination on solution deployment. Additionally, he/she will be key in understanding and closing data quality gaps in the current system and assisting local teams by supporting their data preparation to be MOSAIC-ready. The role requires working closely with IT/BRM, Sector FP&A, the Cockpit and other functions' teams (Net Revenue Management, Global Procurement, Coman, Supply Chain, etc.).
Responsibilities
Live market support:
Conduct thorough data validation to ensure data pipelines meet business requirements.
Gain knowledge of how data is processed and transformed from different sources and prepared for the Mosaic product.
Assist in ad-hoc analytics, troubleshoot tools, and provide direct support to end users.
Develop a deep understanding of data quality and cleansing requirements for the data to be ready for consumption in BI and SPOT.
Bridge the gap and coordinate with tech and FP&A teams to ensure data quality.
Support sustainable data solutions:
Collaborate with business users, data engineers, product owners and BI developers to design and implement end-to-end data solutions.
Oversee data processes with detailed DQR notifications, proactively monitoring ETL pipelines to address any issues.
Qualifications
MBA, CA, CMA, or any degree in Finance.
Posted 1 week ago