10.0 - 13.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Manager.

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job title: Senior Manager

About the role: As a Senior Manager, you'll take the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization.

Responsibilities: Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects.

Required skills & experience: Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements.
Skills: Cloud (Azure/GCP/AWS), DE technologies (ADF, BigQuery, AWS Glue, etc.), Data Lake (Snowflake, Databricks, etc.).
Mandatory skill sets: Cloud (Azure/GCP/AWS), DE technologies (ADF, BigQuery, AWS Glue, etc.), Data Lake (Snowflake, Databricks, etc.).
Preferred skill sets: Cloud (Azure/GCP/AWS), DE technologies (ADF, BigQuery, AWS Glue, etc.), Data Lake (Snowflake, Databricks, etc.).
Years of experience required: 10-13 years.
Education qualification: BE/BTech, ME/MTech, MBA, MCA.
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering, Master of Business Administration.
Required Skills: AWS Glue, Microsoft Azure. Other listed skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation {+ 28 more}.
Posted 2 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Director.

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

Technical Requirements: EPBCS/PBCS, Essbase. Experience in application performance tuning. Report development experience using Smart View and Hyperion Financial Reporting Studio. Integration experience using Data Management is preferred.

Candidate Profile: At least one domestic client-facing implementation experience. Should be well versed with design and development of various Planning components such as data forms, business rules, task lists, plan types (BSO, ASO), EPM Automate, calculation scripts and workflow. Good communication skills.

Mandatory skill sets: Oracle EPM. Preferred skill sets: Oracle EPM.
Years of experience required: 17-25.
Education qualification: BTech/MBA/MCA.
Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration.
Required Skills: Oracle Enterprise Performance Management (EPM). Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 28 more}.
Posted 2 weeks ago
4.0 - 7.0 years
25 - 30 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

BW on HANA/BW/4HANA implementation in key SAP modules: requirements analysis, conception, and implementation/development of solutions as per requirements. Create HLDs and then convert them to LLDs. Data extraction from SAP and non-SAP systems, data modelling and report delivery. Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW/4HANA solutions. Understand the integration and consumption of BW data models/reports with other tools.

Responsibilities: Hands-on experience in SAP BW/4HANA or SAP BW on HANA and a strong understanding of the usage of objects such as Composite Providers, Open ODS views, Advanced DSOs, transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization. Experience in integration of BW with various SAP and non-SAP backend systems/sources of data and good knowledge of different data acquisition techniques in BW/4HANA. Knowledge of available SAP BW/4HANA tools and their usage, such as the BW/4HANA Web Cockpit. Full life cycle implementation experience in SAP BW/4HANA or SAP BW on HANA. Hands-on experience in data extraction using standard or generic data sources. Good knowledge of data source enhancement.

Mandatory skill sets: Strong experience in writing ABAP/AMDP code for exits and transformations. Strong understanding of CKFs, RKFs, formulas, selections, variables and other components used for reporting. Understanding of LSA/LSA++ architecture and its development standards. Good understanding of BW/4 application and database security concepts. Functional knowledge of various modules such as SD, MM, FI.

Preferred skill sets: Around 4-7 years of hands-on experience with SAP BW.
Years of experience required: 4-7 years of experience on SAP BW.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Degrees/Field of Study required: Master of Engineering, Bachelor of Technology.
Required Skills: SAP Business Warehouse. Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Communication, Complex Data Analysis, Creativity, Data Analysis Software, Data Collection, Data-Driven Consulting, Data Integration, Data Mining, Data Modeling, Data Preprocessing, Data Quality, Data Quality Improvement Plans (DQIP), Data Security, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism {+ 10 more}.
Posted 2 weeks ago
4.0 - 7.0 years
6 - 10 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

SAP Datasphere implementation in key SAP modules: requirements analysis, conception, and implementation/development of solutions as per requirements. Create HLDs and then convert them to LLDs. Data extraction from SAP and non-SAP systems, data modelling and report delivery. Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop Datasphere solutions. Understand the integration and consumption of data models with other tools.

Responsibilities: Hands-on experience in Datasphere and a strong understanding of the usage of objects such as analytical models, views, data flows, replication flows, task chains, and performance optimization concepts such as data tiering optimization. Experience in integration of Datasphere with various SAP and non-SAP backend systems/sources of data and good knowledge of different data acquisition techniques. Full life cycle implementation experience in SAP Datasphere. Hands-on experience in data extraction using standard or custom extraction. Strong experience in writing SQL. Functional knowledge of various modules such as SD, MM, FI. Connectivity of SAP Datasphere with reporting tools such as SAC and Power BI.

Mandatory skill sets: Experience in data integration using SAP Datasphere connectors (SAP S/4HANA, BW, etc.). Proficiency in SQL, CDS views and SAP HANA modeling. Good understanding of SAP BTP architecture.

Preferred skill sets: Knowledge of SAP Analytics Cloud for data visualization and reporting. Experience with SAP BW/4HANA, SAP HANA Cloud, or SAP Integration Suite. Understanding of cloud data platforms. SAP certifications in Datasphere or HANA are a plus.

Years of experience required: 4 to 7 years.
Education qualification: B.Tech / M.Tech (Computer Science, Mechanical, Mathematics & Scientific Computing, etc.)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering.
Required Skills: SAP Datasphere. Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Communication, Complex Data Analysis, Creativity, Data Analysis Software, Data Collection, Data-Driven Consulting, Data Integration, Data Mining, Data Modeling, Data Preprocessing, Data Quality, Data Quality Improvement Plans (DQIP), Data Security, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism {+ 10 more}.
Posted 2 weeks ago
4.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Design, develop, and maintain scalable big data solutions. Implement data processing pipelines using PySpark and Hadoop. Develop and optimize SQL queries for data extraction and analysis. Manage and maintain HDFS for efficient data storage and retrieval.

Responsibilities: Utilize Hive for data warehousing and querying. Collaborate with data scientists and analysts to understand data requirements and deliver solutions. Ensure data quality and integrity throughout the data lifecycle. Monitor and troubleshoot data workflows to ensure optimal performance.

Mandatory skill sets: Proficiency in Python: strong programming skills in Python for data manipulation and analysis. SQL expertise: advanced knowledge of SQL for querying and managing databases. PySpark: experience with PySpark for big data processing. Hadoop: hands-on experience with Hadoop ecosystem components. Hive basics: understanding of Hive for data warehousing and querying. HDFS: proficiency in HDFS for data storage and management.

Preferred skill sets: Hands-on experience with SAP BW for around 4-7 years. Problem-solving skills: ability to solve complex data problems and optimize workflows. Collaboration: strong communication and teamwork skills to work effectively with cross-functional teams.

Years of experience required: 4-7 years of experience on SAP BW.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology.
Required Skills: Extract Transform Load (ETL), Structured Query Language (SQL) Development. Other listed skills: Amazon Web Services (AWS), Microsoft Azure.
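For illustration only (not part of the posting above): a minimal PySpark sketch of the kind of pipeline this role describes, reading raw files from HDFS, applying a simple transformation, and writing the result to a Hive table. The paths, database, table, and column names are hypothetical.

```python
# Minimal PySpark sketch: HDFS -> transform -> Hive (paths and names are hypothetical)
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders_daily_load")
    .enableHiveSupport()          # needed to write managed Hive tables
    .getOrCreate()
)

# Read raw CSV files landed on HDFS
orders = spark.read.csv("hdfs:///data/raw/orders/2024-06-01/", header=True, inferSchema=True)

# Basic cleansing and aggregation
daily_revenue = (
    orders
    .dropDuplicates(["order_id"])
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Persist to a Hive table for downstream querying
daily_revenue.write.mode("overwrite").saveAsTable("analytics.daily_revenue")
```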
Posted 2 weeks ago
7.0 - 10.0 years
13 - 17 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI.

Summary: In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.

Why PwC: At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

BW on HANA/BW/4HANA implementation in key SAP modules: requirements analysis, conception, and implementation/development of solutions as per requirements. Create HLDs and then convert them to LLDs. Data extraction from SAP and non-SAP systems, data modelling and report delivery. Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW/4HANA solutions. Understand the integration and consumption of BW data models/reports with other tools.

Responsibilities: Hands-on experience in SAP BW/4HANA or SAP BW on HANA and a strong understanding of the usage of objects such as Composite Providers, Open ODS views, Advanced DSOs, transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization. Experience in integration of BW with various SAP and non-SAP backend systems/sources of data and good knowledge of different data acquisition techniques in BW/4HANA. Knowledge of available SAP BW/4HANA tools and their usage, such as the BW/4HANA Web Cockpit. Full life cycle implementation experience in SAP BW/4HANA or SAP BW on HANA. Hands-on experience in data extraction using standard or generic data sources. Good knowledge of data source enhancement.

Mandatory skill sets: Strong experience in writing ABAP/AMDP code for exits and transformations. Strong understanding of CKFs, RKFs, formulas, selections, variables and other components used for reporting. Understanding of LSA/LSA++ architecture and its development standards. Good understanding of BW/4 application and database security concepts. Functional knowledge of various modules such as SD, MM, FI.

Preferred skill sets: Around 7-10 years of hands-on experience with SAP BW.
Years of experience required: 6-10 years of experience on SAP BW.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Degrees/Field of Study required: Bachelor of Technology, Master of Engineering.
Required Skills: SAP Business Warehouse. Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Coaching and Feedback, Communication, Complex Data Analysis, Creativity, Data Analysis Software, Data Collection, Data-Driven Consulting, Data Integration, Data Mining, Data Modeling, Data Preprocessing, Data Quality, Data Quality Improvement Plans (DQIP), Data Security, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility {+ 15 more}.
Posted 2 weeks ago
4.0 - 7.0 years
10 - 15 Lacs
Pune
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

Responsibilities and job accountabilities: Hands-on experience with Azure data components such as ADF, Databricks and Azure SQL. Good programming logic sense in SQL. Good PySpark knowledge for Azure Databricks. Understanding of data lake and data warehouse concepts. Understanding of unit and integration testing. Good communication skills to express thoughts and interact with business users. Understanding of data security and data compliance. Understanding of the Agile model. Understanding of project documentation. Certification (good to have). Domain knowledge.

Mandatory skill sets: Azure DE, ADB, ADF, ADL. Preferred skill sets: Azure DE, ADB, ADF, ADL.
Years of experience required: 4 to 7 years.
Education qualification: Graduate Engineer or Management Graduate.
Degrees/Field of Study required: Bachelor of Engineering, Bachelor in Business Administration.
Required Skills: Microsoft Azure. Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}.
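For illustration only (not part of the posting above): a minimal PySpark sketch of the kind of Azure Databricks work described here, reading Parquet files from ADLS Gen2 and writing a curated Delta table. The storage account, container, table, and column names are hypothetical, and an ADF pipeline would typically orchestrate such a notebook.

```python
# Illustrative PySpark-on-Databricks sketch (storage account, container, and table names are hypothetical)
from pyspark.sql import SparkSession, functions as F

# On Databricks the `spark` session already exists; getOrCreate() simply reuses it
spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/2024/"  # ADLS Gen2 path

sales = spark.read.parquet(raw_path)

cleaned = (
    sales
    .withColumn("sale_date", F.to_date("sale_date"))
    .filter(F.col("amount") > 0)
    .dropDuplicates(["sale_id"])
)

# Delta is the usual table format on Databricks for curated layers
cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.sales_cleaned")
```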
Posted 2 weeks ago
7.0 - 10.0 years
13 - 18 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI.

Summary: In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

BW on HANA/BW/4HANA implementation in key SAP modules: requirements analysis, conception, and implementation/development of solutions as per requirements. Create HLDs and then convert them to LLDs. Data extraction from SAP and non-SAP systems, data modelling and report delivery. Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW/4HANA solutions. Understand the integration and consumption of BW data models/reports with other tools.

Responsibilities: Hands-on experience in SAP BW/4HANA or SAP BW on HANA and a strong understanding of the usage of objects such as Composite Providers, Open ODS views, Advanced DSOs, transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization. Experience in integration of BW with various SAP and non-SAP backend systems/sources of data and good knowledge of different data acquisition techniques in BW/4HANA. Knowledge of available SAP BW/4HANA tools and their usage, such as the BW/4HANA Web Cockpit. Full life cycle implementation experience in SAP BW/4HANA or SAP BW on HANA. Hands-on experience in data extraction using standard or generic data sources. Good knowledge of data source enhancement.

Mandatory skill sets: Strong experience in writing ABAP/AMDP code for exits and transformations. Strong understanding of CKFs, RKFs, formulas, selections, variables and other components used for reporting. Understanding of LSA/LSA++ architecture and its development standards. Good understanding of BW/4 application and database security concepts. Functional knowledge of various modules such as SD, MM, FI.

Preferred skill sets: Around 7-10 years of hands-on experience with SAP BW.
Years of experience required: 6-10 years of experience on SAP BW.
Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.)
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering.
Required Skills: SAP Business Warehouse. Other listed skills: Accepting Feedback, Active Listening, Analytical Thinking, Coaching and Feedback, Communication, Complex Data Analysis, Creativity, Data Analysis Software, Data Collection, Data-Driven Consulting, Data Integration, Data Mining, Data Modeling, Data Preprocessing, Data Quality, Data Quality Improvement Plans (DQIP), Data Security, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility {+ 15 more}.
Posted 2 weeks ago
2.0 - 4.0 years
9 - 13 Lacs
Noida
Work from Office
Specialism: Data, Analytics & AI. Management Level: Associate.

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your well-being, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

Responsibilities and job accountabilities: Hands-on experience with GCP data components (BigQuery, Data Fusion, Cloud SQL, etc.). Understanding of data lake and data warehouse concepts. Manage the DevOps lifecycle of the project (code repository, build, release). End-to-end BI landscape knowledge is good to have. Participate in unit and integration testing. Interaction with business users for requirement understanding and UAT. Understanding of data security and data compliance. Agile understanding. Project documentation understanding. Good SQL knowledge. Certification (good to have). Domain knowledge of different industry sectors.

Mandatory skill sets: GCP. Preferred skill sets: GCP.
Years of experience required: 2 to 4 years.
Education qualification: Graduate Engineer or Management Graduate.
Degrees/Field of Study required: Bachelor in Business Administration, Bachelor of Engineering.
Required Skills: Snowflake Schema. Other listed skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}.
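For illustration only (not part of the posting above): a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client, the kind of SQL-on-GCP work this role mentions. The project, dataset, and table names are hypothetical.

```python
# Illustrative BigQuery sketch (project, dataset, and table names are hypothetical)
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # relies on application default credentials

query = """
    SELECT order_date, region, SUM(amount) AS revenue
    FROM `my-gcp-project.sales_dw.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date, region
    ORDER BY order_date
"""

# Run the query and print the aggregated rows
for row in client.query(query).result():
    print(row.order_date, row.region, row.revenue)
```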
Posted 2 weeks ago
5.0 - 10.0 years
4 Lacs
Coimbatore
Work from Office
Name of the position: Data Engineer. Location: Remote. No. of resources needed: 01. Mode: Contract. Years of experience: 5+ years. Shift: UK shift.

Job Summary: Seeking an experienced Data Engineer with expertise in SQL, Python, and Excel to join our team. The ideal candidate will have a strong background in data cleansing and transformation, with a proven track record of delivering high-quality data solutions.

Key Responsibilities: Design, develop, and maintain large-scale data systems and architectures. Develop and implement data pipelines to extract, transform, and load data from various sources. Collaborate with cross-functional teams to identify data requirements and develop data-driven solutions. Develop and maintain data quality and data governance processes. Ensure data security and compliance with regulatory requirements. Stay up-to-date with emerging trends and technologies in data engineering.

Requirements: 5+ years of experience in data engineering with a strong focus on data cleansing and transformation. Expert-level proficiency in SQL, Python, and Excel. Experience with data warehousing, ETL, and data governance. Strong understanding of data security and compliance regulations.
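For illustration only (not part of the posting above): a small pandas sketch of the routine data cleansing and transformation work this posting emphasises. The file and column names are hypothetical; the source could equally be a SQL query via pd.read_sql.

```python
# Illustrative pandas cleansing sketch (file and column names are hypothetical)
import pandas as pd

df = pd.read_excel("customers_raw.xlsx")

df = (
    df
    .drop_duplicates(subset=["customer_id"])      # remove duplicate records
    .assign(
        email=lambda d: d["email"].str.strip().str.lower(),
        signup_date=lambda d: pd.to_datetime(d["signup_date"], errors="coerce"),
    )
    .dropna(subset=["customer_id", "email"])      # drop rows missing key fields
)

df.to_csv("customers_clean.csv", index=False)     # hand off to the load step
```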
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Gurugram
Work from Office
About Gartner Digital Markets: Gartner Digital Markets is a business unit within Gartner. Our mission is to help small businesses make the right technology choices and find the tools they need to grow, optimize, and become more effective at what they do. The business comprises three top brands: Capterra, Software Advice, and GetApp. For candidates interested in taking their next career step in the technology space, Gartner Digital Markets offers the fast pace and excitement of working for a startup, the stability and resources of a large, established organization, and the opportunity to be on the front lines of innovation in an industry that is always growing and transforming.

About the role: Gartner Digital Markets is looking for a passionate Analytics Engineer for the Data team. Analytics Engineers sit at the intersection of business teams, Data Analytics and Data Engineering and are responsible for bringing robust, efficient, and integrated data models and products to life. Analytics Engineers speak the language of business teams and technical teams, able to translate data insights and analysis needs into models powered by the Enterprise Data Platform. The successful Analytics Engineer can blend business acumen with technical expertise and transition between business strategy and data development.

What you will do: Collaborate with business and engineers to collect project requirements, define successful analytics outcomes, and design data models. Understand business processes and objectives and translate them into operational data management processes and models. Design, develop, and maintain dbt code and Snowflake tasks to build an Enterprise Dimensional Model. Design efficient solutions to consolidate data from RDBMS systems, enterprise applications and third-party APIs through ELT processes into an Enterprise Data Model. Organize, optimize and debug data-driven reporting, BI and analytics applications. Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale data warehouse and data lake environment; maintain and advocate for these standards through code review. Utilize the Data Platform to build data products and provide feedback to the Data Platform team to build new features. Design and maintain conceptual and logical data models (e.g., Kimball, Inmon, Data Vault) and supporting ERDs. Develop, optimize, and document data transformation processes using SQL and leading tools (e.g., Snowflake, dbt). Apply best practices in code versioning, CI/CD, and workflow automation within data engineering processes.

What you will need: 2-4 years of experience in data engineering, analytics, or a related field. Advanced SQL proficiency and practical experience in data modelling and transformation. Familiarity with modern data transformation tools (dbt preferred) and relational databases (e.g., Snowflake, MS SQL Server, PostgreSQL). Demonstrated ability to troubleshoot, optimize, and resolve data quality and performance issues. Strong verbal and written communication skills in English.

What you will get: Competitive salary, generous paid time off policy and more! India: Group Medical Insurance, Parental Leave, Employee Assistance Program (EAP). Collaborative, team-oriented culture that embraces diversity. Professional development and unlimited growth opportunities.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we? At Gartner, Inc.
(NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. What makes Gartner a great place to work? What do we offer? Ready to grow your career with Gartner? Join us. The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity.
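For illustration only (not part of the Analytics Engineer posting above): a minimal sketch of running a dimensional-model transformation in Snowflake from Python with the snowflake-connector-python library. The account, credentials, and object names are hypothetical, and in practice dbt would usually own SQL of this kind, as the posting notes.

```python
# Illustrative Snowflake sketch (account, credentials, and object names are hypothetical)
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.stg_customers AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, CURRENT_TIMESTAMP())
"""

conn = snowflake.connector.connect(
    account="xy12345",          # placeholder account locator
    user="ANALYTICS_SVC",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
try:
    conn.cursor().execute(MERGE_SQL)   # upsert the customer dimension
finally:
    conn.close()
```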
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Mumbai
Work from Office
Job Description

The Position: We are looking for an experienced Senior Business Data Analyst & Power BI Developer to join our Analytics team. As Senior Business & Data Analyst, you will be a key player in transforming complex business challenges into tangible, data-driven solutions. You will drive analytical process automation initiatives, uncover valuable business insights through data analysis, and adopt analytical solutions to address critical business needs. As a Senior Analyst, you will also mentor and guide other team members, fostering a culture of collaboration and excellence.

Responsibilities & Skills: Business acumen: ability to understand complex business challenges, identify opportunities for improvement, and translate them into actionable data-driven solutions. Drive process improvement initiatives by identifying bottlenecks, analyzing data to determine root causes, and designing innovative solutions that enhance efficiency and effectiveness. Design end-to-end BI reporting solutions, including dataflows, semantic models, dashboards, and reports. Collaborate with stakeholders to define requirements and align reporting solutions with business goals. Exceptional communication skills, enabling clear articulation of technical concepts to business stakeholders and effective collaboration with cross-functional teams. Capable of translating business requirements into technical specifications for data analytical solutions, with experience in data integration, data cleansing, and data mining techniques. Develop and implement analytical solutions to streamline reporting, analysis, and decision-making processes. Develop insightful reports and dashboards, ensuring data accuracy, security, and effective data governance practices. Proficient in SQL queries for data extraction, analysis, and manipulation, with a solid understanding of data warehousing principles. Experienced in designing and implementing efficient data structures to support reporting and analysis needs, ensuring data quality and integrity. Identifying opportunities to automate financial processes and reporting through AI technologies.

Required Education, Experience, and Skills: A degree in Computer Science, Engineering, Information Systems, Business Management, Economics or a related field is required; advanced degrees are a plus. 3-5 years of relevant experience in business and data analysis, including experience in Power BI and data modeling. Background in data warehouse platforms such as Snowflake, combined with strong BI methodologies. Developing visual reports, KPI scorecards, and dashboards using Power BI Desktop. Connecting data sources, importing data, and transforming data for business intelligence. Experience with data visualization tools and techniques (e.g., Power BI, Tableau). Strong Excel skills. Experience in developing and analyzing solutions using SQL scripts. Understanding of enterprise-level data integration and strategies for performance optimization. Capability to translate complex business questions into actionable data solutions. Proficiency in solving technical challenges within diverse, dynamic datasets. Python knowledge (nice to have); libraries: pandas, NumPy, SciPy, Seaborn, Matplotlib, statsmodels, scikit-learn.

Who We Are: Organon delivers ingenious health solutions that enable people to live their best lives. We are a $6.5 billion global healthcare company focused on making a world of difference for women, their families and the communities they care for.
We have an important portfolio and are growing it by investing in the unmet needs of Women's Health, expanding access to leading biosimilars and touching lives with a diverse and trusted portfolio of health solutions. Our vision is clear: a better and healthier every day for every woman. As an equal opportunity employer, we welcome applications from candidates with a diverse background. We are committed to creating an inclusive environment for all our applicants.

Search Firm Representatives, Please Read Carefully: Organon LLC does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Please Note: Pay ranges are specific to the local market and therefore vary from country to country. Employee Status: Regular. Relocation: No relocation. Travel Requirements: Organon employees must be able to satisfy all applicable travel and credentialing requirements, including associated vaccination prerequisites. Number of Openings: 1
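For illustration only (not part of the posting above): a small pandas sketch of the SQL-plus-Python analysis work the Senior Business Data Analyst role describes, shaping a finance KPI table of the kind a Power BI report would consume. The file name and column names are hypothetical; the source could equally be a SQL extract loaded via pd.read_sql.

```python
# Illustrative KPI-preparation sketch for a BI report (file and column names are hypothetical)
import pandas as pd

sales = pd.read_csv("net_sales_by_month.csv", parse_dates=["invoice_month"])

monthly = (
    sales.groupby(["region", "invoice_month"], as_index=False)["net_sales"].sum()
         .sort_values(["region", "invoice_month"])
)

# Month-over-month growth per region, a typical KPI surfaced on a dashboard
monthly["mom_growth"] = monthly.groupby("region")["net_sales"].pct_change()

monthly.to_csv("kpi_net_sales_mom.csv", index=False)  # Power BI can load this as a dataset
print(monthly.tail())
```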
Posted 2 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Senior Data Engineer

Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You'll work in a collaborative Agile environment using the latest engineering best practices with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk. You will be primarily developing Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day: Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models. Serve as Data Model subject matter expert and data model spokesperson, demonstrated by the ability to address questions quickly and accurately. Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL and dbt. Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran and dbt. Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing. Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains. Build and promote best engineering practices in areas of version control systems, CI/CD, code review, and pair programming. Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery. Work with data and analytics experts to strive for greater functionality in our data systems.

What you bring to the role (Basic Qualifications): 5+ years of data engineering experience building, working on, and maintaining data pipelines and ETL processes on big data environments. 5+ years of experience in data modeling and data architecture in a production environment. 5+ years in writing complex SQL queries. 5+ years of experience with cloud columnar databases (we use Snowflake). 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions. Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams. Intermediate experience with any of the programming languages Python, Go, Java, or Scala (we primarily use Python). Integration with third-party API SaaS applications like Salesforce, Zuora, etc. Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Preferred Qualifications: Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning. Good knowledge of modern as well as classic data modeling (Kimball, Inmon, etc.). Demonstrated experience in one or many business domains (Finance, Sales, Marketing). 3+ completed production-grade projects with dbt. Expert knowledge of Python.

What our data stack looks like: ELT (Snowflake, Fivetran, dbt, Airflow, Kafka, Hightouch). BI (Tableau, Looker). Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions).

Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law.
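For illustration only (not part of the posting above): a minimal Airflow DAG sketch of the dbt-plus-Airflow orchestration pattern this stack describes, running dbt models and then dbt tests once a day. The project path, schedule, and target name are hypothetical, and the exact DAG parameters vary slightly across Airflow versions.

```python
# Illustrative Airflow DAG: run dbt models, then dbt tests (paths and target are hypothetical)
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # newer Airflow versions use `schedule=`
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    dbt_run >> dbt_test   # build models first, then run data-quality tests
```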
Posted 2 weeks ago
2.0 - 5.0 years
10 - 11 Lacs
Chennai
Work from Office
At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.

ZoomInfo is looking for a Data Analyst II to join our Data Operations and Analysis team. This position supports our broader Data team and works closely with Product, Engineering, and Research. Key aspects of this role include supporting strategic data infrastructure decisions, discovering and driving improvements in our data processing, and providing timely and thorough analysis. You will have the opportunity to influence the future success of ZoomInfo's data assets.

What you'll do: Develop a deep and comprehensive understanding of the data and infrastructure we work with. Summarize and document different aspects of the current state, probing for opportunities, and charting the path forward with solutions to improve the accuracy and volume of the data we serve to our customers. Collaborate with our Data Engineering, Product Management, and Research counterparts to drive forward the improvement opportunities you've identified.

Inquisitive: You are curious about ZoomInfo's product, market and data operations, and eager to develop a thorough understanding of our data and infrastructure. You pursue technical training and development opportunities, and strive to continuously build knowledge and skills. A problem solver: You have strong problem solving and troubleshooting skills with the ability to exercise good judgment in ambiguous situations. A team player: You are willing to tackle new challenges and enjoy facilitating and owning cross-functional collaboration. Within the team, you seek opportunities to mentor and coach junior analysts. Self-directed: You enjoy working independently. There will be guidance and resources when you need it, but we're looking for someone who will thrive with the freedom to explore our data, propose solutions, and drive execution and results.

What you'll bring: 2-5 years of experience in analytics, quantitative research, and/or data operations. Knowledge of databases, ETL development and the challenges posed by data quality. Ability to summarize complex analyses in a simple, intuitive format, and to present findings in a clear and concise manner to both technical and non-technical stakeholders. Strong understanding of AI technologies, including machine learning, natural language processing, and data analytics. Must be detail oriented with strong organizational and analytical skills. Strong initiative and ability to manage multiple projects simultaneously. Exposure to and technical understanding of working with data at scale. Experience in a product- or project-driven work environment preferred. Advanced-level SQL, Python, BigQuery, and Excel skills handling large-scale complex datasets. Experience building and deploying AI/ML solutions. Familiarity with data visualization tools, e.g. Tableau / Looker Studio. Experience with Databricks, AWS, or Airflow is preferred but not mandatory. #LI-PR #LI-Hybrid

About us: ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller.
Posted 2 weeks ago
4.0 - 6.0 years
4 - 8 Lacs
Noida
Work from Office
Position: Data Engineer (AWS QuickSight, Glue, PySpark) (Noida) (CE46SF RM 3386)

Education Required: Bachelor's / Master's / PhD: Bachelor's or Master's in Computer Science, Statistics, Mathematics, Data Science, or Engineering. AWS certification (e.g., AWS Certified Data Analytics Specialty, AWS Certified Developer).

Must-have skills: Proficiency in AWS cloud services: AWS Glue, QuickSight, S3, Lambda, Athena, Redshift, EMR, and related technologies. Strong experience with PySpark. Expertise in SQL and data modeling for relational and non-relational databases. Familiarity with business intelligence and visualization tools, especially Amazon QuickSight.

Good to have: Proficiency in Python and ML libraries (scikit-learn, TensorFlow, PyTorch). Understanding of MLOps and model deployment best practices. Hands-on experience with AWS services for ML. Experience or familiarity with the HVAC domain is a plus.

Key Responsibilities: Design, develop, and maintain data pipelines using AWS Glue, PySpark, and related AWS services to extract, transform, and load (ETL) data from diverse sources. Build and optimize data warehouse/data lake infrastructure on AWS, ensuring efficient data storage, processing, and retrieval. Develop and manage ETL processes to source data from various systems, including databases, APIs, and file storage, and create unified data models for analytics and reporting. Implement and maintain business intelligence dashboards using Amazon QuickSight, enabling stakeholders to derive actionable insights. Collaborate with cross-functional teams (business analysts, data scientists, product managers) to understand requirements and deliver scalable data solutions. Ensure data quality, integrity, and security throughout the data lifecycle, implementing best practices for governance and compliance. Support self-service analytics by empowering internal users to access and analyze data through QuickSight and other reporting tools. Troubleshoot and resolve data pipeline issues, optimizing performance and reliability as needed.

Required Skills: Proficiency in AWS cloud services: AWS Glue, QuickSight, S3, Lambda, Athena, Redshift, EMR, and related technologies. Strong experience with PySpark for large-scale data processing and transformation. Expertise in SQL and data modeling for relational and non-relational databases. Experience building and optimizing ETL pipelines and data integration workflows. Familiarity with business intelligence and visualization tools, especially Amazon QuickSight. Knowledge of data governance, security, and compliance best practices. Strong programming skills in Python; experience with automation and scripting. Ability to work collaboratively in agile environments and manage multiple priorities effectively. Excellent problem-solving and communication skills.

Job Category: Digital_Cloud_Web Technologies. Job Type: Full Time. Job Location: Noida. Experience: 4-6 years. Notice period: 0-15 days.
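For illustration only (not part of the posting above): a minimal AWS Glue (PySpark) job sketch of the Glue-to-S3 pipeline pattern this role describes, reading a table from the Glue Data Catalog, aggregating it, and writing Parquet to S3 for Athena/QuickSight. The database, table, and bucket names are hypothetical.

```python
# Illustrative AWS Glue PySpark job (database, table, and bucket names are hypothetical)
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog
dyf = glue_context.create_dynamic_frame.from_catalog(database="sales_db", table_name="raw_orders")
df = dyf.toDF()

# Simple transformation before handing the data to Athena/QuickSight
daily = (
    df.filter(F.col("status") == "COMPLETED")
      .groupBy("order_date")
      .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").parquet("s3://example-analytics-bucket/curated/daily_revenue/")
job.commit()
```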
Posted 2 weeks ago
4.0 - 9.0 years
8 - 12 Lacs
Bengaluru
Work from Office
The purpose of this role is to maintain, improve, clean and manipulate data in the business's operational and analytics databases. The Data Engineer works with the business's software engineers, data analytics teams, data scientists and data warehouse engineers in order to understand and aid in the implementation of database requirements, analyse performance, and troubleshoot any existing issues.

Job Description:

Responsibilities: BI Solution Development: Design, develop, and maintain BI solutions, including reports, dashboards, and data visualizations, using tools such as Power BI, Tableau, or similar technologies. Data Infrastructure Development: Design, develop, and maintain scalable data pipelines and infrastructure using tools and technologies such as Apache Spark, Hadoop, and SQL. Requirements Analysis: Collaborate with business stakeholders to gather and document BI requirements, ensuring solutions meet business needs. ETL Processes: Create and manage Extract, Transform, Load (ETL) processes to integrate data from various sources into the data warehouse. Data Integration: Extract, transform, and load (ETL) data from various sources into BI systems, ensuring data accuracy and integrity. Data Modeling: Develop and maintain data models to support BI solutions, optimizing for performance and usability. Data Quality: Implement data quality and validation processes to ensure accuracy and reliability of data. Performance Optimization: Optimize BI solutions for performance, scalability, and user experience through efficient data modeling and query design. Documentation: Create and maintain comprehensive documentation for BI solutions, including data models, ETL processes, and user guides. User Training and Support: Provide training and support to end users on BI tools and best practices, ensuring effective use of BI solutions. Continuous Improvement: Identify and implement opportunities for improving BI processes and capabilities, staying current with industry trends and technologies.

Qualifications and Skills: Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field. Proven experience (4-9 years) in a similar role with a focus on business intelligence development. Proficiency in BI tools such as Power BI, Tableau, QlikView, or similar technologies. Proficiency in data engineering tools and technologies such as Apache Spark, Hadoop, Kafka, SQL, and NoSQL databases. Strong understanding of data warehousing, data modeling, and ETL processes. Excellent analytical and problem-solving skills with the ability to interpret complex data sets. Strong communication and interpersonal skills, with the ability to work effectively with both technical and non-technical stakeholders. Certification in BI tools or relevant technology certifications, including data engineering tools. Experience with cloud-based BI solutions, such as Microsoft Azure or AWS. Knowledge of advanced analytics, data visualization techniques and programming languages such as Python, Java, or Scala. Experience with cloud-based data solutions, such as AWS, Azure, or Google Cloud Platform. Familiarity with data governance and compliance standards.

Location: Bangalore. Brand: Bcoe. Time Type: Full time. Contract Type: Permanent.
Posted 2 weeks ago
3.0 - 6.0 years
16 - 20 Lacs
Pune
Work from Office
Master Data Analyst III - Finance

Job Description: You're not the person who will settle for just any role. Neither are we. Because we're out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference. At Kimberly-Clark, you'll be part of the best teams committed to driving innovation and growth. It starts with YOU.

About Us: Huggies. Kleenex. Cottonelle. Scott. Kotex. Poise. Depend. Kimberly-Clark Professional. You already know our legendary brands and so does the rest of the world. In fact, millions of people use Kimberly-Clark products every day. We know these amazing Kimberly-Clark products wouldn't exist without talented professionals, like you. At Kimberly-Clark, you'll be part of the best team committed to driving innovation, growth and impact. We're founded on 150 years of market leadership, and we're always looking for new and better ways to perform - so there's your open door of opportunity. It's all here for you at Kimberly-Clark; you just need to log on! Led by Purpose. Driven by You.

Main/Primary Responsibilities: Provide support to the business managing master data effectively to ensure proper controls, high master data quality and efficient process performance: Perform the creation and maintenance of master data records in a timely manner and in accordance with procedures, quality standards and rules. Administer master data workflow tools, processes and the execution of mass updates. Ensure high quality and full validation of master data according to data governance standards and rules. Undertake regular data cleansing activities to raise the quality of each record to target levels. Support controls and regular checks to ensure compliance with internal control, standards and rules. Maintain VMS and identify requirements to effectively track KPIs. Keep proper maintenance of SOPs. Provide the first line of support to the business in investigating and solving master data issues of medium to high complexity.

Drive value creation supporting the business: Positively influence the business by leading training for internal customers, CI projects or leveraging data, analytics and actionable insights to deliver quantifiable results. Generate consistency, efficiency, and productivity improvements by leveraging process improvement, standardization and automation to generate white space. Propose and lead projects through the proactive identification of process gaps and interpretation of business rules and policies. Investigate the root cause behind a business process failure or reoccurring data errors, which may be due to data entry errors, the current process not being followed, a problem with the current process, or a system issue. Actively support test case execution for new systems and tools, ensuring that business processes are not disrupted by changes.

Leadership: Good communication skills. Excellent interpersonal and collaboration skills. Problem-solving skills. Analytical and critical thinking skills. Results oriented and customer focused. Superior attention to detail. Project management skills. Consistently demonstrate the KC Values (We Care, We Own, We Act) and Our Ways of Working (Focus on Consumers, Play to Win, Move Fast and Grow Our People).

Functional/Business Skills: Expert in Finance Master Data principles, quality, practices and their relationship with the business.
Good understanding of financial and accounting concepts and processes and experience in related activities (month-end closing, costing, reporting). Knowledge of external and internal controls for Vendor and Customer Master Data and adherence to SOX control compliance. Advanced SAP Finance - FI modules user. CI/LEAN experience. Advanced Microsoft Excel proficiency . Knowledge of Power BI and Power App. Knowledge of WinShuttle , Macro and/or SAP scripting will be an advantage. Required Qualifications and Experiences: Bachelor's degree in Business Administration or Engineering or related field. 2 -6 years of Finance Master Data management experience. B2 or C1 English level. Primary Location India - Pune Additional Locations Worker Type Employee Worker Sub-Type Regular Time Type Full time
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about Target in India At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
About the Role The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data, encompassing building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and effective collaboration across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analysis of relevant data, identifying trends and correlations, and forming hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will have to work as part of a bigger team, but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor as well as requirements for documentation, code versioning, etc.
Key Responsibilities Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance, and ensure data integrity and accuracy throughout the data lifecycle (see the sketch after this listing). Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence. Automation and Process Improvement: Identify and implement automation opportunities and Data Ops best practices to enhance the efficiency, reliability, and scalability of data processes. Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions. Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance. Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics. Core responsibilities are described within this job description. Job duties may change at any time due to business needs.
About You B.Tech / B.E. or equivalent (completed) degree. 5+ years of relevant work experience. Experience in Marketing/Customer/Loyalty/Retail analytics is preferable. Exposure to A/B testing. Familiarity with big data technologies, data languages and visualization tools. Exposure to languages such as Python and R for data analysis and modelling. Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, SQL, or BigQuery. Solid foundational knowledge of mathematics, statistics, and predictive modelling techniques, including Linear Regression, Logistic Regression, time-series models, and classification techniques. Ability to simplify complex technical and analytical methodologies for easier comprehension by broad audiences. Ability to identify process and tool improvements and implement change. Excellent written and verbal English communication skills for global working. Motivation to initiate, build and maintain global partnerships. Ability to function in group and/or individual settings. Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives. Useful Links: Life at Target - https://india.target.com/ | Benefits - https://india.target.com/life-at-target/workplace/benefits | Culture - https://india.target.com/life-at-target/belonging
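As an illustration of the pipeline-monitoring duties above, a daily health check might look like the sketch below; the SQLAlchemy dialect (e.g., PyHive for Presto), connection string, table names and thresholds are hypothetical and would depend on the actual warehouse.

# Illustrative pipeline health check: row counts and null rates by load date.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("presto://warehouse.example.com:8080/hive/analytics")  # hypothetical DSN

checks = pd.read_sql(
    """
    SELECT load_date,
           COUNT(*) AS row_count,
           SUM(CASE WHEN sales_amount IS NULL THEN 1 ELSE 0 END) AS null_amounts
    FROM analytics.fact_sales
    WHERE load_date >= CURRENT_DATE - INTERVAL '7' DAY
    GROUP BY load_date
    ORDER BY load_date
    """,
    engine,
)

# Flag days with unusually low volumes or high null rates for incident follow-up
alerts = checks[(checks.row_count < 1000) | (checks.null_amounts / checks.row_count > 0.01)]
print(alerts)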
Posted 2 weeks ago
4.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Working at Target means helping all families discover the joy of everyday life. We bring that vision to life through our values and culture. Learn more about Target here. As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Working at Target means the opportunity to help all families discover the joy of everyday life. Caring for our communities is woven into who we are, and we invest in the places we collectively live, work and play. We prioritize relationships, fuel and develop talent by creating growth opportunities, and succeed as one Target team. At our core, our purpose is ingrained in who we are, what we value, and how we work. It's how we care, grow, and win together.
About the Team Roundel, Target's retail media network, provides leading digital advertising solutions that connect brands with millions of Target guests in a premium and brand-safe environment. With deep insights from Target's customer base, Roundel helps advertisers drive engagement and measurable sales both online and in-store. As a Sr Product Ops Analyst within Roundel, you'll play a critical role in improving product performance and driving business outcomes. The role involves conducting exploratory data analysis to identify opportunities, building and maintaining key reports and dashboards, and working closely with Product Owners to detect and resolve product issues. You will also collaborate with Data Engineering and Data Science teams to implement new processes, activate predictive models, and enhance existing workflows. This position requires a strong test-and-learn mindset and the ability to navigate a fast-paced, evolving retail media landscape. Your efforts will directly support Roundel's broader objectives by uncovering insights, addressing data quality issues, and ensuring products deliver consistent, high-impact results.
Principal Duties & Responsibilities Build reports using existing data to monitor key product metrics. Perform data analysis and research to identify opportunities for improving product performance. Support the design of product strategies and leverage technology to drive product efficiency. Partner with cross-functional teams such as Data Science, Engineering, Product Operations, and Business teams to execute key product initiatives and support Roundel products. Partner with product managers on product implementation plans and build roadmaps. Gather, review, and analyse data to answer business questions and guide decisions aimed at improving the Roundel product portfolio. Develop and monitor Product Health metrics that Product Managers use to track performance (see the sketch below). Test, monitor, and measure new features in production to ensure they function as expected. Demonstrate comfort with ambiguity and apply a test-and-learn mindset. Analyse trends to uncover potential product issues and opportunities to enhance existing features. Proactively identify opportunities for improving product efficiency. Partner with the global Product Operations team to advocate for and implement best practices, processes, routines, and tools. Reporting/Working Relationships: Reports to Sr. Manager, Prod Operations Analytics.
Job requirements Graduate in any discipline with 4-8 years' experience in math, statistics, or business. Expertise in SQL querying and performing analytical operations on big data. Understanding of and hands-on experience with visualization tools like Power BI, Domo, Looker (or similar). Knowledge of data transformation using Python or R. Passion for using data to explain or solve complex business problems and influence the invention of new systematic and operational processes. Problem-solving skills and the ability to multi-task. Ability to communicate findings clearly and concisely. Proven experience achieving results by leading, partnering, and influencing others. Knowledge of Microsoft Excel, Word, PowerPoint (or similar) applications required.
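To make the Product Health metrics duty concrete, a rollup of the sort a dashboard would consume could be produced as below; the input file and column names are hypothetical.

# Illustrative product-health rollup: aggregate daily ad events into the
# metrics a Product Manager would track, then export for a BI dashboard.
import pandas as pd

events = pd.read_csv("ad_events_daily.csv")  # hypothetical extract

health = (
    events.groupby(["product", "event_date"], as_index=False)
          .agg(impressions=("impressions", "sum"),
               clicks=("clicks", "sum"),
               spend=("spend", "sum"))
)
health["ctr"] = health["clicks"] / health["impressions"]
health["cpc"] = health["spend"] / health["clicks"]

health.to_csv("product_health_metrics.csv", index=False)  # feeds Power BI / Domo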
Posted 2 weeks ago
2.0 - 5.0 years
4 - 8 Lacs
Kochi
Work from Office
Able to write complex SQL queries and experienced with Azure Databricks. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Ability to write complex SQL queries. Experience with Azure Databricks. Preferred technical and professional experience Excellent communication and stakeholder management skills.
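For context on what complex SQL on Azure Databricks typically means in practice, a window-function query run through the notebook-provided Spark session is sketched below; the table and column names are hypothetical.

# Illustrative only: rank customers by monthly revenue using a window function.
# In a Databricks notebook, `spark` is provided by the runtime.
top_customers = spark.sql("""
    WITH monthly AS (
        SELECT customer_id,
               date_trunc('month', order_ts) AS month,
               SUM(amount) AS revenue
        FROM sales.orders
        GROUP BY customer_id, date_trunc('month', order_ts)
    )
    SELECT *
    FROM (
        SELECT m.*,
               RANK() OVER (PARTITION BY month ORDER BY revenue DESC) AS rnk
        FROM monthly m
    ) ranked
    WHERE rnk <= 10
""")
top_customers.show()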
Posted 2 weeks ago
3.0 - 7.0 years
16 - 20 Lacs
Mumbai
Work from Office
Act as a liaison between business stakeholders and technical teams to gather, document, and translate business requirements into data platform capabilities. Define and document data product requirements, KPIs, metrics, and business rules to be enabled via the platform. Define test scenarios and data quality checks to validate reports and dashboards (see the sketch below). Coordinate UAT with business users and sign off deliverables. Collaborate with platform owners on roadmap definition, backlog prioritization, and delivery planning. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Prior experience working on banking or regulated projects. Exposure to data product thinking and domain-driven data modeling. Understanding of data privacy, security, and compliance regulations (e.g., RBI, GDPR). Preferred technical and professional experience Prior experience in the Banking domain or financial services industry. Exposure to Agile methodologies and collaborative development environments. Familiarity with other enterprise data management tools or technologies.
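A data quality check of the kind used for UAT sign-off might be as simple as the reconciliation below; the connection string, tables and reporting date are hypothetical.

# Illustrative UAT reconciliation: compare a report's total against the source.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:***@dwh.example.com/bank")  # hypothetical

as_of = "2024-06-30"  # hypothetical reporting date
source = pd.read_sql(f"SELECT SUM(balance) AS total FROM src.accounts WHERE as_of = '{as_of}'", engine)
report = pd.read_sql(f"SELECT SUM(balance) AS total FROM mart.balance_report WHERE as_of = '{as_of}'", engine)

diff = abs(float(source['total'].iloc[0]) - float(report['total'].iloc[0]))
assert diff < 0.01, f"Reconciliation break of {diff} between source and report"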
Posted 2 weeks ago
4.0 - 8.0 years
12 - 17 Lacs
Gurugram
Work from Office
Not Applicable Specialism Data, Analytics & AI & Summary In SAP data and analytics at PwC, you will specialise in providing consulting services for data and analytics solutions using SAP technologies. You will analyse client requirements, design and implement data management and analytics solutions, and provide training and support for effective utilisation of SAP data and analytics tools. Working in this area, you will work closely with clients to understand their data needs, develop data models, perform data analysis, and create visualisations and reports to support data-driven decision-making, helping them optimise their data management processes, enhance data quality, and derive valuable insights from their data to achieve their strategic objectives.
Why PWC & Summary BW on HANA/BW/4HANA implementation in key SAP modules. Requirements analysis, conception, and implementation/development of the solution as per requirements. Create HLDs and then convert them to LLDs. Data extraction from SAP and non-SAP systems, data modelling and report delivery. Work closely with project leaders, business teams and SAP functional counterparts to architect, design and develop SAP BW/4HANA solutions. Understand the integration and consumption of BW data models/reports with other tools (see the sketch below).
Responsibilities Hands-on experience in SAP BW/4HANA or SAP BW on HANA and a strong understanding of the usage of objects such as CompositeProviders, Open ODS views, advanced DSOs, transformations, exposing BW models as HANA views, mixed scenarios, and performance optimization concepts such as data tiering optimization. Experience in the integration of BW with various SAP and non-SAP backend systems/sources of data and good knowledge of different data acquisition techniques in BW/4HANA. Knowledge of available SAP BW/4HANA tools and their usage, such as the BW/4HANA Web Cockpit. Mandatory skill sets Full life cycle implementation experience in SAP BW/4HANA or SAP BW on HANA. Hands-on experience in data extraction using standard or generic data sources. Good knowledge of data source enhancement. Strong experience in writing ABAP/AMDP code for exits and transformations. Strong understanding of CKF, RKF, formulas, selections, variables and other components used for reporting. Preferred skill sets Understanding of LSA/LSA++ architecture and its development standards. Good understanding of BW/4 application and database security concepts. Functional knowledge of various modules like SD, MM, FI. Education qualification B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing etc.) Education Degrees/Field of Study required Bachelor of Technology, Master of Engineering Degrees/Field of Study preferred Required Skills SAP Business Warehouse, Structured Query Language (SQL), Troubleshooting
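As a hedged illustration of consuming a BW/4HANA model from other tools, the sketch below assumes the SAP HANA Python client (hdbcli) and a hypothetical HANA view exposed from a CompositeProvider; the host, credentials and view name are placeholders, not part of the posting.

# Illustrative only: query a BW-generated HANA view from Python via hdbcli.
from hdbcli import dbapi

conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="REPORTING_USER", password="***")
cur = conn.cursor()
cur.execute("""
    SELECT "0CALMONTH", SUM("NET_VALUE") AS net_value
    FROM "_SYS_BIC"."bw4/COMPOSITE_SALES"
    GROUP BY "0CALMONTH"
    ORDER BY "0CALMONTH"
""")
for calmonth, net_value in cur.fetchall():
    print(calmonth, net_value)
conn.close()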
Posted 2 weeks ago
4.0 - 8.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Data Modeler About Us Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry. MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage.
Requirements: Job Title: Data Modeler Experience: 12+ years Location: Hyderabad Roles & Responsibilities A minimum of 10 years' experience of data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function and/or related front office environment. A proven track record working in a large and global banking environment is desirable. Demonstrable experience in designing data modelling solutions (conceptual, logical and application/messaging) with corresponding phasing, transitions, and migrations where necessary. Good understanding of managing data as a product (asset) across enterprise domains and technology landscapes. Good understanding of architectural domains (business, data, application, and technology). Good communication skills with the ability to influence and present data models (as well as concepts) to technology and business stakeholders. Good collaboration skills with demonstrable experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared and reusable data assets. Experience of working with Agile and Scrum in a large, scalable Agile environment, including participation and progress reporting in daily standups. Experience with data standards, data governance, data strategy and data lineage would be advantageous in this role. Knowledge of reference/master data management. Cloud exposure to solutions implemented in GCP, AWS or Azure would be beneficial, as would exposure to big data solutions. Experience working with leading data modelling tools and producing modelling documentation using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio etc. Knowledge of data modelling standards and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML) or BIAN (see the sketch below). Results oriented, with the ability to produce solutions that deliver organisational benefit.
Understanding of issue and data quality management, prioritisation, business case development, remediation planning and tactical or strategic solution delivery. Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality. Strong analytical and problem-solving skills, with the ability to work unsupervised and take ownership of key deliverables. Exposure to ETL architectures and tools, including data virtualisation and integration with APIs, is desirable. Approaches problems with an open mind and challenges assumptions to ensure appropriately pragmatic, clean designs.
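By way of illustration, translating a small conceptual model into a physical one can be expressed in code as well as in an ERD; the sketch below uses SQLAlchemy with hypothetical Treasury entities (Counterparty, Trade) that are not part of the posting.

# Illustrative logical-to-physical model: two entities and a one-to-many relationship.
from sqlalchemy import Column, Date, ForeignKey, Numeric, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Counterparty(Base):
    __tablename__ = "counterparty"
    counterparty_id = Column(String(20), primary_key=True)   # business key
    legal_name = Column(String(200), nullable=False)
    country_code = Column(String(2), nullable=False)
    trades = relationship("Trade", back_populates="counterparty")

class Trade(Base):
    __tablename__ = "trade"
    trade_id = Column(String(30), primary_key=True)
    counterparty_id = Column(String(20), ForeignKey("counterparty.counterparty_id"), nullable=False)
    trade_date = Column(Date, nullable=False)
    notional = Column(Numeric(18, 2), nullable=False)
    counterparty = relationship("Counterparty", back_populates="trades")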
Posted 2 weeks ago
8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Key Responsibilities: Data Engineering and Integration: Design, develop, and optimize scalable data pipelines and ETL processes using Palantir Foundry for data integration and transformation, and create data models to support analytics and business use cases. Build and manage APIs and microservices using Python and Java to integrate data across systems and manage data processing and application logic. Data Modeling and Architecture: Design and implement robust, scalable, and efficient data models in Palantir Foundry. Collaborate with data architects to define data governance, data lineage, and data quality standards. Develop and maintain reusable pipelines and templates for data transformation and enrichment. Azure Cloud Expertise: Utilize Azure Synapse, Azure Data Lake, and Azure Storage for data storage and processing. Implement secure and efficient data workflows on Azure, ensuring compliance with organizational and regulatory policies. Monitor and troubleshoot Azure data pipelines for performance optimization. Programming and Automation: Write clean, maintainable, and efficient code in Python and Java for data processing, automation, and integration tasks. Develop scripts to automate repetitive tasks and improve overall system efficiency. Hands-on experience with Snowflake, including writing procedures/functions, Snowpipe, data pipelines, and data transformations (see the sketch below). Collaboration and Stakeholder Engagement: Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights. Provide technical support and training on Palantir Foundry to internal teams. Participate in Agile development processes and collaborate with DevOps teams to ensure seamless deployment. Performance Monitoring and Optimization: Monitor data pipelines and applications for reliability, scalability, and performance. Implement best practices for error handling, logging, and alerting to ensure system stability.
Required Skills and Qualifications: Technical Expertise: Proficiency in Palantir Foundry for data integration, modeling, and pipeline development. Strong programming skills in Python and Java. Hands-on experience with Azure data engineering tools (e.g., Azure Data Factory, Databricks, Synapse, Data Lake, etc.). Solid understanding of data structures, algorithms, and software engineering principles. Data Engineering Skills: Experience in building and optimizing ETL/ELT pipelines for large-scale data processing. Proficiency in SQL for data querying and transformation. Familiarity with data governance, data lineage, and data security practices. Cloud Expertise: Strong knowledge of Azure cloud services and infrastructure for data engineering. Experience with CI/CD pipelines, containerization (e.g., Docker), and orchestration tools (e.g., Kubernetes). Problem Solving and Collaboration: Excellent problem-solving skills and the ability to troubleshoot complex issues in data systems. Strong communication skills to collaborate effectively with technical and non-technical stakeholders. Preferred Qualifications: Certification in Palantir or Microsoft Azure data engineering, or related cloud certifications. Experience working in Agile/Scrum environments. Job ID R-74482 Date posted 07/11/2025
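As a sketch of the Snowflake hands-on work mentioned above (and only a sketch: the account, warehouse and table names are hypothetical), a post-Snowpipe transformation step might be run through the snowflake-connector-python client.

# Illustrative Snowflake transformation: merge staged updates into a dimension.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ETL_SVC", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CORE",
)
cur = conn.cursor()
cur.execute("""
    MERGE INTO core.dim_customer d
    USING staging.customer_updates s
      ON d.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET d.email = s.email, d.updated_at = s.loaded_at
    WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
                          VALUES (s.customer_id, s.email, s.loaded_at)
""")
conn.close()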
Posted 2 weeks ago
3.0 - 6.0 years
6 - 10 Lacs
Noida
Work from Office
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a MicroStrategy Sr. Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). MicroStrategy Sr. Developer Dell Services seeks a MicroStrategy Architect to design, deliver and manage BI/reporting solutions. This is a client-facing role, and the candidate will have regular interactions with various client managers.
Job responsibilities include: Participate in requirement gathering sessions and guide the requirements by providing insights into the EDW. Architect, design, develop and deliver BI solutions using MicroStrategy, Java, Oracle and Teradata. Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects. Mentor junior developers regarding best practices and the technology stacks used to build the application. Lead a small team of developers distributed across different geographic locations and manage the project delivery. Conduct end-user demos and trainings. Work in collaborative environments that follow agile project management methodologies like XP and Scrum. Work closely with BI and EDW system administrators for code migrations and production support. Debug data quality issues by analyzing the upstream sources and provide guidance on resolutions. Work closely with DBAs to fix performance bottlenecks (see the sketch below).
Qualifications Required Skills: 7+ years of experience in a hands-on delivery capacity with clearly defined responsibilities including, but not limited to, solution development, delivery, implementation, support, resource coordination and technical leadership. 5+ years of experience in BI, with 3 or more years in application development using MicroStrategy. End-to-end implementation of MicroStrategy for at least one customer. Working experience with all the components of the MicroStrategy suite. Working experience in Teradata and Oracle, including performance tuning and debugging performance bottlenecks. Strong SQL knowledge and the ability to write highly complex queries. Conceptual understanding of logical and physical data models is a must, and working experience is a plus. Ability to work in a 24x7 setting as either an on-call or escalation contact. Excellent written and verbal communication skills with the ability to interact with senior business and technical IT management. Strong analytical and problem-solving abilities. Credentials: BA/BS in computer science or related field. MicroStrategy training and certifications are beneficial. #LI-INPAS
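For illustration of the performance-tuning aspect of the role, the optimizer plan for a slow report query can be pulled from Teradata before involving the DBAs; the sketch assumes the teradatasql driver, and the host, credentials, table and query are hypothetical.

# Illustrative only: fetch the EXPLAIN plan for a candidate bottleneck query.
import teradatasql

query = """
    SELECT region, SUM(sales_amt)
    FROM edw.fact_sales
    GROUP BY region
"""

with teradatasql.connect(host="edw.example.com", user="mstr_svc", password="***") as con:
    cur = con.cursor()
    cur.execute("EXPLAIN " + query)
    for row in cur.fetchall():
        print(row[0])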
Posted 2 weeks ago