4.0 - 9.0 years
15 - 25 Lacs
Pune
Work from Office
Role & responsibilities:

Design & Configuration:
- Develop and configure MDM and Data Quality tools such as Syndigo, Snowflake, and Alation.
- Perform data quality functions including audits, assessments, entity resolution, data profiling, scorecard development, and exception-management configuration.
- Configure workflows for Product and Customer data in Syndigo and Winshuttle.
- Support the development of Master Data and IT system architecture roadmaps to improve e-commerce strategy, supply chain efficiency, and customer experience.
- Collaborate in an agile environment to design and build data solutions, ensuring thorough end-to-end testing.
- Implement data orchestration pipelines, data sourcing, cleansing, and quality control processes.
- Contribute to software verification plans, quality assurance procedures, and IT standards, procedures, and processes.
- Document and maintain data pipeline architecture and software functionality.

Training & Support:
- Develop and maintain technical design documentation.
- Create and maintain relevant project documentation throughout the SDLC.
- Create clear and concise training documentation.
- Lead end-to-end delivery of technical solutions.
- Support the installation, maintenance, and upgrade of MDM tools.
- Create test cases and support user testing throughout relevant test cycles.

Preferred candidate profile:
- Bachelor's degree in computer science, systems analysis, or a related field, or equivalent experience.
- Mastery-level knowledge of the job area, obtained through advanced education, experience, or both.
- Minimum of 4 years of experience in Data Management and Data Quality solutions, of which a minimum of 3 years of experience in the Syndigo application is required.
- Data Quality related certifications (e.g., GS1, CIMP, IQCP, ICP, CDMP, and CMMI Enterprise Data Management) are a plus.
- Conceptual understanding of SAP and PLM systems, with a strong understanding of key business processes involving Customer and Product master data.
- In-depth knowledge of SQL, Python, and Snowflake.
- Knowledge of cloud-based solutions and Alation/data governance tools (preferable).
- Strong analytical and problem-solving skills.
- Knowledge of Agile/Waterfall methodologies.
- Ability to communicate effectively, orally and in writing, with IT and business stakeholders.
- Propensity to learn innovative technologies and approaches, and to use this knowledge to enhance Smith & Nephew's strategy, standard practices, and processes.
- Ability to prioritize tasks and adapt to frequent changes in priorities.
- Ability to work in a distributed team setting and a fast-paced environment.

Perks and benefits:
- Major medical coverage (with policy exclusions and an insurance non-medical limit).
- Educational assistance.
- Flexible Personal/Vacation Time Off, Privilege Leave, Floater Leave.
- Parents/Parents-in-Law Insurance (employer contribution of 8,000/- annually), Employee Assistance Program, Parental Leave.
- Hybrid work model.
- Hands-on, team-customized mentorship.
- Extra perks: free cab transport facility for all employees; one-time meal provided to all employees as per shift; night shift allowance for shift-based roles.
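Data profiling and scorecard development, as listed in the responsibilities above, often reduce to computing field-level completeness and duplicate-key metrics. A minimal, tool-agnostic sketch; the record layout and field names here are hypothetical, not Syndigo's actual data model:

```python
# Illustrative data-profiling sketch: per-field completeness and a duplicate
# count over a small product record set -- the kind of metrics that feed a
# data quality scorecard. Fields and values are made up for the example.
from collections import Counter

records = [
    {"sku": "A100", "name": "Widget", "gtin": "0001"},
    {"sku": "A101", "name": "Widget", "gtin": None},   # missing GTIN
    {"sku": "A100", "name": "Widget", "gtin": "0001"}, # duplicate sku
]

def profile(rows, key_field):
    fields = rows[0].keys()
    completeness = {
        f: sum(r[f] is not None for r in rows) / len(rows) for f in fields
    }
    key_counts = Counter(r[key_field] for r in rows)
    duplicates = sum(c - 1 for c in key_counts.values() if c > 1)
    return {"completeness": completeness, "duplicates": duplicates}

scorecard = profile(records, key_field="sku")
print(scorecard["completeness"]["gtin"])  # 2 of 3 rows populated
print(scorecard["duplicates"])            # 1 duplicate sku
```

In a real MDM deployment these numbers would be computed by the tool's profiling engine and tracked over time rather than recomputed ad hoc.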
Posted 2 weeks ago
3.0 - 6.0 years
18 - 22 Lacs
Mumbai
Work from Office
Overview
The MSCI ESG Data Collection team acquires ESG data at scale, performs QA of the collected data, and is responsible for establishing and maintaining the highest level of data quality and standards across all datasets feeding our ESG products. As a Data Collection Transformation Senior Associate, you will be responsible for leading and delivering several initiatives within the ESG transformation agenda, supporting the rapidly evolving ESG landscape and its adoption in the financial market.

Responsibilities
As a member of the MSCI Data Acquisition and Collection team, you are expected to:
- Have a strong interest in Environment, Social, Governance, Climate, and policy frameworks around these domains, as well as regulatory trends.
- Take an active part in projects dealing with "electronification" of ESG & Climate frameworks and principles into data definitions that can be operationalized for collection.
- Collaborate with Research teams on building data collection templates, and with technology teams to translate these into implementable data models.
- Do hands-on research with new data sets by studying company disclosures, helping connect research proposals with implementable, scalable solutions.
- Independently run analysis on data sets (either collected or from third parties) to detect trends and patterns (EDA), and propose ways to build anomaly detection on new and existing content.
- Analyze and research historical data corrections across all ESG & Climate data, and propose and implement contextual/thematic QA to detect cases that may not be captured in the current QA framework.
- "Codify" data definitions with an intent to build NLP-driven data extraction models (leveraging traditional approaches or LLMs) to automate detection and extraction of "facts" from company disclosures.
- Help design and set up new data collection processes, and help integrate these processes with ongoing data operations.
- Deliver top-quality data aligned with MSCI methodology, service level agreements, and regulatory requirements.
- Steer improvements to methodology and SOP documents, leveraging data and content expertise.
- Drive process improvements to ensure consistent data quality and efficiency, such as automating data quality diagnostics by developing a new system or tool that enables quality assessment of data without manual intervention.
- Work with internal stakeholders and downstream teams to understand data requirements, data QC scope, and data delivery.
- Create reports and dashboards that provide quantitative data assessment metrics to justify recommendations: visualization, outlier detection and analysis, data summaries, etc.
- Share plans, recommendations, and summaries with management through conference calls, meetings, and presentations with internal and external teams, Research, and Product.

Qualifications
- Analytical skills and strong attention to detail; keen interest in analyzing data and process flows, with a quality focus.
- Exposure to tools such as Python and SQL; demonstrated experience in improving processes through automation with Python, ML, or RPA.
- Exposure to visualization tools such as Power BI is preferable.
- Strong hands-on skills with advanced Excel features.
- Self-starter and self-motivated; solutions-focused, with the ability to work in unstructured environments.
- Comfortable working in a team environment across hierarchies, functions, and geographies.
- Experience working in the financial, technology, or business analysis domain.
- Knowledge of equities or financial markets in general.
- Exposure to ESG data would be an added advantage.

Desired Experience
7+ years of full-time professional experience, including:
- Data quality and automation related roles.
- Business analysis: analyzing existing processes and reengineering them to achieve efficiency and improved quality.
- Exposure to tools such as Pandas, SQL, and Power BI is preferable.
- Financial services experience; exposure to ESG is good to have.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.
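The EDA and anomaly-detection responsibility described above can be illustrated with a simple robust outlier screen on a collected data series. This is a generic sketch, not MSCI's actual QA method; the readings and the 3.5 cutoff are illustrative:

```python
# Median/MAD-based outlier screen: more robust than a z-score on small
# samples, where a large outlier inflates the standard deviation and can
# mask itself. Threshold 3.5 on the modified z-score is a common heuristic.
from statistics import median

def mad_outliers(xs, threshold=3.5):
    med = median(xs)
    mad = median(abs(x - med) for x in xs)  # median absolute deviation
    # 0.6745 rescales MAD to be comparable to a standard deviation
    return [x for x in xs if 0.6745 * abs(x - med) / mad > threshold]

readings = [10.2, 9.8, 10.1, 9.9, 10.0, 25.0, 10.3]
print(mad_outliers(readings))  # [25.0]
```

In practice such a screen would only flag candidates for analyst review, since a reported ESG value can be unusual yet correct.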
MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer committed to diversifying its workforce. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 2 weeks ago
7.0 - 12.0 years
25 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description:
- Strong change and project management skills.
- Stakeholder management, communications, and reporting.
- Data management, data governance, and data quality management domain knowledge.
- Subject matter expertise in more than one of the following areas: Data Management, Data Governance, Data Quality Measurement and Reporting, Data Quality Issues Management.
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ Governance objectives, and provide consultative support to facilitate progress.
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, in order to effectively advise stakeholders in managing their respective domains.
- Proficiency in MI reporting and visualization is strongly preferred.
- Proficiency in change and project management is strongly preferred.
- Ability to prepare programme update materials and present them to senior stakeholders, with prompt responses to any issues or escalations.
- Strong communications and stakeholder management skills: able to work effectively and maintain strong working relationships as an integral part of a larger team.
- 8+ years of relevant experience preferred.
Posted 2 weeks ago
3.0 - 5.0 years
20 - 25 Lacs
Pune
Work from Office
We are currently seeking an experienced professional to join our team in the role of Senior Business Analyst.

In this role, you will:
- Apply domain expertise within the relevant financial services area, including screening.
- Work with delivery teams to translate end-user requirements into application/system requirements.
- Manage overall delivery of artifacts as per the schedule plan.
- Coordinate with Business and IT teams to understand requirements and manage deliverable items overall.
- Review functional/technical documents prepared by Business/Technical teams and ensure version maintenance of the same.

Requirements
To be successful in this role, you should meet the following requirements:
- Capturing, analysing, and documenting functional requirements.
- Ability to convert functional requirements into technical specifications.
- Business process modelling: investigating operational requirements, issues, and opportunities to seek effective solutions, including automation of processes.
- Preparation of test scenarios and ability to provide effective solutions.
- Visualizing the source data by running SQL queries to identify data quality issues.
- Providing support and solutions for data analysis, which involves verifying details in the database using SQL.
- Knowledge of Jira and Confluence.
- Knowledge of DevOps and Agile methodology.
- Ability to work in different time zones.
- Good presentation and communication skills.

Good to have: understanding of Unix and ETL tools.
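Running SQL against source data to surface data quality issues, as the requirements above describe, typically means null-rate and duplicate-key probes. A sketch using sqlite3 as a stand-in for the real source database; the table and columns are hypothetical:

```python
# Two common source-data quality probes: per-column null counts and
# duplicate primary-key detection. sqlite3 stands in for the actual
# source system; the customers table is invented for the example.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
con.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@x.com", "IN"), (2, None, "IN"), (3, "c@x.com", None), (2, "b@x.com", "UK")],
)

# Null-rate check: how many rows are missing each field
nulls = con.execute(
    "SELECT SUM(email IS NULL), SUM(country IS NULL) FROM customers"
).fetchone()

# Duplicate-key check: ids appearing more than once
dupes = con.execute(
    "SELECT id, COUNT(*) FROM customers GROUP BY id HAVING COUNT(*) > 1"
).fetchall()

print(nulls)  # (1, 1)
print(dupes)  # [(2, 2)]
```

The same queries translate almost verbatim to any SQL source; only the connection setup changes.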
Posted 2 weeks ago
1.0 - 5.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Join us as a Data Engineer, PySpark

This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative data-driven solutions, using insight to be commercially successful, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support the bank's strategic direction while building your network across the bank.

What you'll do
As a Data Engineer, you'll play a key role in driving value for our customers by building data solutions. You'll be carrying out data engineering tasks to build, maintain, test, and optimise a scalable data architecture, as well as carrying out data extractions, transforming data to make it usable to data analysts and scientists, and loading data into data platforms. You'll also be:
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Practicing DevOps adoption in the delivery of data engineering, proactively performing root cause analysis and resolving issues
- Collaborating closely with core technology and architecture teams in the bank to build data knowledge and data solutions
- Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions
- Sourcing new data using the most appropriate tooling and integrating it into the overall solution to deliver for our customers

The skills you'll need
To be successful in this role, you'll need a good understanding of data usage and dependencies with wider teams and the end customer, as well as experience of extracting value and features from large-scale data. You'll need at least five years of experience in PySpark, Python, SQL, CI/CD, GitLab, and AWS.
You'll also demonstrate:
- Experience of ETL technical design, including data quality testing, cleansing, and monitoring, and data warehousing and data modelling capabilities
- Experience of using programming languages alongside knowledge of data and software engineering fundamentals
- Good knowledge of modern code development practices
- Strong communication skills with the ability to proactively engage with a wide range of stakeholders

Hours: 45
Job Posting Closing Date: 11/07/2025
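The extract-transform-load work described above follows a common pattern: read a raw feed, cleanse and cast it, and load the result. A framework-agnostic Python sketch; in PySpark the same steps map roughly to spark.read, filter/withColumn, and DataFrame.write, and the feed and fields here are invented for the example:

```python
# Minimal ETL sketch: extract a raw CSV feed, cleanse (drop rows with a
# missing amount) and cast types, then "load" into an in-memory target.
import csv
import io

raw = "id,amount\n1,100\n2,\n3,250\n"  # extract: raw source feed

def transform(reader):
    # cleanse: skip rows with a missing amount, cast remaining fields
    for row in reader:
        if row["amount"]:
            yield {"id": int(row["id"]), "amount": float(row["amount"])}

loaded = list(transform(csv.DictReader(io.StringIO(raw))))  # load
print(loaded)  # [{'id': 1, 'amount': 100.0}, {'id': 3, 'amount': 250.0}]
```

At scale the generator would be replaced by distributed transformations, but the cleanse-and-cast logic is the part data quality testing targets either way.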
Posted 2 weeks ago
1.0 - 6.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Join us in the International Seller Services Central Analytics team and tackle data warehousing and reporting challenges! We are a dedicated group of BIEs, DEs, DSs, ASs, and BAs within the Seller Services organization at Amazon. Our mission is to deliver data-driven analytical solutions that empower sellers to thrive on Amazon's global platform. We specialize in constructing and maintaining intricate standard data pipelines that are used by thousands of users across the globe to measure, derive, and analyze the performance of sellers selling on Amazon.

We are currently in search of a brilliant, self-driven, and seasoned Data Engineer I to join our team. In this role, you will have the opportunity to work on a major data warehouse migration project. This project includes building extensive data models and complex ETL pipelines, and collaboration with global business and technical leaders. As a Data Engineer at Amazon, you will work collaboratively with senior DEs, BIEs, product managers, and other stakeholders to build data solutions that are stable and performant. Specifically, you will transport and process data between databases and files, understand the business logic, design suitable data models, and drive resolution of data quality issues with source teams.

About the team
The International Seller Services Central Analytics (ISS-CA) team is a highly specialized, cross-functional group tasked with driving critical business insights and breakthroughs. Composed of experts from diverse backgrounds, including scientists, economists, Business Intelligence Engineers, Data Engineers, and Business Analysts, this team is at the forefront of leveraging massive-scale data, advanced analytics, and machine learning techniques.
- Bachelor's degree
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Knowledge of AWS infrastructure
- Experience with database, data warehouse, or data lake solutions
- Experience with industrial database migration
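Data modeling for a warehouse, as required above, often means splitting source records into dimension and fact tables keyed by surrogate ids. A toy sketch; the seller/units schema is hypothetical, not an actual Amazon data model:

```python
# Build a dimension lookup and fact rows from flat source events.
# Each distinct seller gets a surrogate key; facts reference that key
# instead of repeating the seller name.
events = [
    {"seller": "Acme", "units": 3},
    {"seller": "Bolt", "units": 5},
    {"seller": "Acme", "units": 2},
]

dim_seller, facts = {}, []
for e in events:
    # setdefault assigns the next surrogate key on first sight of a seller
    key = dim_seller.setdefault(e["seller"], len(dim_seller) + 1)
    facts.append({"seller_key": key, "units": e["units"]})

print(dim_seller)  # {'Acme': 1, 'Bolt': 2}
print(facts[0])    # {'seller_key': 1, 'units': 3}
```

In a real warehouse the dimension table would also carry attributes and history handling (e.g. slowly changing dimensions); the sketch shows only the key assignment.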
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
SAP MDG
- Experience in SAP MDG EhP6 and MDG 7.0/8.0 (preferably 9.0), with 10+ years of experience.
- Extensive ECC and/or S/4 HANA experience; worked on at least 2 MDG projects.
- Expertise in implementation of the SAP MDG solution for masters such as Customer, Vendor, and Material.
- Expertise in Data Model Enhancement, Data Transfer (DIF/DEF), Data Replication Framework (DRF), and Business Rules Framework plus (BRFplus).
- Experience in configuring rule-based workflow and in integrating business process requirements with the technical implementation of SAP Master Data Governance.
- Experience in user interface modelling (design and creation of UI, value restriction, defining navigation elements of type hyperlink or push button, data quality, validation and derivation rules).
- Experience in process modelling (entity, business activity change, request type, workflow, edition type, relationship, data replication techniques, SOA service, ALE connection, key & value mapping, data transfer, export & import master data, convert master data).
- Expert knowledge in activation and configuration of the MDG modules and components.
- SAP ERP logistics knowledge (SAP modules SD or MM), especially master data, is required.
Posted 2 weeks ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Req ID: 330198

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title: Snowflake Engineer

Join our dynamic data team as a Snowflake Engineer, where you'll design and implement scalable cloud data solutions tailored for the financial services sector. You'll work closely with cross-functional teams to optimize data pipelines, ensure data integrity, and support analytics initiatives that drive business insights.

Key Responsibilities:

Overall:
- Design, develop, and maintain Snowflake-based data architectures.
- Build robust ETL/ELT pipelines for financial data processing.
- Collaborate with data analysts, engineers, and business stakeholders.
- Ensure data security, compliance, and performance optimization.

Data Engineering & Development:
- Build and optimize ELT/ETL pipelines using SQL, Python, or third-party tools.
- Develop reusable components, frameworks, and automation scripts to streamline data operations.
- Implement data quality checks, validation routines, and monitoring mechanisms.

Performance Tuning & Optimization:
- Monitor and tune Snowflake workloads for performance, cost-efficiency, and scalability.
- Analyze query performance and optimize SQL scripts, warehouse sizing, and caching strategies.
- Implement best practices for partitioning, clustering, and materialized views.

Security & Governance:
- Configure and manage Snowflake roles, access controls, and data masking policies.
- Ensure compliance with data governance standards, including lineage, auditing, and retention policies.
- Collaborate with security teams to implement encryption, secure data sharing, and regulatory compliance.

Qualifications:
- 5+ years of hands-on experience with Snowflake is required.
- Strong SQL and data modeling skills are required.
- Experience in financial services is preferred.
- Familiarity with Python, dbt, or similar tools is a plus.
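The "data quality checks, validation routines, and monitoring mechanisms" responsibility above is often implemented as a small check runner that executes SQL probes and reports failures. A sketch with sqlite3 standing in for a Snowflake connection; the trades table and the checks themselves are hypothetical:

```python
# Generic data-quality check runner: each check is a (name, SQL probe,
# expected result) triple; any probe whose result differs from the
# expectation is reported as a failure. sqlite3 is a stand-in here --
# against Snowflake only the connection object would change.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, notional REAL)")
con.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 5e6), (2, -10.0)])

checks = [
    ("no_null_notional", "SELECT COUNT(*) FROM trades WHERE notional IS NULL", 0),
    ("no_negative_notional", "SELECT COUNT(*) FROM trades WHERE notional < 0", 0),
]

failures = [
    name for name, sql, expected in checks
    if con.execute(sql).fetchone()[0] != expected
]
print(failures)  # ['no_negative_notional']
```

Keeping checks as data rather than code makes it easy to add new validation routines without touching the runner, and the failure list feeds naturally into alerting.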
Posted 2 weeks ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Summary

About Guidewire
Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record, with 1600+ successful projects supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation.

Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace, and believe that a diversity of perspectives, abilities, and cultures is a key to our success. Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.

Job Description

Responsibilities:
- Design and Development: Design and develop robust, scalable, and efficient data pipelines. Design and manage platform solutions to support data engineering needs, ensuring seamless integration and performance. Write clean, efficient, and maintainable code.
- Data Management and Optimization: Ensure data quality, integrity, and security across all data pipelines. Optimize data processing workflows for performance and cost-efficiency. Develop and maintain comprehensive documentation for data pipelines and related processes.
- Innovation and Continuous Improvement: Stay current with emerging technologies and industry trends in big data and cloud computing. Propose and implement innovative solutions to improve data processing and analytics capabilities. Continuously evaluate and improve existing data infrastructure and processes.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in software engineering with a focus on data engineering and building data platforms.
- Strong programming experience using Python or Java.
- Proven experience with big data technologies such as Apache Spark, Amazon EMR, Apache Iceberg, Amazon Redshift, or similar.
- Proven experience with RDBMS (Postgres, MySQL, etc.) and NoSQL (MongoDB, DynamoDB, etc.) databases.
- Proficient in AWS cloud services (e.g., Lambda, S3, Athena, Glue) or comparable cloud technologies.
- In-depth understanding of SDLC best practices, including Agile methodologies, code reviews, and CI/CD.
- Experience working with event-driven and serverless architecture.
- Experience with platform solutions and containerization technologies (e.g., Docker, Kubernetes).
- Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
- Strong communication skills, both written and verbal.

Why Join Us:
- Opportunity to work with cutting-edge technologies and innovative projects.
- Collaborative and inclusive work environment.
- Competitive salary and benefits package.
- Professional development opportunities and career growth.
Posted 2 weeks ago
10.0 - 12.0 years
20 - 25 Lacs
Hyderabad
Work from Office
HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Assistant Vice President.

Principal responsibilities
The role holder will support Data Finance in maintaining and governing data set up for the Data Quality/Remediation process within Finance. Primary responsibilities will include:
- Develop a strong working understanding of Data Quality processes, Issue Management, and the broader Data Management pillars; apply this knowledge to real-world problem-solving, not just theoretical frameworks.
- Conduct hands-on business-data analysis on both MVP-level use cases and complex scenarios, depending on priority and deliverable timelines.
- Support (where required) Governance and Data leads in driving working group outcomes, hosting data forums, and managing meeting logistics, including documentation and ownership of follow-up actions.
- Take ownership of stakeholder relationships, using strong interpersonal and communication skills to manage expectations, educate, resolve conflicts, and deliver results across cross-functional teams.
- Demonstrate strong self-management by proactively prioritizing tasks, driving deliverables independently, and maintaining accountability.
- Experience in MI reporting work, with entry-level ability to code in SQL, Python, and/or Alteryx.
- Communicating with stakeholders across functions in diverse locations and establishing working relationships.
- Absorbing concepts, defining the approach, and developing processes with limited handholding, under pressure to deliver within fixed timelines in an environment of ambiguity.
Requirements
- Postgraduate/Graduate with 10-12 years of experience within banking, including some data experience.
- Strong analytical and problem-solving skills.
- Excellent stakeholder engagement and management skills.
- Ability to navigate within the organization.
- Experience in analyzing and interpreting large volumes of data and information.
- Flexibility to work in accordance with business requirements; this may include working outside of normal hours.
- Experience working under pressure on multiple process improvement projects.
- Understanding of HSBC Group structures, values, processes, and objectives.
- Experience with project/collaboration tools such as JIRA, SharePoint, and Confluence.
- Proficient in MS Excel and PowerPoint.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
We are seeking an experienced Data Platform Reliability Engineer to lead our efforts in designing, implementing, and maintaining highly reliable data infrastructure. The ideal candidate will bring extensive expertise in building enterprise-grade data platforms with a focus on reliability engineering, governance, and SLA/SLO design. This role will be instrumental in developing advanced monitoring solutions, including LLM-powered systems, to ensure the integrity and availability of our critical data assets.

Platform Architecture and Design
Design and architect scalable, fault-tolerant data platforms leveraging modern technologies such as Snowflake, Databricks, and cloud-native services
Establish architectural patterns that ensure high availability and resiliency across data systems
Develop technical roadmaps for platform evolution with reliability as a core principle

Reliability Engineering
Implement comprehensive SLA/SLO frameworks for data services
Design and execute chaos engineering experiments to identify and address potential failure modes
Create automated recovery mechanisms for critical data pipelines and services
Establish incident management processes and runbooks

Monitoring and Observability
Develop advanced monitoring solutions, including LLM-powered anomaly detection
Design comprehensive observability strategies across the data ecosystem
Implement proactive alerting systems to identify issues before they impact users
Create dashboards and visualization tools for reliability metrics

Data Quality and Governance
Establish data quality monitoring processes and tools
Implement data lineage tracking mechanisms
Develop automated validation protocols for data integrity
Collaborate with data governance teams to ensure compliance with policies

Innovation and Improvement
Research and implement AI/ML approaches to improve platform reliability
Lead continuous improvement initiatives for data infrastructure
Mentor team members on reliability engineering best practices
Stay current with emerging technologies and reliability patterns in the data platform space

Qualifications
10+ years of experience in data platform engineering or related fields
Proven expertise with enterprise data platforms (Snowflake, Databricks, etc.)
Strong background in reliability engineering, SRE practices, or similar disciplines
Experience implementing data quality monitoring frameworks
Knowledge of AI/ML applications for system monitoring and reliability
Excellent communication skills and ability to translate technical concepts to diverse stakeholders
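As an illustration of the proactive-alerting responsibility above, a minimal statistical baseline for metric anomaly detection might look like the following sketch (plain Python, stdlib only; the metric name and threshold are illustrative, and an LLM-powered system would layer triage on top of signals like this rather than replace them):

```python
from statistics import mean, stdev

def detect_anomalies(values, threshold=3.0):
    """Flag indices whose z-score exceeds the threshold.

    A simple baseline for pipeline-metric alerting; production
    systems would add seasonality models or LLM-assisted triage.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Hypothetical latency samples (ms) with one obvious spike at index 5.
latencies = [120, 118, 121, 119, 122, 950, 120, 117]
```

With a small sample like this a lower threshold (e.g. 2.0) is needed, since the sample z-score of a single outlier among n points is bounded by roughly (n-1)/sqrt(n).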
Posted 2 weeks ago
6.0 - 11.0 years
20 - 25 Lacs
Chennai
Work from Office
Join us as a Customer Service & Operations Unit Leader

If you have strong team leadership experience, this is a chance to lead, manage and coach a team to deliver outstanding customer service through telemarketing, telesales and retention activities. We'll look to you to lead by example by making customer calls to identify and understand customer needs and help them by delivering solutions that are fit for purpose. You'll be recognised for delivering a first-class outbound and inbound telephony and digital service to our prospective and existing customers.

What you'll do
As a Customer Service & Operations Unit Leader, you'll make sure that customer needs and priorities are identified by the team so that appropriate services are offered during customer calls. You'll lead and manage the team to deliver against new business targets, including sales appointments booked, telephony sales made and new business income, while encouraging them to work together to build trust and long-term sustainable value for our customers and colleagues.

Your other responsibilities will include:
Developing the capability of your team through observations, feedback and coaching
Establishing effective working relationships with key stakeholders within the business and the broader bank to plan and deliver targeted telemarketing campaigns
Developing and maintaining a forward-looking telemarketing campaign planner
Analysing the results of the team to make sure that the data quality used for telemarketing campaigns is optimised
Providing effective planning and management of your team's workload

We're offering this role at associate vice president level.

The skills you'll need
We're looking for someone with strong people management, leadership and coaching skills, with the ability to deliver through people. You'll have experience of managing telephony teams to generate and close sales leads, achieve stretching targets and deliver sales through service, preferably in a business-to-business environment.

You'll also need:
Proficiency in banking payment platforms such as Bankline and Payit, with a strong grasp of back-office operations and fundamental reconciliation processes
Demonstrated ability to manage customer complaints with professionalism, ensuring prompt resolution and precise documentation
Experience of planning and delivering targeted telemarketing campaigns
The ability to lead by example through calls to customers to identify and understand their needs and recommend appropriate solutions
An excellent customer focus and the proven ability to exceed customer expectations
Strong interpersonal and communication skills
The ability to develop effective working relationships with colleagues and stakeholders
Good time management, planning and organisational skills

Hours: 45
Job Posting Closing Date: 18/07/2025
Posted 2 weeks ago
5.0 - 10.0 years
8 - 9 Lacs
Pune
Work from Office
Req ID: 332236

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting Technical Analyst with ETL and GCP experience using PySpark to join our team in Pune, Maharashtra (IN-MH), India (IN).

Key Responsibilities:
Data Pipeline Development: Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing.
ETL Workflow Development: Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services.
GCP Service Utilization: Leveraging GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
Data Transformation: Utilizing PySpark for data manipulation, cleansing, enrichment, and validation.
Performance Optimization: Ensuring the performance and scalability of data processing jobs on GCP.
Collaboration: Working with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
Data Quality and Governance: Implementing and maintaining data quality standards, security measures, and compliance with data governance policies on GCP.
Troubleshooting and Support: Diagnosing and resolving issues related to data pipelines and infrastructure.
Staying Updated: Keeping abreast of the latest GCP services, PySpark features, and best practices in data engineering.

Required Skills:
GCP Expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
PySpark Proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis.
Python Programming: Solid Python programming skills for data manipulation and scripting.
Data Modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts.
SQL: Proficiency in SQL for querying and manipulating data in relational databases.
Big Data Concepts: Understanding of big data principles and distributed computing concepts.
Communication and Collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams.
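The cleansing and validation work described above reduces to row-level transformations like the following. It is sketched in plain Python rather than PySpark so the example is self-contained; the field names and rules are invented for illustration, and in the actual stack this would be a PySpark `filter` plus `withColumn` chain:

```python
def cleanse(records):
    """Drop rows missing required fields and normalise the rest,
    mirroring a typical PySpark cleansing step in plain Python."""
    required = {"id", "amount"}
    out = []
    for row in records:
        # Rows failing validation would be routed to a quarantine
        # sink in a real pipeline; here we simply skip them.
        if not required <= row.keys() or row["amount"] is None:
            continue
        out.append({**row,
                    "amount": round(float(row["amount"]), 2),
                    "currency": row.get("currency", "INR").upper()})
    return out

raw = [
    {"id": 1, "amount": "10.5", "currency": "inr"},
    {"id": 2, "amount": None},      # null amount: rejected
    {"amount": "3.5"},              # missing id: rejected
]
```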
Posted 2 weeks ago
3.0 - 7.0 years
8 - 9 Lacs
Kolkata
Work from Office
Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: Learn more about us.

Responsibilities:
Preparation of the Functional Requirement Specification (FRS) for the proposed Web GIS applications
Identifying integration requirements with other departments and agencies, and finalisation of integration specifications
Preparation of test use cases
Application and functional testing
Application hosting, Web GIS server administration and user management

Mandatory skill sets: API handling, Python, JavaScript, HTML, CSS
Preferred skill sets: API handling, Python, JavaScript, HTML, CSS
Years of experience required: 3-7 yrs
Education qualification: B.Tech/MCA/MBA
Education Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration
Degrees/Field of Study preferred:
Required Skills: API Standards, Hyper Text Markup Language (HTML), JavaScript, Python (Programming Language)
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Travel Requirements:
Government Clearance Required?
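Since the role centres on API handling with Python for Web GIS applications, a small sketch of parsing an API response may be useful. The payload shape below is a standard GeoJSON FeatureCollection, but the property names are illustrative, not tied to any specific department's API:

```python
import json

def extract_features(geojson_text):
    """Pull (name, lon, lat) tuples out of a GeoJSON FeatureCollection,
    the kind of payload a Web GIS API typically returns."""
    doc = json.loads(geojson_text)
    results = []
    for feat in doc.get("features", []):
        props = feat.get("properties", {})
        lon, lat = feat["geometry"]["coordinates"]  # GeoJSON is lon, lat order
        results.append((props.get("name", "unknown"), lon, lat))
    return results

# A hypothetical response body, serialised for the example.
sample = json.dumps({
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "properties": {"name": "Ward 12 Office"},
        "geometry": {"type": "Point", "coordinates": [88.36, 22.57]},
    }],
})
```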
Posted 2 weeks ago
0.0 - 8.0 years
10 - 11 Lacs
Pune
Work from Office
Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation.

A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
Design and implement agentic AI architectures, including planning, memory, tool use, multi-agent collaboration, and feedback loops
Build and integrate with large language models (LLMs), including fine-tuning, prompt engineering, and retrieval-augmented generation (RAG)
Develop agents capable of autonomous task execution, dynamic decision-making, and long-horizon planning
Lead development of tools for self-reflection, memory persistence, and contextual awareness in AI systems
Create or improve pipelines for multimodal generative AI, such as text-to-image, code generation, or synthetic media creation
Work with APIs, open-source tools (LangChain, AutoGen, OpenAI, Hugging Face), and cloud infrastructure to deploy production-grade agents
Collaborate with product, design, and research teams to align capabilities with user needs and ethical AI practices
Stay up to date with the latest research and developments in agentic AI, LLMs, and generative AI

Mandatory skill sets: LangChain, AutoGen
Preferred skill sets: LangGraph, LangChain
Years of experience required: 3-7
Education qualification: B.Tech / M.Tech / MBA / MCA
Education Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Required Skills: Generative AI
Additional skills: Accepting Feedback, Active Listening, AI Implementation, Analytical Thinking, C++ Programming Language, Communication, Complex Data Analysis, Creativity, Data Analysis, Data Infrastructure, Data Integration, Data Modeling, Data Pipeline, Data Quality, Deep Learning, Embracing Change, Emotional Regulation, Empathy, GPU Programming, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Machine Learning {+ 25 more}
No
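The agentic loop named in the responsibilities (planning, tool use, observation, feedback) can be sketched without any framework. In this runnable toy, the `policy` function stands in for the LLM call that LangChain or AutoGen would manage; the tool set and task are hypothetical:

```python
def run_agent(task, tools, policy, max_steps=5):
    """Minimal agentic loop: the policy inspects the task and memory,
    picks a tool (or finishes), and the observation is fed back into
    memory for the next step."""
    memory = []
    for _ in range(max_steps):
        action, arg = policy(task, memory)
        if action == "finish":
            return arg
        observation = tools[action](arg)
        memory.append((action, arg, observation))
    return None  # step budget exhausted

# Hypothetical calculator tool; builtins stripped to keep eval confined.
tools = {"calc": lambda expr: eval(expr, {"__builtins__": {}})}

def policy(task, memory):
    """Hard-coded stand-in for the LLM: compute, then answer."""
    if not memory:
        return ("calc", task)          # step 1: call the calculator tool
    return ("finish", memory[-1][2])   # step 2: answer with the observation
```

The same skeleton generalises: swap `policy` for a model call that emits (action, argument) pairs, and add entries to `tools` for retrieval, code execution, and so on.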
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai
Work from Office
Key Responsibilities:
Design, develop, and maintain robust, scalable, and efficient data pipelines to collect, process, and store structured and unstructured data
Build and optimize data warehouses, data lakes, and ETL/ELT workflows
Integrate data from multiple sources including databases, APIs, and streaming platforms
Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets
Ensure data quality, integrity, and security throughout the data lifecycle
Monitor and troubleshoot data pipeline performance and failures
Implement data governance and compliance policies
Automate data workflows and implement data orchestration tools (e.g., Apache Airflow)
Optimize storage and query performance in cloud and on-premises environments
Keep up to date with emerging data engineering tools, techniques, and best practices
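The core idea behind orchestration tools like Apache Airflow is running tasks in dependency order. A stdlib-only sketch of that idea, with hypothetical extract/transform/load task names (a real Airflow DAG would declare the same dependencies with operators and a scheduler):

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Execute tasks in dependency order. `tasks` maps name -> callable
    receiving the results so far; `deps` maps name -> upstream names."""
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name](results)
    return order, results

# Hypothetical three-stage pipeline.
tasks = {
    "extract":   lambda r: [1, 2, 3],                    # pull source rows
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load":      lambda r: len(r["transform"]),          # rows written
}
deps = {"transform": {"extract"}, "load": {"transform"}}
```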
Posted 2 weeks ago
7.0 - 13.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Key Responsibilities
Leads the development of business requirements for data curation through collaboration with relevant stakeholders within and outside RWDMA
Maintains strong connections with analytical groups and R&D Data Platform teams to ensure seamless data integration and usage
Develops and implements the vision and strategy for the design of a framework to consistently curate (e.g. pre-process, harmonize, wrangle, contextualize and/or anonymize) data in the right manner for the right people to drive value, in alignment with Disease Area Strategies and other key R&D priority areas
Partners strongly with the Disease Area Heads in R&D and their teams to ensure the provisioning of required, high-quality curated datasets to deliver their disease- and/or asset-level data strategy and modelling plans
Enables bi-directional transparency between Business and R&D Tech to ensure alignment of strategies, achieve business objectives/outcomes, and maintain service levels in line with business needs, while prioritizing data privacy and security
Ensures all datasets meet analysis-ready and privacy requirements by performing necessary data curation activities (e.g. pre-process, contextualize and/or anonymize)
Ensures that datasets are processed to meet the conditions in the approved data re-use request (e.g., removing subjects from countries that do not allow re-use)
Writes clean, readable code, and ensures that deliverables are appropriately quality controlled, documented, and, when required, can be handed over to the R&D Tech team for production pipeline implementation
Transforms raw healthcare data into products that catalyze the work of the wider RWDMA and Biostatistics teams and can be leveraged by our diverse group of stakeholders to generate insights
Ensures data quality, integrity, and security across various data sources
Supports data-driven decision-making processes that enhance patient outcomes and operational efficiencies

Education Requirements
Advanced degree (Masters or Ph.D.) in Life Sciences, Epidemiology, Biostatistics, Public Health, Computer Science, Mathematics, Statistics or a related field, with applicable experience

Job Related Experience
Expertise to translate business needs into technical data requirements and processes
Proven track record of leading and managing high-performing data engineering teams
Experience in data engineering and curation, with the majority of experience on real-world data in the healthcare or pharmaceutical industry
Proven ability to handle and process large datasets efficiently, ensuring data privacy
Proficiency in handling structured, semi-structured, and unstructured data while ensuring data privacy
Understanding of data governance principles and practices with a focus on data privacy
Innovative, solution-oriented mindset and willingness to challenge the status quo
Fluent in written and spoken English, able to communicate effectively and articulate complex concepts to diverse audiences
Experience of working in a global matrix environment and managing stakeholders effectively
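One concrete curation step named above, removing subjects from countries whose rules do not allow data re-use, might be sketched as follows (the field names are illustrative, not a real study schema, and in practice the allowed-country list would come from the approved re-use request):

```python
def filter_for_reuse(subjects, blocked_countries):
    """Drop subject records from countries that do not permit
    re-use under the approved data re-use request."""
    blocked = {c.upper() for c in blocked_countries}
    return [s for s in subjects
            if s.get("country", "").upper() not in blocked]

# Hypothetical cohort; country codes deliberately mixed-case.
cohort = [
    {"subject_id": "S1", "country": "DE"},
    {"subject_id": "S2", "country": "IN"},
    {"subject_id": "S3", "country": "de"},
]
```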
Posted 2 weeks ago
13.0 - 16.0 years
32 - 40 Lacs
Bengaluru
Work from Office
Key Responsibilities
Facilitating the integration of diverse data types and sources to provide a comprehensive view of patient health and treatment outcomes
Providing coaching and peer review to ensure that the team's work reflects the industry's best practices for data curation activities, including data privacy and anonymization standards
Ensuring all datasets meet analysis-ready and privacy requirements by performing necessary data curation activities (e.g. pre-process, contextualize and/or anonymize)
Ensuring that datasets are processed to meet the conditions in the approved data re-use request (e.g., removing subjects from countries that do not allow re-use)
Writing clean, readable code, and ensuring that deliverables are appropriately quality controlled, documented, and, when required, can be handed over to the R&D Tech team for production pipeline implementation
Transforming raw healthcare data into products that catalyze the work of the wider RWDMA and Biostatistics teams and can be leveraged by our diverse group of stakeholders to generate insights
Ensuring data quality, integrity, and security across various data sources
Supporting data-driven decision-making processes that enhance patient outcomes and operational efficiencies

Education Requirements
Advanced degree (Masters or Ph.D.) in Life Sciences, Epidemiology, Biostatistics, Public Health, Computer Science, Mathematics, Statistics or a related field, with applicable experience

Job Related Experience
Experience in data engineering and curation, with the majority of experience on real-world data in the healthcare or pharmaceutical industry
Proven ability to handle and process large datasets efficiently, ensuring data privacy
Proficiency in handling structured, semi-structured, and unstructured data while ensuring data privacy
Understanding of data governance principles and practices with a focus on data privacy
Innovative, solution-oriented mindset and willingness to challenge the status quo
Fluent in written and spoken English, able to communicate effectively and articulate complex concepts to diverse audiences
Experience of working in a global matrix environment and managing stakeholders effectively
Experience in complex batch processing, Azure Data Factory, Databricks, Airflow, Delta Lake, PySpark, Pandas and other Python dataframe libraries, including how to apply them to achieve industry standards and data privacy
Proven ability to collaborate with cross-functional teams
Strong communication skills to present curated data
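One piece of the anonymization work described above is replacing direct identifiers with pseudonymous tokens. A minimal sketch with salted hashing follows; note this is pseudonymisation only, one step of a wider anonymisation workflow (k-anonymity checks, date shifting, etc. would come on top), and the field names and salt handling are illustrative:

```python
import hashlib

def pseudonymise(records, salt, id_field="patient_id"):
    """Replace a direct identifier with a salted SHA-256 token.
    The same id + salt always yields the same token, so joins
    across curated datasets remain possible."""
    out = []
    for rec in records:
        digest = hashlib.sha256((salt + str(rec[id_field])).encode())
        out.append({**rec, id_field: digest.hexdigest()[:16]})
    return out

# Hypothetical input rows with medical-record-number identifiers.
rows = [{"patient_id": "MRN-001", "age": 54},
        {"patient_id": "MRN-002", "age": 61}]
```

In practice the salt would be managed as a secret, since anyone holding it can re-derive tokens from known identifiers.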
Posted 2 weeks ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: At PwC, our people in data management focus on organising and maintaining data to enable accuracy and accessibility for effective decision-making. These individuals handle data governance, quality control, and data integration to support business operations. In data governance at PwC, you will focus on establishing and maintaining policies and procedures to optimise the quality, integrity, and security of data. You will be responsible for optimising data management processes and mitigating risks associated with data usage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities (must have):
1. SAP Native HANA modelling experience with good knowledge of the architecture and features of HANA DB and HANA on cloud, SLT, SDI (e.g. hands-on experience with calculated columns, input parameters/variables, performance optimization techniques)
2. SAP Analytics: SAP BW, SAP BW on HANA, SAP Native HANA, ADSOs (Advanced DataStore Objects), Composite Providers (CP), Cubes, Routines, DSOs, InfoObjects, MultiProviders, InfoSets
3. SAP Extractors, ABAP
4. Hands-on experience with SQL queries, performance optimization, and delta/SCD logic; able to handle complex transformation logic
5. Working on SLT
6. Able to independently handle ETL activities, including loading data into HANA from SAP ECC, third-party systems, flat files and other business formats
7. Understanding of data profiling, data quality, Data Integrator and platform transformations
8. Handling SAP BODS, problem definition, and architecture/design detailing of processes

Mandatory skill sets: Native HANA, BW on HANA, SLT, SQL
Certifications (any one): SAP Native HANA; SAP BW on HANA; SAP BI 7.0; SAP BW 3.5
Preferred skill sets (good to have): working knowledge of Python, BODS
Years of experience required: 5-10 years
Education qualification: B.Tech / M.Tech / MCA / MBA
Education Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Required Skills: SAP HANA
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Process Management (BPM), Communication, Corporate Governance, Creativity, Data Access Control, Database Administration, Data Governance Training, Data Processing, Data Processor, Data Quality, Data Quality Assessment, Data Quality Improvement Plans (DQIP), Data Stewardship, Data Stewardship Best Practices, Data Stewardship Frameworks, Data Warehouse Governance, Data Warehousing Optimization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 17 more}
No
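The delta/SCD logic mentioned in the must-have list is normally written in HANA SQL or BODS transforms; the comparison involved can be shown with a plain-Python sketch of SCD type-2 versioning (the key/attribute schema is illustrative, not an SAP structure):

```python
from datetime import date

def scd2_merge(current, incoming, today):
    """SCD type-2 delta: when a tracked attribute changes, close the
    live version (set valid_to) and append a new open version."""
    live_by_key = {r["key"]: r for r in current if r["valid_to"] is None}
    result = list(current)
    for row in incoming:
        live = live_by_key.get(row["key"])
        if live and live["attr"] == row["attr"]:
            continue                      # unchanged: no delta row
        if live:
            live["valid_to"] = today      # close the superseded version
        result.append({"key": row["key"], "attr": row["attr"],
                       "valid_from": today, "valid_to": None})
    return result

# Hypothetical dimension: customer C1 moves city; C2 is brand new.
dim = [{"key": "C1", "attr": "Pune",
        "valid_from": date(2024, 1, 1), "valid_to": None}]
feed = [{"key": "C1", "attr": "Mumbai"}, {"key": "C2", "attr": "Chennai"}]
```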
Posted 2 weeks ago
5.0 - 10.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Req ID: 331216

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AEP Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

About the Role:
We are seeking a highly skilled and experienced Senior Adobe Experience Platform (AEP) Developer to join our growing team. In this role, you will play a critical part in the support, maintenance, design, development, and implementation of innovative customer data solutions within the AEP ecosystem. You will be responsible for building and maintaining robust data pipelines, integrating data from various sources, and developing custom solutions to meet the unique needs of our business.

Key Responsibilities:

AEP Platform Expertise:
Deep understanding of the AEP suite, Experience Data Model (XDM), Data Science Workspace, and other relevant modules.
Proficiency with AEP APIs, Web SDKs, and integrations with other MarTech platforms (Adobe Target, CJA, AJO, Adobe Campaign, etc.).
Experience with AEP data ingestion, transformation, and activation.
Strong understanding of data modeling principles and best practices within the AEP ecosystem.

Data Engineering & Development:
Design, develop, and maintain high-quality data pipelines and integrations using AEP and other relevant technologies.
High-level knowledge and understanding to develop and implement custom solutions within the AEP environment using scripting languages (e.g., JavaScript, Python) and other relevant tools.
Troubleshoot and resolve data quality issues and performance bottlenecks.
Ensure data accuracy, consistency, and security across all stages of the data lifecycle.

Customer Data Solutions:
Collaborate with cross-functional teams (e.g., marketing, product, data science) to understand issues and support fixes.
Support and maintain developed data-driven solutions that improve customer experience, personalize marketing campaigns, and drive business growth.
Analyze data trends and provide insights to inform business decisions.

Project Management & Collaboration:
Contribute to the planning and execution of AEP projects, ensuring timely delivery and adherence to project timelines and budgets.
Effectively communicate technical concepts to both technical and non-technical audiences.
Collaborate with team members and stakeholders to ensure successful project outcomes.

Stay Updated:
Stay abreast of the latest advancements in AEP and related technologies.
Continuously learn and expand your knowledge of data management, data science, and customer experience.

Qualifications:

Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Experience: Overall IT experience of 5+ years, with 3-4 years of hands-on experience with Adobe Experience Platform (AEP).

Technical Skills:
3+ years of strong proficiency in JavaScript or other relevant programming languages.
3 years of experience with RESTful APIs, JSON, and XML.
3+ years of experience with data warehousing, data modeling, and data quality best practices.
3+ years of experience with a tag management system such as Adobe Launch.
2+ years of experience working with Web SDK.
Experience with Adobe Analytics is a plus.
Knowledge of and experience leveraging Python libraries and tools for data cleaning and analysis is a plus.
Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.

Soft Skills:
Excellent analytical and problem-solving skills.
Strong communication, interpersonal, and collaboration skills.
Ability to work independently and as part of a team.
Detail-oriented and results-driven.
Strong organizational and time-management skills.

Bonus Points:
Experience with other Adobe Marketing Cloud solutions (e.g., Adobe Analytics, Adobe Target).
Experience with Agile development methodologies.
Experience with data visualization tools (e.g., Tableau, Power BI).
Experience with data governance and compliance (e.g., GDPR, CCPA).
Understanding of Real-Time Customer Data Platform (RT-CDP).
Posted 2 weeks ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About Toast:
We are a rapidly growing company that's revolutionizing the way the restaurant industry does business by pairing technology with an extraordinary commitment to customer success. We help restaurants streamline operations, increase revenue, and deliver amazing guest experiences through our platform that combines restaurant point of sale, guest-facing technology, and award-winning customer support. Join us as we empower the restaurant community to delight guests, do what they love, and thrive.

Bready* to make a change?
As a Data & BI Analyst, you will be responsible for transforming complex data into clear, actionable insights that drive strategic decisions across the business. You'll build and maintain dashboards, perform ad-hoc analyses, and support teams with data-driven reporting. Using tools like SQL, Python, Hex, and Looker, you'll monitor key performance indicators, identify trends, and help uncover opportunities and risks. Your work will play a critical role in enabling informed decision-making across functions such as risk and operations.

Responsibilities:
Collaborate with business and operations teams to define, track, and maintain KPIs and KRIs that align with company-wide goals.
Analyze large and complex datasets using SQL and Python to surface trends, flag risks, and recommend optimizations.
Build and manage collaborative, self-serve dashboards and data applications using Hex to drive decision-making at all levels.
Translate raw data into clear insights, reports, and visualizations that deliver business value and operational impact.
Conduct root-cause and variance analysis on performance data to highlight key issues and opportunities.
Develop and optimize ETL pipelines and support scalable data workflows using Snowflake and Python-based tooling.
Ensure high data quality and governance through documentation, validation checks, and best practices.
Act as a data translator between technical teams and business stakeholders, fostering a data-driven culture across the organization.

Do you have the right ingredients*? (Qualifications):
Bachelor's degree in a technical or analytical field (e.g., Data Science, Statistics, Computer Science, Engineering, Economics).
2-5 years of experience in a Data Analyst, BI Analyst, or similar analytics role.
Strong proficiency in Python for data analysis, automation, and reporting (e.g., pandas, NumPy, seaborn, matplotlib).
Advanced SQL skills for querying large datasets across multiple sources.
Hands-on experience with cloud data warehouses, particularly Snowflake.
Experience building interactive, real-time dashboards in Hex or similar BI tools.
Familiarity with Git and collaborative analytics workflows (e.g., version-controlled notebooks or analytics repos).
Strong ability to analyze and own data end-to-end: slicing, interpreting, and bringing clarity to business and operational questions.
High attention to detail and a strong sense of data accountability.
Understanding of and experience building credit risk management dashboards.

Diversity, Equity, and Inclusion is Baked into our Recipe for Success:
At Toast, our employees are our secret ingredient; when they thrive, we thrive. The restaurant industry is one of the most diverse, and we embrace that diversity with authenticity, inclusivity, respect, and humility. By embedding these principles into our culture and design, we create equitable opportunities for all and raise the bar in delivering exceptional experiences.

We Thrive Together:
We embrace a hybrid work model that fosters in-person collaboration while valuing individual needs. Our goal is to build a strong culture of connection as we work together to empower the restaurant community. To learn more about how we work globally and regionally, check out: https://careers.toasttab.com/locations-toast

Apply today! Toast is committed to creating an accessible and inclusive hiring process. As part of this commitment, we strive to provide reasonable accommodations for persons with disabilities to enable them to access the hiring process.

For roles in the United States: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.
Posted 2 weeks ago