
3847 Data Quality Jobs - Page 50

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

8 - 12 Lacs

Jharkhand

Work from Office

We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.
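The SQL and reporting skills this role lists boil down to turning raw records into dashboard-ready aggregates. As a hypothetical illustration (the table, columns, and values below are invented, and SQLite stands in for a real warehouse), a regional rollup of the kind that would feed a Metabase or Tableau dashboard might look like:

```python
import sqlite3

# Invented example data: disbursal records by region and month.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE disbursals (region TEXT, month TEXT, amount REAL);
INSERT INTO disbursals VALUES
  ('Jharkhand', '2024-01', 120.0),
  ('Jharkhand', '2024-02', 150.0),
  ('Raipur',    '2024-01',  80.0);
""")

# A basic performance-metric rollup: count and total per region.
rows = conn.execute("""
    SELECT region,
           COUNT(*)    AS n_disbursals,
           SUM(amount) AS total_amount
    FROM disbursals
    GROUP BY region
    ORDER BY total_amount DESC
""").fetchall()
```

The same GROUP BY pattern, pointed at a warehouse table instead of SQLite, is the backbone of most of the "performance metrics" reporting the listing describes.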

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Raipur

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kolhapur

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kochi

Work from Office

Job Overview
As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Vijayawada

Work from Office

Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Visakhapatnam

Work from Office

Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Solapur

Work from Office

Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related (STEM) field.
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Gurugram, Delhi / NCR

Work from Office

Job Description
We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS data services, data wrangling using Python and PySpark, and a solid understanding of data governance, lineage, and quality frameworks. The ideal candidate will have a proven track record of delivering end-to-end data pipelines for logistics, supply chain, enterprise finance, or B2B analytics use cases.

Role & responsibilities
- Design, build, and optimize ETL pipelines using AWS Glue 3.0+ and PySpark.
- Implement scalable and secure data lakes using Amazon S3, following bronze/silver/gold zoning.
- Write performant SQL using AWS Athena (Presto) with CTEs, window functions, and aggregations.
- Take full ownership from ingestion through transformation, validation, metadata documentation, and dashboard-ready output.
- Build pipelines that are not just performant, but audit-ready and metadata-rich from the first version.
- Integrate classification tags and ownership metadata into all columns using AWS Glue Catalog tagging conventions.
- Ensure no pipeline moves to the QA or BI team without validation logs and field-level metadata completed.
- Develop job orchestration workflows using AWS Step Functions integrated with EventBridge or CloudWatch.
- Manage schemas and metadata using the AWS Glue Data Catalog.
- Enforce data quality using Great Expectations, with checks for null percentage, ranges, and referential rules.
- Ensure data lineage with OpenMetadata or Amundsen and add metadata classifications (e.g., PII, KPIs).
- Collaborate with data scientists on ML pipelines, handling JSON/Parquet I/O and feature engineering.
- Prepare flattened, filterable datasets for BI tools like Sigma, Power BI, or Tableau.
- Interpret business metrics such as forecasted revenue, margin trends, occupancy/utilization, and volatility.
- Work with consultants, QA, and business teams to finalize KPIs and logic.

Preferred candidate profile
- Strong hands-on experience with AWS: Glue, S3, Athena, Step Functions, EventBridge, CloudWatch, Glue Data Catalog.
- Programming skills in Python 3.x, PySpark, and SQL (Athena/Presto).
- Proficient with Pandas and NumPy for data wrangling, feature extraction, and time-series slicing.
- Strong command of data governance tools like Great Expectations and OpenMetadata/Amundsen.
- Familiarity with tagging sensitive metadata (PII, KPIs, model inputs).
- Capable of creating audit logs for QA and rejected data.
- Experience in feature engineering: rolling averages, deltas, and time-window tagging.
- BI-readiness with Sigma, with exposure to Power BI/Tableau (nice to have).
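The data-quality checks named in this listing (null percentage, value ranges, referential rules) would normally be declared as Great Expectations expectations; the logic itself is simple. A minimal pure-Python sketch, with invented record and column names, of what each check verifies:

```python
# Invented example records; "margin" has one null, and customer 99 has no
# matching entry in the known-customers dimension.
records = [
    {"order_id": 1, "customer_id": 10, "margin": 0.21},
    {"order_id": 2, "customer_id": 11, "margin": None},
    {"order_id": 3, "customer_id": 99, "margin": 0.35},
]
known_customers = {10, 11}

def null_pct(rows, col):
    """Fraction of rows where the column is null."""
    return sum(r[col] is None for r in rows) / len(rows)

def out_of_range(rows, col, lo, hi):
    """Non-null rows whose value falls outside [lo, hi]."""
    return [r for r in rows if r[col] is not None and not (lo <= r[col] <= hi)]

def referential_violations(rows, col, valid):
    """Rows whose foreign key does not resolve against the valid set."""
    return [r for r in rows if r[col] not in valid]

margin_nulls = null_pct(records, "margin")                         # 1 of 3 rows
range_fails  = out_of_range(records, "margin", 0.0, 0.30)          # order 3
orphans      = referential_violations(records, "customer_id",
                                      known_customers)             # order 3
```

In a real pipeline these results would be written to the validation log the listing requires before handoff to QA or BI.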

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Gurugram

Hybrid

Job Description
We are seeking a highly skilled Senior Data Engineer with deep expertise in AWS data services, data wrangling using Python and PySpark, and a solid understanding of data governance, lineage, and quality frameworks. The ideal candidate will have a proven track record of delivering end-to-end data pipelines for logistics, supply chain, enterprise finance, or B2B analytics use cases.

Role & responsibilities
- Design, build, and optimize ETL pipelines using AWS Glue 3.0+ and PySpark.
- Implement scalable and secure data lakes using Amazon S3, following bronze/silver/gold zoning.
- Write performant SQL using AWS Athena (Presto) with CTEs, window functions, and aggregations.
- Take full ownership from ingestion through transformation, validation, metadata documentation, and dashboard-ready output.
- Build pipelines that are not just performant, but audit-ready and metadata-rich from the first version.
- Integrate classification tags and ownership metadata into all columns using AWS Glue Catalog tagging conventions.
- Ensure no pipeline moves to the QA or BI team without validation logs and field-level metadata completed.
- Develop job orchestration workflows using AWS Step Functions integrated with EventBridge or CloudWatch.
- Manage schemas and metadata using the AWS Glue Data Catalog.
- Enforce data quality using Great Expectations, with checks for null percentage, ranges, and referential rules.
- Ensure data lineage with OpenMetadata or Amundsen and add metadata classifications (e.g., PII, KPIs).
- Collaborate with data scientists on ML pipelines, handling JSON/Parquet I/O and feature engineering.
- Prepare flattened, filterable datasets for BI tools like Sigma, Power BI, or Tableau.
- Interpret business metrics such as forecasted revenue, margin trends, occupancy/utilization, and volatility.
- Work with consultants, QA, and business teams to finalize KPIs and logic.

Preferred candidate profile
- Strong hands-on experience with AWS: Glue, S3, Athena, Step Functions, EventBridge, CloudWatch, Glue Data Catalog.
- Programming skills in Python 3.x, PySpark, and SQL (Athena/Presto).
- Proficient with Pandas and NumPy for data wrangling, feature extraction, and time-series slicing.
- Strong command of data governance tools like Great Expectations and OpenMetadata/Amundsen.
- Familiarity with tagging sensitive metadata (PII, KPIs, model inputs).
- Capable of creating audit logs for QA and rejected data.
- Experience in feature engineering: rolling averages, deltas, and time-window tagging.
- BI-readiness with Sigma, with exposure to Power BI/Tableau (nice to have).

Posted 3 weeks ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We have a full-time opportunity with our client Lenovo, payrolled through SDI, for a Data Quality Analyst role.

Client: Lenovo (full-time) | Payroll: SDI | Location: Bangalore | Job Type: Full-time

Job Summary
Experienced and detail-driven Data Quality Analyst with 3+ years of expertise in ensuring data accuracy, consistency, and reliability across enterprise systems. Adept at data profiling, cleansing, validation, and root cause analysis using tools such as SQL, Excel, and data quality platforms. Skilled in collaborating with cross-functional teams to enhance data governance frameworks, support data migration initiatives, and reduce data errors through proactive monitoring and process improvements. Committed to supporting data-driven decision-making with high-quality, trusted data.

Technical Skills: SQL (MySQL, PostgreSQL, MS SQL Server); Python (basic); ETL tools: Talend or Informatica.

Key Roles and Responsibilities:
- Conduct data profiling, validation, and cleansing across multiple systems to ensure data accuracy, consistency, and completeness.
- Develop and execute SQL queries to identify anomalies, missing values, and inconsistencies in structured datasets.
- Work closely with data engineers and business analysts to define data quality rules and implement automated checks and alerts.
- Support root cause analysis and resolution of data quality issues, documenting findings and recommending process improvements.
- Create and maintain data quality dashboards and reports using Excel, Power BI, or other visualization tools.
- Collaborate with data stewards and governance teams to ensure compliance with data standards and best practices.
- Assist in the design and implementation of data quality frameworks and documentation of data lineage and metadata.
- Participate in data migration and integration projects, validating data accuracy during transitions between systems.
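The SQL-based profiling this role describes typically starts with two staple queries: duplicate keys and missing values. A hypothetical sketch (SQLite stands in for MySQL/PostgreSQL/MS SQL Server, and the schema is invented):

```python
import sqlite3

# Invented example data: id 2 is duplicated and one email is missing.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER, email TEXT);
INSERT INTO customers VALUES
  (1, 'a@x.com'), (2, NULL), (2, 'b@x.com'), (3, 'c@x.com');
""")

# Duplicate-key check: ids that appear more than once.
dupes = conn.execute("""
    SELECT id, COUNT(*) FROM customers
    GROUP BY id HAVING COUNT(*) > 1
""").fetchall()

# Missing-value check: rows with a null email.
missing = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()[0]
```

Scheduling queries like these and alerting on non-empty results is the "automated checks and alerts" pattern the listing mentions.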

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad

Hybrid

Role & responsibilities

Role Summary
As a Data QE at PalTech, you will play an important role in designing and executing test strategies for end-to-end data validation, ensuring data completeness, accuracy, and integrity across ETL processes, data warehouses, and reports. You will automate data testing using Python; validate fact and dimension tables, large datasets, file ingestions, and data exports; and ensure adherence to data security standards, including encryption and authorization. Proficiency in SQL, Python, ETL/ELT tools, and reporting platforms like Power BI or Tableau is essential. The role requires strong analytical skills, collaboration with cross-functional teams, and the ability to enhance testing processes through automation and best practices.

Key Responsibilities
- Create test strategies, test plans, business scenarios, and data validation scripts for end-to-end data validation.
- Verify data completeness, accuracy, and integrity throughout the ETL processes and reports.
- Evaluate the performance of ETL jobs and ensure that they meet defined SLAs.
- Automate the data testing process using Python or other technologies.
- Validate various types of fact tables and dimension tables; data warehousing (DWH) skills are a must.
- Validate large datasets and work with relational databases.
- Validate file ingestions and data exports.
- Validate the data security standards implemented in the project, including data encryption, anonymization, and authorization processes.
- Be proficient in SQL, Python, and validation of ETL/ELT tools, reports, and dashboards.
- Write complex scripts to validate business logic and KPIs.
- Create test data as needed based on business requirements, and identify and validate corner business use cases.
- Prepare comprehensive test documentation including test cases, test results, and test reports.
- Work closely with cross-functional teams including developers, business analysts, and data architects.
- Suggest enhancements and implement best practices to improve testing processes.

Required Skills and Qualifications
- Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Technical Skills: Strong understanding of ETL processes, data warehousing concepts, and SQL; strong Python skills.
- Experience: 4 to 6 years of experience in ETL testing and reports validation, including automation of data validation processes.
- Tools: Familiarity with ETL tools like ADF, DBT, etc., and defect tracking systems like JIRA; experience with reporting tools like Power BI, Tableau, etc.
- Soft Skills: Excellent communication, teamwork abilities, and strong analytical and problem-solving skills.
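Fact- and dimension-table validation of the kind described above usually reduces to a few reconciliation checks: source-to-target row counts, totals, and orphaned foreign keys. An illustrative sketch with invented data and names:

```python
# Invented source extract: (sale_id, product_key, amount) tuples.
source_rows = [(1, "A", 100), (2, "B", 250), (3, "A", 75)]

# Invented warehouse fact table and product dimension after the ETL load.
fact_sales = [
    {"sale_id": 1, "product_key": "A", "amount": 100},
    {"sale_id": 2, "product_key": "B", "amount": 250},
    {"sale_id": 3, "product_key": "A", "amount": 75},
]
dim_product = {"A", "B"}

# Completeness: same number of rows landed as were extracted.
count_ok = len(source_rows) == len(fact_sales)

# Accuracy: the measure total survived the load unchanged.
total_ok = sum(r[2] for r in source_rows) == sum(f["amount"] for f in fact_sales)

# Integrity: every fact row's foreign key resolves to a dimension row.
orphans = [f["sale_id"] for f in fact_sales
           if f["product_key"] not in dim_product]
```

In practice the same three checks would run as SQL against the warehouse, with any non-empty orphan list or count mismatch logged as a defect.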

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts. Job Category Software Engineering Job Details About Salesforce Salesforce is looking for a Senior software engineer to join the Trailhead team. Trailhead is an online learning platform created by Salesforce with a big, bold mission to democratize education and skill up anyone for the future of work. The Trailhead team has immediate opportunities for talented software engineers who want to make a significant and measurable positive impact to users, the company s bottom line and the industry. Trailhead is where developers, admins, and business users get the skills they need for the jobs of the future. And thanks to gamification they have a little fun along the way. This is a rare opportunity to build something that positively impacts millions of users helping folks develop new skills and break into new careers. Feel free to explore our app, trailhead.salesforce.com , and maybe even snag a few badges (wed recommend the Introduction to Agentforce module)! Bonus points if you download the Trailhead GO app from the App Store and earn the badge on mobile! The team focuses on understanding our Trailblazers career needs and optimising their learning journey. We build solutions across product and marketing based on the full point of view of the Trailblazer to cultivate more credentialed, employable individuals in the Salesforce ecosystem. We multiply our efforts across the Trailhead marketing, engineering, content, and credentialing teams to align our strategies and change the culture to use data to make decisions. In this role, you will be work on building data pipelines, optimizing, and delivering data for core Trailhead KPIs. You will also contribute to setting the vision for and delivering the future of Trailhead core analytical funnel metrics and user behavior tracking/experiments. 
You will work on high-impact and high-visibility projects that are used by Salesforce executives. You will be encouraged to leverage and implement the latest Salesforce products and technologies. In addition, you will often be challenged to solve ad-hoc/unstructured problems in a highly fast-paced environment and to partner with key stakeholders across teams. Equality is a core value at Salesforce. We strive to create workplaces that reflect the communities we serve and where everyone feels empowered to bring their full, authentic selves to work. People of different backgrounds, experiences, abilities, and perspectives are warmly encouraged to apply. Responsibilities Build & maintain pipelines - Develop Airflow workflows to ingest data from S3, APIs, and Kafka into Snowflake, ensuring reliability and scalability. Define data contracts & governance - Align with source teams on schemas/SLAs and enforce data classification, masking, and privacy standards. Model for analytics - Create well-structured fact/dimension tables and business measures that power self-service dashboards. Safeguard data quality & lineage - Automate tests, monitoring, and lineage tracking to surface issues early and expedite root-cause analysis. Enable collaboration & learning - Partner with analysts and data scientists, document data definitions, and share best practices across the team. About You Collaborative team player who is kind, friendly, and cares about doing the right thing Desire to keep learning and growing, both technically and otherwise, and to keep informed of new data engineering methods and techniques Ability to ask good questions and learn quickly Openness and courage to give and receive feedback Respect towards people from diverse backgrounds and commitment to upholding diversity, equity, and inclusion at work Some Qualifications We Look For B.S./M.S. in Computer Science or an equivalent field, and 5+ years of relevant experience within big data engineering Excellent understanding of data structures and distributed data processing patterns Experience with many of the following: Implementing and operating big data technologies like Redshift, Hadoop, Spark, Presto, Hive, etc., especially in the evolving areas of security, compliance (GDPR/CCPA/Data Privacy), and data retention Cloud computing and data processing, preferably AWS, security, cluster sizing, and performance tuning ETL design and implementing pipelines in languages like Java or Scala, or scripting in Python Hands-on experience with Airflow, CI/CD pipelines via Jenkins or similar tools, and GitHub Well versed with Snowflake/Google BigQuery/Redshift Version control systems (GitHub, Stash, etc.) and deployment tools Implementing and managing open-source Python data tools such as Airflow, Pandas, etc. Experience working with web analytics platforms, metrics, and data sets (Google Analytics preferred) Plusses Salesforce experience/certification is a plus but not required Heroku app development experience is a plus but not required Data Cloud experience is a plus but not required Accommodations If you require assistance due to a disability applying for open positions, please submit a request via this Accommodations Request Form. Posting Statement
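The ingestion work this role describes, pulling incremental batches from S3 or Kafka into Snowflake via Airflow, typically hinges on idempotent, watermark-based merges so reruns never duplicate rows. A minimal stdlib-only sketch of that merge semantics (all names are hypothetical illustrations, not Salesforce's actual pipeline; in production the same logic would be a Snowflake MERGE driven by an Airflow task):

```python
from datetime import datetime, timezone

def merge_batch(store: dict, batch: list[dict], key: str = "id",
                ts_field: str = "event_ts") -> datetime:
    """Upsert batch records into `store`, keeping the newest version of
    each key -- the same effect a Snowflake MERGE gives you. Returns the
    high-water mark to persist for the next incremental run, so a retry
    of the same batch is a no-op rather than a duplicate load."""
    watermark = datetime.min.replace(tzinfo=timezone.utc)
    for rec in batch:
        existing = store.get(rec[key])
        if existing is None or rec[ts_field] > existing[ts_field]:
            store[rec[key]] = rec          # insert, or overwrite with newer row
        watermark = max(watermark, rec[ts_field])
    return watermark
```

Persisting the returned watermark (e.g. in an Airflow Variable or a control table) is what makes the pipeline safely re-runnable after a failure.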

Posted 3 weeks ago

Apply

12.0 - 18.0 years

40 - 45 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Project Manager About Us Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year at the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery. WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry. MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services. #BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity. CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands. DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage. Roles & Responsibilities: Project / Program Manager for Financial Markets. Responsible for programme reporting, including creation and publication of Project Status Reports. Programme data in the bank's project management platform, Clarity: ensuring the programme data is current and accurate, including Milestones, Impacted Processes, Impacted Platforms, and Stakeholders. Collaborate with the Project Managers, Scrum Masters, and Delivery Leads on various initiatives as required. Team / Squad Management: Maintain the registry of Squads and Stakeholders, including email distribution lists.
Governance and Planning Workshops: Assist with the logistics, content/material and setup of planning workshops, including the preparation of meeting material and minutes, and maintaining the artefacts. Communications: Consolidate content from various teams, draft and review communications. Financials: Assist the Programme Manager on Cost Management, Data Quality and Completeness. Project Administration: Work with the programme team on any ad hoc initiatives.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Mumbai

Work from Office

Assistant Manager/Deputy Manager - Data Management Department Degree in Engineering/Maths/Comp Science/Business Management/Data Science Programme/Masters in IT/Management from a recognised institute Experience Assistant Manager/Deputy Manager (No. of positions - 01): Work experience of 6-10 years in Depository, Exchanges, Mutual Funds, Clearing Corporation, Depository Participants, Banks & BFSIs Skills Preferred: Good analytical skills to comprehend data, draw inferences and identify data patterns. Knowledge of data management tools/software used in developing and monitoring data management practices, improving security, promoting privacy and removing data duplication. Well versed with programming languages like Python, SQL, SAS, R etc. Familiarity with data visualization tools like Tableau, Power BI Understanding of the concepts behind Cloud Computing Job Description: Communicating and presenting the results and insights of data analysis and modelling to various stakeholders, such as managers, clients, and decision-makers. Developing and implementing data governance frameworks, policies, standards, and best practices to ensure data quality, integrity, security, and compliance. Collaborating with business stakeholders, IT teams, and external partners to understand data needs, define data requirements, and deliver data solutions that support business objectives and growth strategies. Should have experience/exposure to the software development life cycle from a techno-functional standpoint, including but not limited to business requirement articulation, gap analysis, impact assessment, testing and QA, implementation, and support. Should have an inclination to understand business operations and to design and implement data strategies and systems. Should have an understanding of relevant applications, big data solutions and tools.
Applying advanced data analytics, machine learning, and artificial intelligence techniques to generate insights, predictions, and recommendations from data. Should be able to work on data collection, storage, management, quality, and protection of data. Should possess very good communication, interaction, presentation and writing skills, and a positive attitude. Should be self-driven and proactive in coming up with new ideas for efficient and effective data management.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Pune

Work from Office

We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams. Your Key Responsibilities Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks. Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization. Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues. Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability. Ensure high standards of data quality and reliability via robust validation and cleansing processes. Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management. Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly. Mentor junior team members and guide best practices across the team. Your skills and experience that will help you excel Bachelor's degree in Computer Science, Engineering, or a related technical field. 6+ years of experience in data engineering or related roles. Advanced proficiency in Python, PySpark, and SQL. Strong experience with Databricks, BigQuery, and modern data lakehouse design. Hands-on knowledge of Azure or GCP data services. Proven experience in performance tuning and large-scale data processing.
Strong communication skills and the ability to work independently in uncertain or evolving contexts About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.
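The "robust validation and cleansing processes" this posting asks for usually amount to rule-driven row checks that split data into clean rows and rejects before anything lands in Delta tables. A plain-Python sketch of the pattern (the rule names and fields are illustrative, not MSCI's schema; in a Databricks pipeline the same predicates would run as PySpark column expressions):

```python
def validate_rows(rows, rules):
    """Split rows into (clean, rejects) by applying named predicate rules.
    Each reject carries the names of the rules it failed, which supports
    root-cause analysis instead of silently dropping bad data."""
    clean, rejects = [], []
    for row in rows:
        failed = [name for name, pred in rules.items() if not pred(row)]
        if failed:
            rejects.append({"row": row, "failed_rules": failed})
        else:
            clean.append(row)
    return clean, rejects

# Illustrative rules for a ratings-style record (hypothetical schema)
RULES = {
    "rating_present": lambda r: r.get("rating") is not None,
    "rating_in_range": lambda r: r.get("rating") is not None
                                 and 0 <= r["rating"] <= 100,
    "entity_id_nonempty": lambda r: bool(r.get("entity_id")),
}
```

Writing the rejects (with their failure reasons) to a quarantine table rather than discarding them is what makes the subsequent incident troubleshooting quick.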

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Kozhikode

Work from Office

Role Summary: We are looking for a Data Engineer who will be responsible for designing and developing scalable data pipelines, managing data staging layers, and integrating multiple data sources through APIs and SQL-based systems. You'll work closely with analytics and development teams to ensure high data quality and availability. Key Responsibilities: Design, build, and maintain robust data pipelines and staging tables. Develop and optimize SQL queries for ETL processes and reporting. Integrate data from diverse APIs and external sources. Ensure data integrity, validation, and version control across systems. Collaborate with data analysts and software engineers to support analytics use cases. Automate data workflows and improve processing efficiency. Required Skills & Experience: 2-5 years of experience as a Data Engineer or in a similar data-focused role. Strong SQL skills and experience with query optimization. Hands-on experience with API integrations (REST, JSON/XML, OAuth, etc.). Familiarity with relational databases (e.g., PostgreSQL, MySQL, SQL Server). Experience with ETL tools or frameworks (e.g., Airflow, dbt, or custom scripts). Proficiency in Python or similar scripting languages for data tasks. Understanding of data warehousing concepts and data modeling. Nice to Have: Experience with cloud data platforms (e.g., AWS, GCP, Azure). Familiarity with BI tools (Power BI, Tableau, Looker). Exposure to version control (Git) and CI/CD pipelines. What We Offer: Opportunity to build from the ground up in a high-impact startup environment. Exposure to end-to-end data solutions for diverse business domains. Flexible work culture and supportive team environment. Competitive compensation and performance-based incentives.
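The staging-table responsibility above follows a common shape: raw API records land in a transient staging table first, get validated in SQL, and only valid rows are promoted to the final table. A sketch with stdlib sqlite3 standing in for the warehouse (table and column names are made up for illustration; a real pipeline would target PostgreSQL/MySQL with the same statements):

```python
import sqlite3

def stage_and_promote(conn, records):
    """Land raw records in a staging table, then promote only valid rows
    (non-null id and amount) into the final table. Returns the counts of
    (promoted, rejected) rows so the load can be monitored."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS staging_orders (id TEXT, amount REAL)")
    cur.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, amount REAL)")
    cur.execute("DELETE FROM staging_orders")  # staging is transient per load
    cur.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                    [(r.get("id"), r.get("amount")) for r in records])
    # Promote valid rows; INSERT OR REPLACE makes reloads idempotent per id
    cur.execute("""INSERT OR REPLACE INTO orders
                   SELECT id, amount FROM staging_orders
                   WHERE id IS NOT NULL AND amount IS NOT NULL""")
    promoted = cur.execute("""SELECT COUNT(*) FROM staging_orders
                              WHERE id IS NOT NULL AND amount IS NOT NULL""").fetchone()[0]
    rejected = cur.execute("""SELECT COUNT(*) FROM staging_orders
                              WHERE id IS NULL OR amount IS NULL""").fetchone()[0]
    conn.commit()
    return promoted, rejected
```

Keeping the validation in SQL against the staging table (rather than in application code) is what lets the same checks run unchanged when the pipeline moves to a warehouse.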

Posted 3 weeks ago

Apply

0.0 - 1.0 years

3 - 6 Lacs

Bengaluru

Work from Office

As part of the ML Data Engineering team, you will be contributing to advancing and managing our satellite imagery data pipeline and infrastructure. You will be collaborating closely with cross-functional teams including the machine learning team, satellite team, and data science specialists. Beyond technical expertise, your role will involve shaping data products, continuous learning, and upholding best practices. Your experience and contributions will be instrumental in driving process improvements and fostering collaboration across the organization. Responsibilities: Data Acquisition & Management: Download satellite imagery from various sources and APIs Curate and organize satellite imagery datasets according to established standards Build and maintain comprehensive data catalogs with proper metadata tagging Ensure data quality and integrity throughout the acquisition process Infrastructure & Storage: Optimize storage costs while maintaining data accessibility and performance Monitor and maintain data backup and recovery systems Data Labeling & Annotation: Coordinate and execute satellite imagery labeling initiatives with data scientists Develop and implement quality control processes for labeled datasets Work with domain experts to establish labeling guidelines and standards Manage labeling workflows and delivery to ML teams MLOps & Pipeline Support: Support the development and maintenance of automated data pipelines Implement data versioning and lineage tracking Collaborate on model training data preparation and validation Assist in deploying and monitoring data processing workflows Cross-functional Collaboration: Work closely with the satellite team to understand data requirements Support the machine learning team with timely data delivery Participate in technical discussions and provide data-driven insights Document processes and maintain technical documentation Requirements Who Should Apply We welcome final-year students, recent graduates, and early-career 
professionals (0-1 year experience) with a passion for data systems and applied ML. We encourage applicants from any academic background with the right technical skills and willingness to learn. Technical Skills: Strong programming skills in Python, with strong problem-solving and debugging ability Familiarity with data processing libraries (pandas, numpy, etc.) Experience with cloud platforms (AWS, Google Cloud, or Azure) and storage solutions Understanding of database systems (both SQL and NoSQL) Familiarity with data pipeline tools and workflow orchestration Basic knowledge of containerization (Docker) and version control (Git) (Bonus) Understanding of satellite imagery and geospatial formats Personal Attributes: Rapid in response, flexible to changes, and nimble in approach without compromising on the overall quality of work. Develops solutions through an adequate mix of intuition, reason and logic Pushes the collective quality of thought to new limits while trusting and respecting others' skillsets & intentions Communicates with utmost clarity, while maintaining the highest standards of candor Adopts simplicity as a clutter-breaking mechanism to waste fewer resources and time Strives for perfection, iteratively, to deliver work that is above and beyond accepted standards of excellence Work Experience Entry-level position suitable for recent graduates or candidates with 0-1 years of relevant experience in data engineering, data science, or related fields Benefits Learning & Growth: Hands-on experience with cutting-edge satellite imagery and geospatial technologies Exposure to enterprise-scale data infrastructure and MLOps practices Mentorship from experienced data engineers and ML practitioners Opportunity to work on real-world problems with global impact Technical Exposure: Work with petabyte-scale satellite imagery datasets Learn industry best practices for data governance and quality management Gain experience
with modern data stack and infrastructure-as-code Understand the complete ML lifecycle from data to deployment Career Development: Clear path for conversion to full-time role based on performance Opportunity to shape and influence data strategy and processes Exposure to cross-functional collaboration in a fast-paced environment Access to continuous learning resources and professional development

Posted 3 weeks ago

Apply

16.0 - 18.0 years

50 - 60 Lacs

Chennai, Gurugram, Bengaluru

Work from Office

Join us as a Data Engineer We're looking for someone to build effortless, digital-first customer experiences to help simplify our organisation and keep our data safe and secure Day-to-day, you'll develop innovative, data-driven solutions through data pipelines, modelling and ETL design while aspiring to be commercially successful through insights If you're ready for a new challenge, and want to bring a competitive edge to your career profile by delivering streaming data ingestions, this could be the role for you We're offering this role at associate vice president level What you'll do Your daily responsibilities will include developing a comprehensive knowledge of our data structures and metrics, advocating for change when needed for product development. You'll also provide transformation solutions and carry out complex data extractions. We'll expect you to develop a clear understanding of data platform cost levels to build cost-effective and strategic solutions. You'll also source new data by using the most appropriate tooling before integrating it into the overall solution to deliver it to our customers. You'll also be responsible for: Driving customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to build data solutions Participating in the data engineering community to deliver opportunities to support our strategic direction Carrying out complex data engineering tasks to build a scalable data architecture and the transformation of data to make it usable to analysts and data scientists Building advanced automation of data engineering pipelines through the removal of manual stages Leading on the planning and design of complex products and providing guidance to colleagues and the wider team when required The skills you'll need To be successful in this role, you'll have an understanding of data usage and dependencies with wider teams and the end customer.
You'll also have experience of extracting value and features from large-scale data. We'll expect you to have experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, exploration and analysis, and data warehousing and data modelling capabilities. You'll also need: Experience of using a programming language such as Python for developing custom operators and sensors in Airflow, improving workflow capabilities and reliability Good knowledge of Kafka and Kinesis for effective real-time data processing, and of Scala and Spark to enhance data processing efficiency and scalability Great communication skills with the ability to proactively engage with a range of stakeholders Hours 45 Job Posting Closing Date: 14/07/2025
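The "custom operators and sensors in Airflow" skill mentioned above boils down to a poke-until-ready loop with a timeout. A framework-free sketch of that sensor contract (the real thing would subclass airflow.sensors.base.BaseSensorOperator and implement its poke() method; the helper name and injectable clock here are illustrative, used so the logic can be tested without waiting):

```python
import time

def wait_for(poke, timeout_s=60.0, poke_interval_s=5.0,
             sleep=time.sleep, clock=time.monotonic):
    """Call `poke()` until it returns True (success) or the timeout
    elapses (failure), sleeping between attempts -- the same semantics
    an Airflow sensor provides. Returns the number of pokes used."""
    start = clock()
    attempts = 0
    while True:
        attempts += 1
        if poke():
            return attempts
        if clock() - start >= timeout_s:
            raise TimeoutError(f"condition not met after {attempts} pokes")
        sleep(poke_interval_s)
```

Passing `sleep` and `clock` in as parameters is the design choice that makes the timeout logic unit-testable with a fake clock, which matters when the real intervals are minutes long.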

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Kolkata

Work from Office

About us: Arrise Solutions (India) Pvt. Ltd. is an international services group that powers Pragmatic Play. Pragmatic Play is a global leader as a multi-product content provider to the gaming and entertainment industry. Our passion for premium entertainment is unrivalled. We strive to create the most engaging, evocative experiences for all our customers across a range of gaming products. Arrise India (formerly known as Pragmatic Play India Pvt Ltd) is a subsidiary of Arrise Solutions Ltd., Malta. Arrise India provides software development services to its foreign entities. Our vision is to excel as a multi-product, content-related service provider and technology company to the gaming and entertainment industry. We strive to create value for our clients by providing the highest quality technology and services, continually seeking to improve them and to ensure consistent delivery and superior performance. Moreover, we are committed to promoting a team-oriented culture that places autonomy and trust in our employees, and to building mutually sustainable relationships defined by professionalism. Requirements for the role: The ideal candidate will be responsible for accurately and efficiently entering, updating, and maintaining data in our systems. The Analyst will play a crucial role in ensuring the integrity and accuracy of our data, contributing to the overall success of our organization. The Analyst will log in to the operators' websites, fetch the required data, store the data in our Excel or server files, and validate whether the data has been captured accurately. Data Maintenance: Input, update, and verify data accurately in designated databases, spreadsheets, and systems. Ensure data consistency, integrity, and completeness. Quality Control: Perform regular quality checks to identify and rectify discrepancies or errors in data. Work collaboratively with other team members to maintain high data quality standards.
Documentation: Maintain comprehensive and organized records of all activities. Document any issues or discrepancies encountered during the data entry process. Update & maintain Trackers as per requirement. Prepare Reports as and when required. Adherence to Guidelines: Follow established guidelines and procedures for data entry and data management. Seek clarification when guidelines are unclear or require further explanation. Data Verification: Thoroughly check Operator Website for data validation. Verify accuracy of data by cross-referencing with source documents. Investigate and correct any discrepancies to ensure data consistency. Timeliness: Meet established deadlines for data maintenance tasks. Prioritize tasks to ensure timely completion of assignments. Confidentiality: Handle sensitive and confidential information with the utmost discretion and security. Communication: Communicate effectively with team members and supervisors regarding data-related issues or concerns. Provide regular updates on progress and challenges.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

JD for ADF. Role Description: Azure Databricks development Competencies: Digital: Databricks, Azure Data Factory Experience (Years): 4-6 Essential Skills: Databricks, PySpark Job Description Key Responsibilities: Design, develop, and manage ETL pipelines using Azure Data Factory and integrate data from various on-premise and cloud-based sources. Build and maintain data flows, pipelines, datasets, and linked services within ADF. Work with Azure Data Lake, Azure Blob Storage, SQL Database, and Synapse Analytics to store and process data. Optimize data pipelines for performance and cost-efficiency. Implement data transformation logic using Azure Data Flows or custom Azure Functions. Monitor, troubleshoot, and debug production ADF pipelines and related workflows. Collaborate with data architects, analysts, and business users to define data requirements and ensure data quality. Use CI/CD tools (e.g., Azure DevOps) for version control and pipeline deployment. Maintain documentation for data pipelines, integration processes, and data mapping.
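The "data transformation logic using Azure Data Flows or custom Azure Functions" responsibility is, at its core, a mapping-and-cleansing step between the staging zone and the curated zone. A plain-Python sketch of such a step as it might run inside a custom Azure Function or a Databricks notebook cell (the column names and schema are hypothetical, for illustration only):

```python
def transform(records, column_map, required):
    """Rename source columns per `column_map`, trim string values, and
    drop records missing required fields -- the kind of mapping step an
    ADF Data Flow or a custom Azure Function applies between staging
    and the curated zone."""
    out = []
    for rec in records:
        # Project and rename: source column -> destination column
        row = {dst: rec.get(src) for src, dst in column_map.items()}
        # Cleanse: strip whitespace from string values
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        # Validate: keep only rows with all required fields populated
        if all(row.get(f) not in (None, "") for f in required):
            out.append(row)
    return out
```

Keeping the column mapping as data (`column_map`) rather than hard-coded logic mirrors how ADF Data Flows store mappings declaratively, which simplifies maintenance when source schemas drift.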

Posted 3 weeks ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Summary In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Strong understanding of and hands-on experience with Collibra. Experience with designing and implementing an operating model in DGC, scanning different sources with Collibra Catalog connectors, and REST API knowledge Experience in designing, developing and configuring workflows using Eclipse.
Good experience in Groovy scripting Experience with lineage harvesting in Collibra to track data movement and transformations across systems Good understanding of and experience in developing and implementing Data Governance, Metadata Management, and Data Quality frameworks, policies and processes Excellent communication and interpersonal skills, with the ability to interact effectively with senior stakeholders and cross-functional teams Excellent analytical and problem-solving skills, with the ability to address complex data governance challenges Mandatory skill sets Collibra Developer Preferred skill sets Collibra Developer Years of experience required 4-7 yrs Education qualification B.Tech & MBA Education Degrees/Field of Study required Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred Required Skills Collibra Data Governance Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 8 Lacs

Pune

Work from Office

We are looking for a Senior Data Engineer with deep hands-on expertise in PySpark, Databricks, and distributed data architecture. This individual will play a lead role in designing, developing, and optimizing data pipelines critical to our Ratings Modernization, Corrections, and Regulatory implementation programs under PDB 2.0. The ideal candidate will thrive in fast-paced, ambiguous environments and collaborate closely with engineering, product, and governance teams. Your Key Responsibilities Design, develop, and maintain robust ETL/ELT pipelines using PySpark and Databricks. Own pipeline architecture and drive performance improvements through partitioning, indexing, and Spark optimization. Collaborate with product owners, analysts, and other engineers to gather requirements and resolve complex data issues. Perform deep analysis and optimization of SQL queries, functions, and procedures for performance and scalability. Ensure high standards of data quality and reliability via robust validation and cleansing processes. Lead efforts in Delta Lake and cloud data warehouse architecture, including best practices for data lineage and schema management. Troubleshoot and resolve production incidents and pipeline failures quickly and thoroughly. Mentor junior team members and guide best practices across the team. Your skills and experience that will help you excel Bachelor's degree in Computer Science, Engineering, or a related technical field. 6+ years of experience in data engineering or related roles. Advanced proficiency in Python, PySpark, and SQL. Strong experience with Databricks, BigQuery, and modern data lakehouse design. Hands-on knowledge of Azure or GCP data services. Proven experience in performance tuning and large-scale data processing.
Strong communication skills and the ability to work independently in uncertain or evolving contexts About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for . Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Senior Analyst - Data Analysis Back to job search results Tesco India Bengaluru, Karnataka, India Hybrid Full-Time Apply by 01-Jul-2025 About the role Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers. With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single entity traditional shared services in Bengaluru, India (from 2004) to a global, purpose-driven solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBSs focus is on adding value and creating impactful outcomes that shape the future of the business. 
TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.

What is in it for you
Analyse complex datasets and make them consumable through visual storytelling and visualisation tools such as reports and dashboards built with approved tools (Tableau, PyDash).

You will be responsible for:
- Understanding business needs and developing an in-depth understanding of Tesco processes
- Building on Tesco processes and knowledge by applying CI tools and techniques
- Completing tasks and transactions within agreed KPIs
- Solving problems by analysing solution alternatives
- Engaging with market leaders to understand the problems to be solved, translating business problems into analytical problems, taking ownership of specified analyses, and translating the answers back to decision makers in the business
- Manipulating, analysing, and synthesising large, complex data sets from different sources while ensuring data quality and integrity
- Thinking beyond the ask and developing analyses and reports that contribute beyond the basic request
- Delivering specified work and ad-hoc business asks on time and to a high standard
- Writing code that is well documented, structured, and compute-efficient
- Driving value delivery through efficiency gains by automating repeatable tasks, report creation, and dashboard refreshes
- Collaborating with colleagues to craft, implement, and measure consumption of analyses, reports, and dashboards
- Contributing to knowledge assets and reusable modules on GitHub/Wiki
- Handling high-volume, time-pressured business asks and ad-hoc requests

You will need
2-4 years of experience in analysis-oriented delivery, preferably in one of the following domains: retail, CPG, telecom, or hospitality, and in one of the following
functional areas: marketing, supply chain, customer, space/range and merchandising, operations, finance, or digital.

You will also need: a strong understanding of business decisions; skills to develop visualisations, self-service dashboards, and reports using Tableau; basic statistical concepts (correlation analysis and hypothesis testing); good skills in analysing data using advanced Excel, advanced SQL, Hive, and Python; data warehousing concepts (Hadoop, Teradata); and automation using Alteryx and Python.

Apply
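As a rough illustration of the basic statistical concepts this role lists (correlation analysis and hypothesis testing), here is a minimal pure-Python sketch; the sample data and function names are invented for the example:

```python
import math
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / math.sqrt(sum((x - mx) ** 2 for x in xs)
                           * sum((y - my) ** 2 for y in ys))

def welch_t(xs, ys):
    """Welch's two-sample t-statistic (unequal variances assumed)."""
    return (mean(xs) - mean(ys)) / math.sqrt(
        stdev(xs) ** 2 / len(xs) + stdev(ys) ** 2 / len(ys))

# Hypothetical weekly sales figures for two stores
sales_a = [10, 12, 14, 16, 18]
sales_b = [11, 13, 15, 17, 19]
print(pearson_r(sales_a, sales_b))  # perfectly linear relationship -> 1.0
print(welch_t(sales_a, sales_b))
```

In practice this kind of analysis would usually be done with libraries such as SciPy or pandas, but the underlying formulas are as above.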

Posted 3 weeks ago

Apply

7.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

Job Requirements

Why work for us: Alkegen brings together two of the world's leading specialty materials companies to create one new, innovation-driven leader focused on battery technologies, filtration media, and specialty insulation and sealing materials. Through global reach and breakthrough inventions, we are delivering products that enable the world to breathe easier, live greener, and go further than ever before. With over 60 manufacturing facilities and a global workforce of over 9,000 of the industry's most experienced talent, including insulation and filtration experts, Alkegen is uniquely positioned to help customers impact the environment in meaningful ways. Alkegen offers a range of dynamic career opportunities with a global reach. From production operators to engineers, technicians to specialists, sales to leadership, we are always looking for top talent ready to bring their best. Come grow with us!

Key Responsibilities:
- Lead and manage the Data Operations team, including BI developers and ETL developers, to deliver high-quality data solutions.
- Oversee the design, development, and maintenance of data models, data transformation processes, and ETL pipelines.
- Collaborate with business stakeholders to understand their data needs and translate them into actionable data insights solutions.
- Ensure the efficient and reliable operation of data pipelines and data integration processes.
- Develop and implement best practices for data management, data quality, and data governance.
- Utilize SQL, Python, and Microsoft SQL Server to perform data analysis, manipulation, and transformation tasks.
- Build and deploy data insights solutions using tools such as Power BI, Tableau, and other BI platforms.
- Design, create, and maintain data warehouse environments using Microsoft SQL Server and the data vault design pattern.
- Design, create, and maintain ETL packages using Microsoft SQL Server and SSIS.
- Work closely with cross-functional teams in a matrix organization to ensure alignment with business objectives and priorities.
- Lead and mentor team members, providing guidance and support to help them achieve their professional goals.
- Proactively identify opportunities for process improvements and implement solutions to enhance data operations.
- Communicate effectively with stakeholders at all levels, presenting data insights and recommendations clearly and compellingly.
- Implement and manage CI/CD pipelines to automate the testing, integration, and deployment of data solutions.
- Apply Agile methodologies and Scrum practices to ensure efficient and timely delivery of projects.

Skills & Qualifications:
- Master's or Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
- 7 to 10 years of experience in data modelling, data transformation, and building and managing ETL processes.
- Strong proficiency in SQL, Python, and Microsoft SQL Server for data manipulation and analysis.
- Extensive experience building and deploying data insights solutions using BI tools such as Power BI and Tableau.
- At least 2 years of experience leading BI developers or ETL developers.
- Experience working in a matrix organization and collaborating with cross-functional teams.
- Proficiency in cloud platforms such as Azure, AWS, and GCP.
- Familiarity with data engineering tools such as ADF, Databricks, Power Apps, Power Automate, and SSIS.
- Strong stakeholder management skills, with the ability to communicate complex data concepts to non-technical audiences.
- Proactive and results-oriented, with a focus on delivering value aligned with business objectives.
- Knowledge of CI/CD pipelines and experience implementing them for data solutions.
- Experience with Agile methodologies and Scrum practices.
- Relevant certifications in Data Analytics, Data Architecture, Data Warehousing, and ETL are highly desirable.
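As a small, hypothetical sketch of the ETL and data-quality work this role describes (the real stack is SQL Server and SSIS; table and column names here are invented for illustration, using SQLite so the example is self-contained):

```python
import sqlite3

# Extract: a raw staging table with untrusted, mixed-quality input.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
con.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                [(1, " 100.5 ", "EMEA"), (2, "200", None), (3, "bad", "APAC")])

# Load target: a typed, cleaned table.
con.execute("CREATE TABLE clean_orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")

def to_amount(raw):
    """Transform step: trim and parse; reject unparseable rows (a basic data-quality gate)."""
    try:
        return float(raw.strip())
    except (ValueError, AttributeError):
        return None

for oid, amount, region in con.execute("SELECT id, amount, region FROM raw_orders"):
    parsed = to_amount(amount)
    if parsed is not None:  # drop rows that fail the quality check
        con.execute("INSERT INTO clean_orders VALUES (?, ?, ?)",
                    (oid, parsed, region or "UNKNOWN"))

print(con.execute("SELECT COUNT(*) FROM clean_orders").fetchone()[0])  # 2 rows pass
```

A production pipeline would add logging, lineage metadata, and routing of rejected rows to an error table rather than silently dropping them, but the extract-transform-load shape is the same.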
At Alkegen, we strive every day to help people - ALL PEOPLE - breathe easier, live greener, and go further than ever before. We believe that diversity and inclusion are central to this mission and to our impact. Our diverse and inclusive culture drives our growth and innovation, and we nurture it by actively embracing our differences and using our varied perspectives to solve the complex challenges facing our changing and diverse world. Employment selection and related decisions are made without regard to sex, race, ethnicity, nation of origin, religion, color, gender identity and expression, age, disability, education, opinions, culture, languages spoken, veteran's status, or any other protected class.

Posted 3 weeks ago

Apply

13.0 - 18.0 years

13 - 17 Lacs

Kochi, Thiruvananthapuram

Work from Office

Technical Project Manager (Data)

Introduction: We are looking for candidates with 13+ years of experience for this role.

Responsibilities include:
- Own the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations.
- Develop and maintain comprehensive project plans, roadmaps, and timelines for data ingestion, transformation, governance, AI/ML models, and analytics deliverables.
- Lead cross-functional teams including data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions on time and within budget.
- Define, prioritize, and manage product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools.
- Collaborate closely with business units to capture and translate requirements into actionable user stories and acceptance criteria for data and analytics solutions.
- Oversee the BI and analytics area, including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities.
- Ensure data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle, in collaboration with governance and security teams.
- Coordinate UAT, performance testing, and user training to ensure adoption and successful rollout of data and analytics products.
- Act as the primary point of contact.

This is to notify jobseekers that some fraudsters are promising jobs with Reflections Info Systems for a fee. Please note that no payment is ever sought for jobs at Reflections. We contact our candidates only through our official website or LinkedIn, and all employment-related mails are sent through the official HR email id.

Apply Now

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies