10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Product Owner for the GCP Data Migration Project at Clairvoyant, you will play a crucial role in leading the initiative and ensuring successful delivery of data migration solutions on Google Cloud Platform. With your deep understanding of cloud platforms, data migration processes, and Agile methodologies, you will collaborate with cross-functional teams to define the product vision, gather requirements, and prioritize backlogs to align with business objectives and user needs.

Your key responsibilities will include defining and communicating the product vision and strategy, leading requirement-gathering sessions with stakeholders, collaborating with business leaders and technical teams to gather and prioritize requirements, creating user stories and acceptance criteria, participating in sprint planning, establishing key performance indicators, identifying and mitigating risks, and fostering a culture of continuous improvement through feedback collection and iteration on product features and processes.

To be successful in this role, you should have 10-12 years of experience in product management or product ownership, particularly in data migration or cloud projects. You must possess a strong understanding of Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Data Transfer Services, as well as experience with data migration strategies and tools, including ETL processes and data integration methodologies. Proficiency in Agile methodologies, excellent analytical and problem-solving skills, strong communication skills, and a Bachelor's degree in Computer Science, Information Technology, Business, or a related field are essential qualifications.

Additionally, experience with data governance and compliance in cloud environments, familiarity with project management and collaboration tools like JIRA and Confluence, an understanding of data architecture and database management, and Google Cloud certifications such as Professional Cloud Architect and Professional Data Engineer are considered good-to-have qualifications.

At Clairvoyant, we provide opportunities for engineers to develop and grow, work with a team of hardworking and dedicated peers, and offer growth and mentorship opportunities. We value diversity and encourage individuals with varying skills and qualities to apply, as we believe there might be a suitable role for you in the future. Join us in driving innovation and growth in the technology consulting and services industry!
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As an innovative, analytical, and growth-minded Lead Product Manager at CDK Global, you will take ownership of the Enterprise Data Warehouse and Governance initiatives. Your main responsibility will be to define and execute the strategy for data platforms, ensuring accuracy, accessibility, and scalability. Collaborating with engineering, business, and analytics teams, you will deliver innovative SaaS-based data solutions for seamless integration, governance, and insights for enterprise clients. Additionally, you will play a crucial role in delivering an OEM Analytics solution that enables better business decisions for OEMs and Dealers based on actionable insights and predictive analytics. Your role will involve owning customer-facing OEM Analytics and identifying market opportunities and customer pain points to grow the business.

Your key responsibilities will include defining product strategy and building roadmaps for enterprise data warehouse and governance platforms, prioritizing requirements, driving execution and delivery, building new features for seamless integrations with the CDK Data Platform, overseeing the product lifecycle, and ensuring compliance with regulatory standards. You will collaborate with various teams at CDK to ensure successful go-to-market plans, conduct customer research, implement operating models, and mitigate risks associated with data governance.

To be successful in this role, you should have a Bachelor's degree in Business, Computer Science, Engineering, or equivalent industry experience, along with 6 to 8+ years of product management experience in Enterprise SaaS and 5+ years of experience with data governance and scaling data platforms. You should possess a strong understanding of data warehousing, ETL processes, API integration, compliance frameworks, and data governance principles. Experience with agile development methodologies, working in a product-led environment, and collaborating with globally distributed teams is essential.

Qualifications also include a proven track record of delivering automated and scalable enterprise data platforms, influencing and driving strategy across multiple stakeholders, critical thinking skills, excellent communication abilities, and a data-driven mindset. Financial acumen, willingness to travel, and technical knowledge of SQL, Python, or cloud platforms are preferred qualifications.

At CDK, we value inclusion and diversity to inspire meaningful connections among our people, customers, and communities. If you are authorized to work in the US and are looking to join a dynamic environment where your skills and expertise can make a real impact, we encourage you to apply.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly.

Within CDAO, the Firmwide Chief Data Office (CDO) is responsible for maximizing the value and impact of data globally, in a highly governed way. It consists of several teams focused on accelerating JPMorgan Chase's data, analytics, and AI journey, including data strategy, data impact optimization, privacy, data governance, transformation, and talent.

As a Senior Associate at JPMorgan Chase within the Chief Data & Analytics team, you will be responsible for working with stakeholders to define governance and tooling requirements and building out the BCBS Data Governance framework. In addition, you will be responsible for delivering tasks in detailed project plans for the BCBS deliverables owned by the Firmwide CDO. Lastly, you will play a role in developing and syndicating the content used for the BCBS governance meetings.

**Job Responsibilities:**
- Deliver on the BCBS book of work owned by the Firmwide CDO
- Support the definition, prioritization, and resolution of governance and requirements decisions needed by the BCBS program
- Collect, synthesize, analyze, and present project data and findings
- Conduct analyses to identify issues and formulate recommendations
- Develop regular, compelling communications on project status
- Research data governance requirements and potential solutions
- Collaborate effectively across organizations, functions, and geographies

**Required qualifications, capabilities, and skills:**
- Formal training or certification in Data Governance concepts and 3+ years of applied experience
- Diverse problem-solving experience
- Excellent communication skills (oral and written) and the ability to work effectively in cross-functional teams
- Excellent project management and organizational skills, with the ability to manage multiple deliverables simultaneously
- Strong interpersonal leadership and influencing skills
- Proficiency in MS Excel and PowerPoint

**Preferred qualifications, capabilities, and skills:**
- Familiarity with data management and governance, big data platforms, or data architecture is preferred
- BS/BA degree or equivalent experience; Bachelor's degree in Business, Finance, Economics, or another related area
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Maharashtra
On-site
The Data Governance Business Analyst role involves being responsible for various tasks including assisting in identifying data quality issues, measuring and reporting, ensuring data policy adoption and compliance, and handling regulatory and audit response and action tracking.

You must possess qualities such as being dynamic, flexible, and adaptable to quickly changing needs. Handling ambiguity and complexity, as well as managing multiple responsibilities, is essential. Effective communication and presentation skills are necessary to guide, influence, and convince others.

You will collaborate with multiple teams to implement data issue resolution solutions and execute controls for the Data Risk and Control framework. Responsibilities also include overseeing data-related issues, conducting root cause analysis workshops, tracking ownership and target dates, and reporting metrics. Additionally, you will support business lines and global functions in requirement gathering, solution roll-out, building controls around key risk indicators, and managing specific data issue resolution projects or new Key Risk Indicator implementations.

The ideal candidate should have at least 10 years of relevant experience in Data Governance, Data Management, Process Engineering, or a related area. Proficiency in handling complexity, ambiguity, and a fast-changing work environment is crucial. Advanced knowledge of Project Management methodologies and tools is required, along with strong leadership, interpersonal, and relationship-building skills. Experience working in Consulting/Audit with Big-4 firms and familiarity with Risk and Finance functions and multiple asset classes are advantageous.

Education: Bachelor's/University degree; Master's degree preferred.

Citi is an equal opportunity and affirmative action employer.
Posted 1 week ago
10.0 - 15.0 years
15 - 20 Lacs
Pune
Work from Office
Experience: 10-12 years
Availability: Immediate to 15 days

Role & responsibilities:
- Own and manage Master Data Management (MDM) activities for SAP projects
- De-duplication of master records (an illustrative sketch follows this listing)
- Lead data migration and cutovers in SAP S/4HANA projects (Greenfield, Migration, or Rollouts)
- Establish and implement MDM best practices and data management capabilities
- Define data management principles, policies, and lifecycle strategies
- Monitor data quality with consistent metrics and reporting
- Work with MDM stakeholders to drive data governance and compliance
- Track and manage MDM objects, ensuring timely delivery
- Conduct training sessions for teams on ECC & S/4HANA MDM
- Participate in daily stand-ups, issue tracking, and dashboard updates
- Identify risks and process improvements for MDM

Required Skills & Qualifications:
- Minimum 10-12 years of experience in SAP MDM
- Strong knowledge of ECC, SAP S/4HANA, Data Migration, and Rollouts
- Experience in data governance, lifecycle management, and compliance
- Familiarity with JIRA KANBAN boards, ticketing tools, and dashboards
- Strong problem-solving and communication skills
- Ability to work with cross-functional teams, especially ABAP, Middleware, and Functional consultants
- Knowledge of Excel is a MUST
- ABAP knowledge is preferable
- SAP training or certifications are an asset
- Team player with strong communication skills and a collaborative spirit
- Able to coach, support, train, and develop junior consultants
- Customer-oriented, result-driven, and focused on delivering quality
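The de-duplication of master records called out above is, at its core, a fuzzy-matching exercise. Below is a minimal illustrative sketch in Python using only the standard library; the vendor records and threshold are invented for the example, not a real SAP extract (an actual project would work from LFA1/BP extracts and steward-approved match rules):

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical vendor master records; a real run would load these from a
# migration-workbench export rather than a hard-coded list.
vendors = [
    {"id": "V001", "name": "Acme Industries Pvt Ltd", "city": "Pune"},
    {"id": "V002", "name": "ACME Industries Private Limited", "city": "Pune"},
    {"id": "V003", "name": "Bharat Steel Works", "city": "Mumbai"},
]

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag candidate duplicates for steward review rather than auto-merging.
THRESHOLD = 0.8  # assumed cut-off, tuned per domain in practice
for left, right in combinations(vendors, 2):
    score = similarity(left["name"], right["name"])
    if score >= THRESHOLD and left["city"] == right["city"]:
        print(f"Possible duplicate: {left['id']} / {right['id']} (score={score:.2f})")
```

In practice the match score would feed a steward review queue, not an automated merge.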
Posted 1 week ago
8.0 - 13.0 years
15 - 20 Lacs
Pune
Hybrid
EY is hiring for a leading client for the Data Governance Senior Analyst role at the Pune location.

Role & responsibilities:
- Coordinate with Data Stewards/Data Owners to enable identification of critical data elements for SAP master data: Supplier/Finance/Bank master
- Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage for the relevant SAP master data
- Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for the relevant SAP master data (Finance, Supplier, and Customer master data)
- Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices
- Define and implement data models that align with business needs, and gather requirements for master data structures
- Design scalable and maintainable data models, ensuring data creation through a single source of truth
- Conduct data quality assessments and implement corrective actions to address data quality issues
- Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes
- Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment
- Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements
- Collaborate with the Data Governance Manager to advance the data governance agenda
- Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities
- Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG
- Monitor data governance activities, measure progress, and report on key metrics to senior management
- Conduct training sessions and create awareness programs to promote data governance within the organization
- Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, GL Accounts, etc.

Summary:
- SAP Master Data (Vendor, Customer, GL, Cost Center, etc.)
- Data Governance Implementation (Transactional & Master Data)
- Data Modeling & Architecture (S/4HANA, ECC)
- Data Cataloging, Lineage, and Quality Assessment
- Governance Forums & Change Advisory Boards
- Experience in S/4HANA Greenfield implementations
- Migration Experience (ECC to S/4 MDG)

Preferred candidate profile:
- 8-14 years in data governance and SAP master data
- Strong understanding of upstream/downstream data impacts
- Expert in data visualization
Posted 1 week ago
4.0 - 7.0 years
15 - 27 Lacs
Gurugram
Remote
Job Title: Data Steward
Location: Remote
Job Type: Full-time
Years of Experience: 6-8 years

About Straive: Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains. Data Analytics & AI Solutions, Data AI-Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building data analytics and AI enterprise capabilities for them. With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located in eight countries: India, Philippines, USA, Nicaragua, Vietnam, United Kingdom, and the company headquarters in Singapore.

Website: https://www.straive.com/

Job Summary: We are seeking a Lead Data Steward to support enterprise data initiatives by ensuring the accuracy, completeness, and quality of data pipelines across data platforms. The ideal candidate will bring strong expertise in data quality practices and data management principles to help establish trusted data foundations that drive business intelligence, analytics, and operational reporting. This role will be a critical part of our data team, supporting data governance efforts and enhancing our data architecture.

Key Responsibilities:
- Define, document, and maintain clear and consistent business definitions, data standards, and business rules for critical data elements within the assigned data domains
- Ensure accurate and up-to-date business metadata (e.g., data definitions, ownership, lineage, quality rules) is captured and maintained in the enterprise data catalog
- Design and maintain efficient ETL/ELT pipelines to ingest, transform, and deliver high-quality data across systems
- Deploy the master data governance framework and use the supporting data management tools
- Define data quality metrics and monitor data quality performance against established targets
- Conduct regular data governance reviews and provide recommendations for process improvements
- Ensure data accuracy, consistency, and integrity
- Implement and monitor data quality rules, validation checks, and exception handling (a rough sketch follows this listing)
- Collaborate with teams to align data delivery with business and reporting requirements
- Document data processes, standards, and lineage to support governance and compliance
- Transform raw, complex data into actionable insights through effective visualization techniques

Qualifications:
- 4+ years of experience in data governance and data governance operations
- Experience with data management tools (e.g., data catalogs, MDM systems, data quality platforms)
- Experience with data profiling and data quality assessment techniques
- Solid understanding of data quality principles, data governance concepts, and master data management
- Proficient in SQL and scripting for data transformation and troubleshooting
- Experience with data governance tools such as Collibra
- Proactive problem-solver with a strong sense of ownership and accountability
- Strong communication skills
- Bachelor's degree in Information Systems, Computer Science, or a related technical field
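As a rough illustration of the data-quality rules, validation checks, and pass-rate metrics described above, here is a minimal pandas sketch; the column names, rules, and sample data are invented for the example, not Straive's actual framework:

```python
import pandas as pd

# Hypothetical supplier master extract with deliberate quality issues.
df = pd.DataFrame({
    "supplier_id": ["S1", "S2", "S3", None],
    "country": ["IN", "US", "", "DE"],
    "bank_account": ["123", None, "456", "789"],
})

# Each rule is a boolean Series: True where the row passes the check.
rules = {
    "supplier_id_not_null": df["supplier_id"].notna(),
    "country_populated": df["country"].fillna("").str.len() > 0,
    "bank_account_present": df["bank_account"].notna(),
}

# Report a pass rate per rule and the offending rows -- the kind of metric
# a steward would publish against agreed data-quality targets.
for name, passed in rules.items():
    print(f"{name}: {passed.mean():.0%} pass")
    if not passed.all():
        print(df.loc[~passed])
```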
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
As a Data Engineer, you will play a key role in designing, developing, and maintaining our data infrastructure and pipelines. You will collaborate closely with the rest of our Data and Analytics Engineering team and with engineering and operations teams to ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company's growth. This is an exceptional opportunity for someone who relishes the chance to engage with cutting-edge technology, influence the development of a world-class data ecosystem, and work in a fast-paced environment on a small, high-impact team.

Our core data stack makes heavy use of Snowflake and dbt Core, orchestrated in Prefect and Argo in our broader AWS-based ecosystem. Most of our wide range of data sources are loaded with Fivetran or Segment, but we use custom Python when it's the right tool for the job. (A rough orchestration sketch follows this listing.)

What you'll do
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt
- Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions
- Design, build, and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines, and command-line tools for ad-hoc data evaluation
- Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy
- Optimize and tune data pipelines for improved performance, scalability, and reliability
- Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow
- Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members
- Implement data governance and security measures to ensure compliance with industry standards and regulations
- Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate

What will help you succeed

Must-haves
- 3+ years as a Data Engineer, data-adjacent Software Engineer, or a did-everything small-data-team member with a focus on building and maintaining data pipelines
- Strong Python skills, especially in the context of data orchestration
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform
- Proficiency in SQL
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools
- Excellent problem-solving and analytical skills with strong attention to detail
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
- Strong communication skills

Nice-to-haves
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools
- Experience with DevOps practices, especially CI/CD
- Previous experience managing enterprise-level data pipelines and working with large datasets
- Experience in the energy sector

Benefits:
- Competitive compensation based on market standards
- We are working on a hybrid model with a remote-first policy
- Apart from fixed base salary, candidates are eligible for the following benefits:
- Flexible leave policy
- Office in the heart of the city, in case you need to step in for any purpose
- Medical insurance (1+5 family members), with comprehensive coverage including accident policy and life insurance
- Annual performance cycle
- Quarterly team engagement activities and rewards & recognitions
- L&D programs to foster professional growth
- A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency
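To make the Snowflake + dbt Core + Prefect stack described in this listing concrete, here is a minimal orchestration sketch, assuming Prefect 2.x and a dbt project in a local `analytics` directory (both assumptions -- this is not the team's actual pipeline):

```python
import subprocess
from prefect import flow, task

@task(retries=2)
def run_dbt(command: str) -> None:
    # Shell out to dbt Core; a real deployment might use dbt's programmatic
    # API or a containerized task instead.
    subprocess.run(["dbt", command, "--project-dir", "analytics"], check=True)

@task
def validate_row_counts() -> None:
    # Placeholder for a post-load check against Snowflake (e.g., via
    # snowflake-connector-python); omitted to keep the sketch self-contained.
    print("row-count validation would run here")

@flow(name="daily-elt")
def daily_elt():
    run_dbt("run")   # build models
    run_dbt("test")  # run dbt tests
    validate_row_counts()

if __name__ == "__main__":
    daily_elt()
```

A production flow would add scheduling via a Prefect deployment and alerting on task failure.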
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Context: KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.

Job Description
Role Objective: The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. A sound understanding of databases to store structured and unstructured data with optimized modelling techniques is required, along with good exposure to the data catalog and data quality modules of any leading product (preferably Talend).

Location: Mumbai
Years of Experience: 3-5 yrs

Roles & Responsibilities:
- Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions
- Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape
- Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution; provide solutions to identified issues and continuously improve application performance
- Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices
- Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution

Technical Skills:
- Core tool exposure: Talend Data Integrator, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
- Core concepts: ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
- Cloud exposure: Experience working on one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
- SQL skills: Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best-practice understanding
- Soft skills: Very good communication and presentation skills; must be able to articulate thoughts and convince key stakeholders; should be able to guide and upskill team members

Good to Have:
- Programming languages: Knowledge and hands-on experience with languages like Python and R
- Relevant certifications related to the role
Posted 2 weeks ago
6.0 - 8.0 years
8 - 10 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, develop code, and perform unit testing. He/she will work closely with the Technical Architect, Business Analyst, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices comply with KPMG's best-practice policies and procedures. This role requires quick ramp-up on new technologies whenever required.

Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Role: Power BI Developer
Location: Chennai
Experience: 6 to 8 years

Responsibilities:
- Data Visualization: Design, develop, and maintain interactive data visualizations and reports using Power BI
- Data Modeling: Create and optimize data models to support business requirements
- Data Integration: Integrate Power BI reports into other applications for enhanced business capabilities
- Collaboration: Work with business stakeholders to understand their data visualization and business intelligence needs
- Performance Optimization: Monitor and optimize the performance of Power BI reports and dashboards
- Security: Implement row-level security on data and ensure compliance with data governance policies
- Advanced Calculations: Use DAX (Data Analysis Expressions) to perform advanced calculations on data sets
- Documentation: Document processes and methodologies used in developing Power BI solutions

Requirements:
- Experience: Proven experience in data analysis, data visualization, and business intelligence
- Technical Skills: Proficiency in Power BI, DAX, and data modeling
- Analytical Skills: Strong analytical and problem-solving skills
- Communication: Excellent communication and teamwork skills
- Certifications: Relevant certifications such as Microsoft Certified: Data Analyst Associate are a plus
Posted 2 weeks ago
3.0 - 7.0 years
5 - 8 Lacs
Bengaluru
Work from Office
JD for SAP BW on HANA.

Key Responsibilities:
- Design and implement data models using SAP BW on HANA / BW/4HANA
- Develop and maintain CompositeProviders, ADSOs, Open ODS Views, and InfoObjects
- Create ETL data flows using SAP BW ETL tools and integrate data from SAP (ECC/S/4HANA) and external systems
- Optimize performance of queries and data models leveraging HANA views and native HANA capabilities
- Work with BEx Queries and integrate with SAP Analytics Cloud (SAC) or other BI tools
- Implement and support real-time data replication using SLT or ODP frameworks
- Support data governance, data quality, and metadata management initiatives
- Participate in end-to-end project lifecycles: requirement gathering, design, development, testing, deployment, and support
- Collaborate with functional teams and business users to translate business needs into technical solutions
- Document technical designs, system configurations, and support procedures
Posted 2 weeks ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 2 weeks ago
3.0 - 6.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 2 weeks ago
3.0 - 8.0 years
5 - 8 Lacs
Hyderabad, Bengaluru
Work from Office
Role Overview: The OCI Data Catalog PoC Specialist will be responsible for designing, executing, and documenting a Proof of Concept (PoC) for Oracle Cloud Infrastructure (OCI) Data Catalog as part of the client's broader Data Governance strategy. The specialist will demonstrate the capabilities of OCI Data Catalog, assess its fit for the client's requirements, and provide recommendations for production implementation.

Key Responsibilities:
- Lead the end-to-end delivery of the OCI Data Catalog PoC, including requirements gathering, solution design, configuration, and demonstration (a minimal connectivity sketch follows this listing)
- Collaborate with client stakeholders to understand data governance objectives, data sources, and cataloguing needs
- Configure and integrate OCI Data Catalog with relevant data sources (e.g., Oracle Autonomous Database, Object Storage, on-premises databases)
- Develop and execute test cases to showcase metadata harvesting, data lineage, search, classification, and data stewardship features
- Integrate catalog output with the Marketplace application to export and automate metadata sharing
- Document PoC outcomes, lessons learned, and recommendations for next steps
- Provide knowledge transfer and training to client teams on OCI Data Catalog capabilities and usage
- Troubleshoot issues and liaise with Oracle support as needed during the PoC

Required Skills & Experience:
- 3+ years of experience in data governance, data management, or cloud data solutions
- Hands-on experience with Oracle Cloud Infrastructure (OCI), especially OCI Data Catalog
- Familiarity with data catalog concepts: metadata management, data lineage, data classification, and stewardship
- Experience integrating data catalogs with various data sources (cloud and on-premises)
- Strong analytical, problem-solving, and communication skills
- Ability to document technical findings and present to both technical and business stakeholders
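As a starting point for the PoC connectivity step mentioned above, a minimal sketch using the `oci` Python SDK's Data Catalog client; the config profile and compartment OCID are placeholders, and a real PoC would go on to trigger metadata harvesting against registered data assets:

```python
import oci

# Assumes the standard ~/.oci/config credentials file; the compartment
# OCID below is a placeholder, not a real identifier.
config = oci.config.from_file()
catalog_client = oci.data_catalog.DataCatalogClient(config)

COMPARTMENT_ID = "ocid1.compartment.oc1..example"  # placeholder OCID

# List catalogs in the compartment to confirm the PoC tenancy is reachable.
response = catalog_client.list_catalogs(compartment_id=COMPARTMENT_ID)
for catalog in response.data:
    print(catalog.display_name, catalog.lifecycle_state)
```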
Posted 2 weeks ago
10.0 - 15.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Skill: Retail - Omni Channel Product Manager
Experience: 10-15 years

Key Responsibilities:
- Effectively translate business strategies into product strategies, value increments, and product specifications to deliver against our core customer value propositions and our company strategic and financial goals
- Help manage the creation and maintenance of user stories and business requirements for new features and enhancements leveraging multiple work streams
- Prioritize new feature launches based on competitive analysis, industry trends, emerging technologies, and company vision
- Demonstrate empathy for the customer and steer discussions toward how to build customer trust
- Influence cross-functional team(s) without formal authority
- Articulate clear and concise requirements for new products and features
- Analyze complex data sets and leverage that analysis to make data-driven product decisions
- Work within a matrix organization, collaborating with business stakeholders, User Experience (UX) teams, engineers, and other relevant digital, technology, and business teams
- Improve value creation by defining and aligning on KPIs to measure success
- Work under rapid development cycles with medium to large teams to achieve a common goal
- Participate in day-to-day product team activities, driving high-quality customer experiences and exploring what is possible with technology

Basic Qualifications:
- Bachelor's degree in IT, Computer Science, Engineering, Business, Marketing or a related field, OR equivalent experience
- 3-4 years of Product Management experience, OR experience working with development, User Experience, Strategy, or related, including:
- 1+ year of experience with direct/indirect people management preferred
- 1+ year of relevant experience in strategy creation, customer-focused solutioning, cross-functional leadership, or related
- Experience successfully leading and delivering complex data products and initiatives
- Experience with data visualization tools, data warehousing, big data technologies, and/or machine learning
- Experience collaborating with cross-functional teams in a global and diverse environment

Preferred Qualifications:
- Experience working in an omni-channel retail environment
- Experience in setting ambitious, tangible, and measurable team objectives and key results
- Strong understanding of data governance and best practices
- Experience with Google Cloud Platform

Omni Channel, Product Managers, Retail
Posted 2 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 2 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Gurugram
Work from Office
As the Senior Manager - Analytics at Nutrabay, you will be the strategic owner of all analytics efforts across the organization. You'll be responsible for leading the data and analytics roadmap, building scalable data infrastructure, enabling data-driven product decisions, and managing analytics tools, team, and processes. This role demands a blend of technical expertise, product mindset, business acumen, and leadership to translate data into insights and insights into action.

You should apply if you have:
- 6-10 years of experience in data analytics/product analytics with a strong foundation in e-commerce or product-based companies
- Proven experience building analytics platforms, pipelines, dashboards, and experimentation frameworks from scratch
- Hands-on expertise in Power BI, SQL, Google Analytics (GA4), Mixpanel, Firebase, and Appsflyer
- Strong understanding of data modelling, data governance, and ETL frameworks (Airflow/DBT preferred)
- Solid understanding of product metrics, conversion funnels, A/B testing, LTV, retention, and user segmentation (a toy funnel-analysis sketch follows this listing)
- Experience working closely with engineering, product, marketing, and business leadership
- Ability to manage and mentor a team of analysts, driving both execution and strategy

You should not apply if you:
- Haven't worked with cloud data warehouses like BigQuery or Redshift
- Are unfamiliar with analytics tools like Power BI, Mixpanel, or Firebase
- Do not have experience managing cross-functional analytics projects or teams
- Are uncomfortable driving business and product decisions based on analytics insights
- Prefer execution-only roles with minimal strategic involvement

Skills Required:
- Data Visualization & BI: Power BI, Google Looker Studio
- SQL & Data Warehousing: BigQuery, AWS Redshift
- Analytics Tools: GA4, Firebase, Mixpanel, Appsflyer
- ETL & Data Modeling: Airflow, DBT, CDP, custom pipeline architecture
- Product Analytics: Funnel analysis, Retention, A/B Testing, Journey Mapping
- Python (preferred for automation and advanced analytics)
- Data Governance & Compliance
- Stakeholder Communication & Data Storytelling
- Team Leadership & Strategic Thinking

What will you do?
- Lead and scale the analytics function across the organization, including data engineering, BI, and product analytics
- Own and drive the analytics roadmap aligned with business and product OKRs
- Build a robust analytics infrastructure to support real-time decision-making
- Define and implement key product and business KPIs for all departments
- Work with PMs and engineers to implement event tracking systems across platforms
- Run A/B tests, conversion analysis, and user behaviour research to guide product strategy
- Ensure data quality, privacy, and compliance with IT and security policies
- Develop self-serve dashboards and reporting for leadership and operations teams
- Hire, mentor, and manage a high-performing analytics team

Work Experience: 6-10 years in data or product analytics, with at least 3 years of experience in a leadership or team management role. Prior experience in a fast-paced, product-first, or e-commerce environment is a strong advantage.

Working Days: Monday to Friday (Full-Time, Work from Office)
Location: Golf Course Road, Gurugram, Haryana

Perks:
- Opportunity to build the analytics ecosystem from the ground up
- Work closely with founders and product leadership to shape company direction
- High learning and personal growth opportunities
- Flexible timings and an open, transparent culture
- Freedom to drive experimentation, decision-making, and innovation

Why Nutrabay: We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work. As a part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence, and be a part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a unique growth opportunity. Here you will learn how to maximise the potential of your available resources. You will get the opportunity to do work that helps you master a variety of transferable skills, or skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty and that will create more loyal employees than a company that simply shells out cash. We trust our employees and their voice and ask for their opinions on important business issues.

About Nutrabay: Nutrabay is the largest health & nutrition store in India. Our vision is to keep growing, maintain a sustainable business model, and continue to be the market leader in this segment by launching many innovative products. We are proud to have served over 1 million customers so far, and our family is constantly growing. We have built a complex and high-converting eCommerce system, and our monthly traffic has grown to a million. We are looking to build a visionary and agile team to help fuel our growth and contribute towards further advancing the continuously evolving product.

Funding: We raised $5 Million in a Series A funding.
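To illustrate the funnel-analysis skill this listing asks for, a toy pandas sketch over an invented event log; the event names and funnel order are assumptions, not Nutrabay's actual tracking plan:

```python
import pandas as pd

# Toy event log: one row per (user, event).
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "event":   ["view_product", "add_to_cart", "purchase",
                "view_product", "add_to_cart", "view_product"],
})

funnel = ["view_product", "add_to_cart", "purchase"]
reached = [set(events.loc[events["event"] == step, "user_id"]) for step in funnel]

# Report step-to-step conversion -- the basic shape of a funnel dashboard.
for i, step in enumerate(funnel):
    users = len(reached[i])
    if i == 0:
        print(f"{step}: {users} users")
    else:
        prev = len(reached[i - 1])
        rate = users / prev if prev else 0.0
        print(f"{step}: {users} users ({rate:.0%} of previous step)")
```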
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Who we are: Perch Energy is a leading community solar servicer on a mission to make renewable energy more accessible and equitable for all. Community solar breaks down the traditional barriers preventing most people from participating in the renewable energy economy. We work in numerous states across the US to bring community solar to communities and individuals who can most benefit from a more inclusive energy system. By managing the customer experience for solar farm owners, Perch is able to bring electricity bill savings to the masses, from renters and homeowners to businesses, institutions, municipalities and more, by connecting them to community solar projects in their area. Perch isn't just a for-profit company, we're a for-purpose company accelerating the shift to renewables nationwide. Everyone deserves to benefit from clean energy. Everyone has a place on this Perch!

What we're looking for: As a Data Engineer, you will play a key role in designing, developing, and maintaining our data infrastructure and pipelines. You will collaborate closely with the rest of our Data and Analytics Engineering team and with engineering and operations teams to ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company's growth. This is an exceptional opportunity for someone who relishes the chance to engage with cutting-edge technology, influence the development of a world-class data ecosystem, and work in a fast-paced environment on a small, high-impact team.

Our core data stack makes heavy use of Snowflake and dbt Core, orchestrated in Prefect and Argo in our broader AWS-based ecosystem. Most of our wide range of data sources are loaded with Fivetran or Segment, but we use custom Python when it's the right tool for the job.

What you'll do
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt
- Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions
- Design, build, and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines, and command-line tools for ad-hoc data evaluation
- Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy
- Optimize and tune data pipelines for improved performance, scalability, and reliability
- Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow
- Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members
- Implement data governance and security measures to ensure compliance with industry standards and regulations
- Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate

What will help you succeed

Must-haves
- 3+ years as a Data Engineer, data-adjacent Software Engineer, or a did-everything small-data-team member with a focus on building and maintaining data pipelines
- Strong Python skills, especially in the context of data orchestration
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform
- Proficiency in SQL
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools
- Excellent problem-solving and analytical skills with strong attention to detail
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
- Strong communication skills

Nice-to-haves
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools
- Experience with DevOps practices, especially CI/CD
- Previous experience managing enterprise-level data pipelines and working with large datasets
- Experience in the energy sector

Benefits:
- Competitive compensation based on market standards
- We are working on a hybrid model with a remote-first policy
- Apart from fixed base salary, candidates are eligible for the following benefits:
- Flexible leave policy
- Office in the heart of the city, in case you need to step in for any purpose
- Medical insurance (1+5 family members), with comprehensive coverage including accident policy and life insurance
- Annual performance cycle
- Quarterly team engagement activities and rewards & recognitions
- L&D programs to foster professional growth
- A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency

Eliminating carbon footprints, eliminating carbon copies. Here at Perch, we cultivate diversity, celebrate individuality, and believe unique perspectives are key to our collective success in creating a clean energy future. Perch is committed to equal employment opportunities regardless of race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, protected veteran status, or any status protected by applicable federal, state, or local law. While we are currently unable to consider candidates who will require visa sponsorship, we welcome applications from all qualified candidates eligible to work in India. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Thank you.
Posted 2 weeks ago
6.0 - 11.0 years
17 - 19 Lacs
Bengaluru
Work from Office
Minimum of 6+ years of experience in the IT industry.
- Creating data models, building data pipelines, and deploying fully operational data warehouses within Snowflake
- Writing and optimizing SQL queries, tuning database performance, and identifying and resolving performance bottlenecks (a minimal query sketch follows this listing)
- Integrating Snowflake with other tools and platforms, including ETL/ELT processes and third-party applications
- Implementing data governance policies, maintaining data integrity, and managing access controls
- Creating and maintaining technical documentation for data solutions, including data models, architecture, and processes
- Familiarity with cloud platforms and their integration with Snowflake
- Basic coding skills in languages like Python or Java, helpful for scripting and automation
- Outstanding ability to communicate, both verbally and in writing
- Strong analytical and problem-solving skills
- Experience in the Banking domain
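A minimal sketch of the day-to-day Snowflake querying described above, using the official snowflake-connector-python package; the account details, table, and columns are placeholders, not a real banking schema:

```python
import snowflake.connector

# Connection parameters are placeholders; real credentials would come from
# a secrets manager, never hard-coded.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="BANKING_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Bind variables (%s) avoid SQL injection and keep the query cacheable.
    cur.execute(
        "SELECT account_id, SUM(amount) FROM transactions "
        "WHERE booking_date >= %s GROUP BY account_id",
        ("2024-01-01",),
    )
    for row in cur.fetchmany(10):
        print(row)
finally:
    conn.close()
```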
Posted 2 weeks ago
5.0 - 8.0 years
8 - 12 Lacs
Pune
Work from Office
Role: The purpose of this role is to provide strategic guidance and recommendations on pricing of contracts being executed in the assigned SBU, while maintaining the competitive advantage and profit margins. Responsible for ensuring SoW adherence to internal guidelines for all contracts in the SBU.

Do:
- Contract pricing review and advice
- Pricing strategy deployment
  - Drive the deployment of pricing strategy for the SBU/Vertical/Account in line with the overall pricing strategy for Wipro
  - Partner with and educate the Business Leaders about adherence to the pricing strategy, internal guidelines and SoW
- Business partnering for advice on contract commercials
  - Work closely with pre-sales and BU leadership to review contracts about to be finalized and provide inputs on structuring, payment milestones and terms & conditions
  - Review the Resource Loading Sheet (RLS) submitted by the pre-sales/delivery team and work on the contract pricing
  - Collaborate with the business leaders to propose competitive pricing based on the effort estimate, considering the cost of resources, skills availability and identified premium skills
- Review adherence to the contract's commercial terms and conditions
  - Review the commercial terms and conditions proposed in the SoW
  - Ensure they are aligned with internal guidelines for credit period and the existing MSAs, and recommend payment milestones
- Ensure accurate revenue recognition and provide forecasts
  - Implement and drive adherence to revenue recognition guidelines
  - Ensure revenue recognition by the BFMs/Service Line Finance Managers is done as per IFRS standards
  - Partner with Finance Managers and educate them on revenue recognition standards and internal guidelines of Wipro
  - Provide accurate and timely forecasts of revenue for the assigned SBU/Vertical/Cluster/Accounts
- Validation of order booking
  - Ensure adherence to order booking guidelines
  - Oversee and ensure all documents, approvals and guidelines are adhered to before the order is confirmed in the books of accounts
  - Highlight any deviations from internal guidelines/standards and work with the concerned teams to address them
- Team Management
  - Clearly define the expectations for the team
  - Assign goals for the team, conduct timely performance reviews and provide constructive feedback to own direct reports
  - Guide the team members in acquiring relevant knowledge and developing their professional competence
  - Educate and build awareness in the team of Wipro guidelines on revenue recognition, pricing strategy, contract terms and MSAs
  - Ensure that Performance Nxt is followed for the entire team
- Employee Satisfaction and Engagement
  - Lead and drive engagement initiatives for the team
  - Track team satisfaction scores and identify initiatives to build engagement within the team

1. Financials: Monetizing Wipro's efforts and value additions; comprehensiveness of pricing recommendations; accurate inputs to forecasting of revenue as per revenue recognition guidelines
2. Internal Customer: Completeness of contracts checklist before order booking
3. Team Management: Team attrition %, employee satisfaction score, localization %, gender diversity %; training and skill building of the team on pricing operations

Mandatory Skills: Data Governance.
Experience: 5-8 Years.
Posted 2 weeks ago
0.0 - 5.0 years
7 - 12 Lacs
Mumbai
Work from Office
Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment.

Job Summary: As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business.

Job Responsibilities:
- Define data quality standards
- Investigate data quality issues
- Collaborate with technology partners
- Establish dashboards and metrics
- Support data view and lineage tools
- Embed data quality in UAT cycles
- Assist Operations users with data access
- Work with project teams on implementations
- Implement data ownership processes
- Deliver tools and training for data owners
- Champion improvements to data quality

Required Qualifications, Capabilities, and Skills:
- Engage effectively across teams
- Understand data components for IBOR
- Comprehend trade lifecycle and cash management
- Possess technical data management skills
- Solve operational and technical issues
- Deliver with limited supervision
- Partner in a virtual team environment

Preferred Qualifications, Capabilities, and Skills:
- Demonstrate strong communication skills
- Exhibit leadership in data governance
- Adapt to changing project requirements
- Analyze complex data sets
- Implement innovative data solutions
- Foster collaboration across departments
- Drive continuous improvement initiatives
Posted 2 weeks ago
7.0 - 8.0 years
5 - 8 Lacs
Hyderabad
Remote
Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow)
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse
- Document metadata, data lineage, and business logic behind data structures and flows
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience
- DBT (Data Build Tool) for data transformation and pipeline development
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.)
- Hands-on experience with Erwin Data Modeler (logical and physical modeling)
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage)
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver

Good To Have:
- Experience with CI/CD tools and DevOps for data environments
- Familiarity with data governance, security, and privacy practices
- Exposure to Agile methodologies and working in distributed teams
- Knowledge of Python for data engineering tasks and orchestration scripts

Soft Skills:
- Excellent problem-solving and analytical skills
- Strong communication and stakeholder management
- Self-driven with the ability to work independently in a remote setup
Posted 2 weeks ago
15.0 - 20.0 years
18 - 22 Lacs
Hyderabad
Remote
Contract: 6 months

Position Summary: We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems
- Define data governance strategies, master data management (MDM), and metadata standards
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation (a rough reconciliation sketch follows this listing)
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms
- Establish data quality frameworks and monitoring practices
- Conduct impact assessments and ensure scalability of data architecture
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA)

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud & SAC, and integrating these with AWS Data Lake (S3)
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.)
- Expertise in data governance, master data strategy, and data lifecycle management
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus
- Strong analytical and communication skills to work across business and IT stakeholders

Preferred Certifications:
- SAP Certified Technology Associate - SAP S/4HANA / Datasphere
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)
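As a rough illustration of the data validation and reconciliation step in a migration cutover, a hedged pandas sketch; the key column, compared fields, and sample records are assumptions, not a real S/4HANA load format:

```python
import pandas as pd

# In a real cutover these frames would be loaded from legacy and target
# extract files; hard-coded here to keep the sketch runnable.
legacy = pd.DataFrame({
    "vendor_id": ["V1", "V2", "V3"],
    "name": ["Acme", "Bharat Steel", "Zenith"],
    "country": ["IN", "IN", "DE"],
})
target = pd.DataFrame({
    "vendor_id": ["V1", "V2", "V4"],
    "name": ["Acme", "Bharat Steel Works", "Quartz"],
    "country": ["IN", "IN", "US"],
})

key = "vendor_id"  # hypothetical business key

# 1. Row-count check.
print(f"legacy rows: {len(legacy)}, target rows: {len(target)}")

# 2. Key-level check: records dropped or invented during migration.
print("missing in target:", set(legacy[key]) - set(target[key]))
print("unexpected in target:", set(target[key]) - set(legacy[key]))

# 3. Field-level check on the intersection, column by column.
merged = legacy.merge(target, on=key, suffixes=("_old", "_new"))
for col in ("name", "country"):
    mismatches = merged[merged[f"{col}_old"] != merged[f"{col}_new"]]
    print(f"{col}: {len(mismatches)} mismatching record(s)")
```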
Posted 2 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities :

Ontology Development :
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling :
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation :
- Design and build knowledge graphs based on ontologies and data models (a minimal RDF sketch follows this listing).
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance :
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication :
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications :

Education :
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience :
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills :
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
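The knowledge-graph work can be made tangible with a minimal sketch in Python using rdflib, a common open-source RDF library (the posting itself does not name a specific toolkit). The example.com namespace, class names, and instance data below are invented placeholders, not BFO or CCO terms; a real implementation would load the published BFO/CCO OWL files instead.

```python
# A minimal knowledge-graph sketch with rdflib: a tiny schema (T-box),
# one instance (A-box), and a SPARQL query over the result.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.com/ontology#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# T-box: a Supplier is a kind of Organization.
g.add((EX.Organization, RDF.type, RDFS.Class))
g.add((EX.Supplier, RDFS.subClassOf, EX.Organization))

# A-box: one supplier instance with a label and a relationship.
g.add((EX.acme, RDF.type, EX.Supplier))
g.add((EX.acme, RDFS.label, Literal("Acme Components")))
g.add((EX.acme, EX.supplies, EX.widget_line))

# SPARQL over the graph: find every supplier and its label.
results = g.query(
    """
    SELECT ?s ?label WHERE {
        ?s a ex:Supplier ;
           rdfs:label ?label .
    }
    """,
    initNs={"ex": EX, "rdfs": RDFS},
)
for row in results:
    print(row.s, row.label)  # http://example.com/ontology#acme Acme Components
```

The same triple-and-query pattern scales up when the graph is backed by a dedicated triple store such as GraphDB or Stardog rather than an in-memory rdflib Graph.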
Posted 2 weeks ago
7.0 - 10.0 years
9 - 12 Lacs
Ahmedabad
Work from Office
Location : Remote (India)
Employment Type : Contract (Remote)
Experience Required : 7+ Years

Job Summary :
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL: query optimization, complex joins, window functions, etc. (a window-function example follows this listing).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
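Since the must-have list leans on SQL window functions, here is a small hedged example. It uses Python's built-in sqlite3 module purely so the snippet is self-contained and runnable anywhere; the ROW_NUMBER() deduplication pattern itself is a common idiom that carries over directly to Snowflake. All table and column names are invented.

```python
# Deduplicating to the most recent record per key with ROW_NUMBER().
# Requires SQLite 3.25+ for window functions (bundled with modern Python).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customer_loads (customer_id INT, loaded_at TEXT, email TEXT);
    INSERT INTO customer_loads VALUES
        (1, '2024-01-01', 'old@example.com'),
        (1, '2024-03-01', 'new@example.com'),
        (2, '2024-02-15', 'only@example.com');
    """
)

# Keep only the most recent row per customer_id.
rows = conn.execute(
    """
    SELECT customer_id, email FROM (
        SELECT customer_id, email,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY loaded_at DESC
               ) AS rn
        FROM customer_loads
    ) AS t
    WHERE rn = 1
    ORDER BY customer_id;
    """
).fetchall()

print(rows)  # [(1, 'new@example.com'), (2, 'only@example.com')]
```

In a DBT project, the inner query would typically live in a staging model, with `rn = 1` applied in the downstream model that feeds the warehouse's cleaned layer.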
Posted 2 weeks ago