3.0 - 8.0 years
14 - 24 Lacs
Bengaluru
Hybrid
Data Governance & Quality Specialist
Location: Bangalore (Hybrid) | Experience: 3-8 yrs | Joining: Immediate to 30 days

What You'll Do:
- Define and enforce data-governance policies (BCBS 239/GDPR) across credit-risk datasets
- Design, monitor and report on data-quality KPIs; perform profiling and root-cause analysis in SAS/SQL (see the sketch below)
- Collaborate with data stewards, risk teams and auditors to remediate data issues
- Develop governance artifacts: data-lineage maps, stewardship RACI, council presentations

Must Have:
- 3-8 yrs in data-governance or data-quality roles (financial services)
- Advanced SAS for data profiling and reporting; strong SQL skills
- Hands-on experience with governance frameworks and regulatory requirements
- Excellent stakeholder-management and documentation abilities

Nice to Have:
- Experience with Collibra, Informatica or Talend
- Exposure to credit-risk model inputs (PD/LGD/EAD)
- Automation via SAS macros or Python scripting

If interested, kindly share your resume at simran.salhotra@portraypeople.com
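The profiling and KPI work described above would typically run in SAS or SQL as the posting specifies; purely as an illustration of the idea, here is a minimal sketch in Python/pandas. The file name, the loan_id key column, and the two KPI choices are all assumptions.

```python
import pandas as pd

# Hypothetical extract of a credit-risk dataset; the file and column
# names are placeholders for illustration.
loans = pd.read_csv("loans_extract.csv")

# Two simple data-quality KPIs: completeness (non-null rate per column)
# and uniqueness of the assumed business key.
completeness = loans.notna().mean()
dup_rate = loans.duplicated(subset=["loan_id"]).mean()

print(completeness.sort_values())
print(f"duplicate loan_id rate: {dup_rate:.2%}")
```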
Posted 3 weeks ago
3.0 - 5.0 years
9 - 13 Lacs
Mumbai, Pune, Gurugram
Work from Office
Key Responsibilities:
- Lead support for ACV (Annual Contract Value) analysis by gathering, validating, and analyzing related data to assist in accurate financial reporting and business insights.
- Assist in creating and maintaining Power BI dashboards to deliver accurate and relevant business insights.
- Perform thorough data cleansing, validation, and reconciliation to maintain high data quality; proactively investigate and resolve discrepancies.
- Work closely with finance teams to ensure accurate data flow and timely reporting, and identify opportunities to enhance reporting efficiency and quality.

Skills & Experience:
- 3 to 5 years in data analysis, finance support, or related roles.
- Advanced Excel skills, including pivot tables, formulas, and Power Query.
- Proficiency in Power BI dashboard creation and data visualization.
- Knowledge of ACV and familiarity with US GAAP preferred.
- Strong attention to detail, problem-solving ability, and effective communication.
Posted 3 weeks ago
3.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Sales Technology is a passionate team of engineers working diligently to solve a variety of customer-centric problems and challenges. We unleash the potential of other Atlassians, working closely with the Sales, Partner & Marketing Operations teams. As a team, we build intelligent sales and support systems, analytics, and smart tools which leverage a diverse landscape of Atlassian services such as RPA, Salesforce-Slack integration, etc. As a Salesforce developer, you will provide your expertise to other Atlassians, design/implement Salesforce solutions, and implement integrations with Salesforce by leveraging MuleSoft and micro-services. This role presents an excellent opportunity for the right individual to play a vital role in helping build Go-to-Market selling motions and partner experiences. A strong background in the Lead Management, Opportunity-to-Quote and Channel Enablement domains, with a track record of success in defining enterprise-scale solutions, is essential.

Key responsibilities include, but are not limited to:
- Be involved in all aspects of delivery, including supporting our customer-facing community with Sales Cloud environments in Lightning.
- Drive, develop and maintain small-to-medium project deliverables.
- Drive standardization, process consistency, and data quality across business processes.
- Build and maintain effective working relationships with the SalesTech Product Management team and business stakeholders.
- Ensure software developed adheres to best practices and quality standards through code and design reviews.
- Work with peers to analyze technical design options and implement solutions that are efficient, scalable, and meet the acceptance criteria.

Qualifications:
- 3-5 years of solution, design and development experience building solutions on Experience Cloud/Salesforce Partner Community.
- Minimum 3 years' experience working with Salesforce Lightning, creating custom Lightning components in the Aura and LWC frameworks, and working with SLDS and JavaScript.
- Full life-cycle experience in solution definition, development, and Apex/unit testing of the Salesforce application.
- Deployment experience using ANT and SFDX; familiarity with migration tools, change sets, VS Code, and Workbench for Salesforce preferred.
- Analysis skills to understand the business problem and propose the best technical solution.
- Enterprise implementations of complex SFDC applications, with reports, workflows, and work across several SFDC objects.
- Expertise in Apex, Visualforce, Web Services, SOQL, SOSL, AJAX, XML, JavaScript and HTML.
- Develop and maintain Visualforce,
Posted 3 weeks ago
8.0 - 13.0 years
30 - 35 Lacs
Bengaluru
Work from Office
As a Data Engineer at Kinara Capital, you will play a critical role in building and maintaining the data infrastructure necessary for effective data analysis and decision-making. You will collaborate with data scientists, analysts, and other stakeholders to support data-driven initiatives. Your primary responsibilities will include designing and implementing robust data pipelines, ensuring data quality and integrity, and optimizing data storage and retrieval processes.

Key Responsibilities:
- Develop, construct, test, and maintain data architectures, including databases and large-scale processing systems.
- Create and manage data pipelines to ingest, process, and transform data from various sources (see the sketch below).
- Collaborate with data scientists and analysts to understand data needs and develop solutions to meet those needs.
- Monitor data quality and implement data governance best practices.
- Optimize SQL queries and improve performance of data-processing systems.
- Ensure data privacy and security standards are met and maintained.
- Document data processes and pipelines to facilitate knowledge sharing within the team.

Skills and Tools Required:
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
- Strong knowledge of SQL and experience with relational databases like MySQL, PostgreSQL, or Oracle.
- Familiarity with big data technologies like Apache Hadoop, Apache Spark, or Apache Kafka.
- Understanding of data modeling and ETL (Extract, Transform, Load) processes.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Strong analytical and problem-solving skills, with attention to detail.
- Excellent communication skills to work collaboratively with cross-functional teams.

Join Kinara Capital and leverage your data engineering skills to help drive innovative solutions and empower businesses through data.
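As a toy illustration of the pipeline responsibility referenced above, here is a minimal extract-transform-load sketch in Python. The source file, column names, and SQLite target are assumptions; a production pipeline would use the warehouse and big-data tools named in the posting.

```python
import sqlite3
import pandas as pd

# Extract: read raw events; file and column names are placeholders.
raw = pd.read_csv("raw_events.csv", parse_dates=["event_time"])

# Transform: enforce a basic data-quality gate and derive a partition column.
clean = (
    raw.dropna(subset=["customer_id"])
       .assign(event_date=lambda d: d["event_time"].dt.date.astype(str))
)

# Load: write the cleaned set into a reporting table.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("events_clean", conn, if_exists="replace", index=False)
```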
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Working at Atlassian: Atlassians can choose where they work - whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities:
Atlassian is looking for a Data Engineer to join our Data Engineering team, which is responsible for building our data lake, maintaining our big data pipelines/services, and facilitating the movement of billions of messages each day. We work directly with business stakeholders and plenty of platform and engineering teams to enable growth and retention strategies at Atlassian. We are looking for an open-minded, structured thinker who is passionate about building services that scale. On a typical day you will help our stakeholder teams ingest data faster into our data lake, find ways to make our data pipelines more efficient, or even come up with ideas to help instigate self-serve data engineering within the company. You'll get the opportunity to work on an AWS-based data lake backed by the full suite of open-source projects such as Spark and Airflow. We are a team with little legacy in our tech stack, and as a result you'll spend less time paying off technical debt and more time identifying ways to make our platform better and improve our users' experience.

Qualifications:
As a Data Engineer in the DE team, you will have the opportunity to apply your strong technical experience building highly reliable services to managing and orchestrating a multi-petabyte-scale data lake. You enjoy working in a fast-paced environment and are able to take vague requirements and transform them into solid solutions. You are motivated by solving challenging problems, where creativity is as crucial as your ability to write code and test cases.

On your first day, we'll expect you to have:
- A BS in Computer Science or equivalent experience
- 5+ years of professional experience as a Software Engineer or Data Engineer
- Strong programming skills (Python, Java or Scala preferred)
- Experience writing SQL, structuring data, and data storage practices
- Experience with data modeling
- Knowledge of data warehousing concepts
- Experience building data pipelines and platforms
- Experience with Databricks, Spark, Hive, Airflow and other streaming technologies to process incredible volumes of streaming data
- Experience in modern software development practices (Agile, TDD, CI/CD)
- Strong focus on data quality and experience with internal/external tools/frameworks to automatically detect data issues and anomalies
- A willingness to accept failure, learn and try again
- An open mind to try solutions that may seem crazy at first
- Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS and the like)

It's preferred that you have:
- Experience building self-service tooling and platforms
- Built and designed Kappa-architecture platforms
- Contributed to open-source projects (e.g., operators in Airflow)
- Experience with Data Build Tool (dbt)

Benefits & Perks: Atlassian offers a wide range of perks and benefits designed to support you and your family and to help you engage with your local community. Our offerings include health and wellbeing resources, paid volunteer days, and so much more. To learn more, visit
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Gurugram
Work from Office
About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About the role: We're seeking an experienced Data Scientist to join our growing Enterprise Data Science & AI team. In this role, you will serve as a technical expert in developing and implementing advanced machine learning/AI models and analytical solutions to improve Gartner's data quality and drive business value. You'll collaborate with cross-functional teams to identify data science and AI opportunities, solve complex business problems, and drive data-driven decision making across the organization.

What you'll do:
- Design, develop and deploy AI solutions to elevate the quality and depth of Gartner's data assets, ensuring high accuracy, reliability and completeness
- Lead the development of various predictive and optimization use cases
- Follow best practices for data science methodologies, including coding standards, model validation and model monitoring (see the sketch below)
- Partner with business users to understand business requirements and translate data science solutions into business action items
- Collaborate with IT teams to identify and design data pipelines and infrastructure that support data science model use cases
- Define success metrics and KPIs for data science projects and initiatives

What you'll need:
- Bachelor's degree or higher in Data Science, Computer Science, Statistics, Information Management, or a related quantitative field
- 3-5 years of experience in data science, with a proven track record of successfully delivering data science and AI projects in business environments
- Strong programming skills in Python and experience with data science libraries (scikit-learn, pandas, NumPy, PyTorch/TensorFlow)
- Strong experience with large language models (LLMs) and generative AI applications
- Proficiency in SQL and experience with cloud-based data platforms (AWS or Azure)
- Excellent communication skills with the ability to translate complex technical concepts for non-technical audiences
- Knowledge of software development practices (version control, CI/CD, code review)
- Experience with MLOps tools and best practices for model deployment and monitoring

Who you are:
- Effective time management skills and ability to meet deadlines
- Excellent communication skills interacting with technical and business audiences
- Excellent organization, multitasking, and prioritization skills
- Willingness and aptitude to embrace new technologies/ideas and master concepts rapidly
- Intellectual curiosity, passion for technology and keeping up with new trends
- Delivering project work on time and within budget with high quality
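To make the model-validation practice referenced above concrete, here is a minimal cross-validation sketch using scikit-learn; the synthetic dataset and the AUC metric are assumptions chosen for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for real features and labels.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# 5-fold cross-validation gives a more honest performance estimate
# than a single train/test split.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"AUC per fold: {scores.round(3)}; mean: {scores.mean():.3f}")
```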
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Job Summary: As a Senior Pega Engineer, you will play a crucial role in designing, developing, and implementing Pega applications that drive business efficiency and enhance customer experiences. You will be responsible for leveraging Pega's capabilities to build robust, scalable solutions that meet the needs of our organization. Your expertise will directly contribute to the success of our projects and the overall digital transformation initiatives.

Responsibilities:

Design and Build Pega Applications:
- Collaborate with business stakeholders and cross-functional teams to gather requirements and design effective Pega solutions.
- Develop and configure Pega applications using best practices to ensure high performance and maintainability.
- Implement Pega features such as case management, workflows, and user interfaces to enhance user experience.

Integration and Data Management:
- Integrate Pega applications with external systems and databases to ensure seamless data flow and accessibility.
- Design and implement data models and data transformation processes to support application functionality.
- Ensure data quality and consistency through effective validation and cleansing processes.

Performance Optimization:
- Monitor application performance and identify areas for improvement.
- Optimize Pega applications for scalability and responsiveness, ensuring they can handle increasing user loads.
- Troubleshoot and resolve performance issues in a timely manner.

Automation and Deployment:
- Implement automated testing and deployment processes to streamline application delivery.
- Utilize tools like Jenkins or Pega's Deployment Manager for continuous integration and deployment (CI/CD).
- Maintain version control and documentation for all application changes.

Security and Compliance:
- Apply security best practices to protect sensitive data within Pega applications.
- Ensure compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies.
- Collaborate with security teams to identify and mitigate vulnerabilities.

Documentation and Knowledge Sharing:
- Document application architecture, design decisions, and development processes.
- Share knowledge with team members through documentation, training sessions, and code reviews.
- Mentor junior engineers and foster a culture of continuous learning within the team.

Experience:
- Minimum of 5 years of industry experience, including at least 3 years of hands-on experience with Pega development.
- Strong understanding of Pega platform capabilities, mainly Pega CDH, business process management (BPM), and customer relationship management (CRM).
- Experience with integration techniques and tools (e.g., REST, SOAP, connectors).

Education: Bachelor's degree in Computer Science, Information Systems, or a related field.

Skills:
- Proficiency in Pega development tools and methodologies (e.g., Pega Express, Pega 8.x).
- Knowledge of Java for custom coding and integration tasks.
- Familiarity with Agile methodologies and experience working in Agile teams.
- Strong analytical and problem-solving skills, with attention to detail.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Working at Atlassian: Atlassians can choose where they work - whether in an office, from home, or a combination of the two. That way, Atlassians have more control over supporting their family, personal goals, and other priorities. We can hire people in any country where we have a legal entity. Interviews and onboarding are conducted virtually, a part of being a distributed-first company.

Responsibilities:
Sales Technology is a passionate team of engineers working diligently to solve a variety of customer-centric problems and challenges. We unleash the potential of other Atlassians, working closely with the Sales, Partner & Marketing Operations teams. As a team, we build intelligent sales and support systems, analytics, and smart tools which leverage a diverse landscape of Atlassian services such as RPA, Salesforce-Slack integration, etc. As a Salesforce developer, you will provide your expertise to other Atlassians, design/implement Salesforce solutions, and implement integrations with Salesforce by leveraging MuleSoft and micro-services. This role presents an excellent opportunity for the right individual to play a vital role in helping build Go-to-Market selling motions and partner experiences. A strong background in the Lead Management, Opportunity-to-Quote and Channel Enablement domains, with a track record of success in defining enterprise-scale solutions, is essential.

Key responsibilities include, but are not limited to:
- Be involved in all aspects of delivery, including supporting our customer-facing community with Sales Cloud environments in Lightning.
- Drive, develop and maintain small-to-medium project deliverables.
- Drive standardization, process consistency, and data quality across business processes.
- Build and maintain effective working relationships with the SalesTech Product Management team and business stakeholders.
- Ensure software developed adheres to best practices and quality standards through code and design reviews.
- Work with peers to analyze technical design options and implement solutions that are efficient, scalable, and meet the acceptance criteria.

Qualifications:
- 3-5 years of solution, design and development experience building solutions on Experience Cloud/Salesforce Partner Community.
- Minimum 3 years' experience working with Salesforce Lightning, creating custom Lightning components in the Aura and LWC frameworks, and working with SLDS and JavaScript.
- Full life-cycle experience in solution definition, development, and Apex/unit testing of the Salesforce application.
- Deployment experience using ANT and SFDX; familiarity with migration tools, change sets, VS Code, and Workbench for Salesforce preferred.
- Analysis skills to understand the business problem and propose the best technical solution.
- Enterprise implementations of complex SFDC applications, with reports, workflows, and work across several SFDC objects.
- Expertise in Apex, Visualforce, Web Services, SOQL, SOSL, AJAX, XML, JavaScript and HTML.
- Develop and maintain Visualforce,
Posted 3 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai, Nagpur, Thane
Work from Office
Role: Brokerage is a leading global financial services firm providing a wide range of investment banking, securities, investment management and wealth management services. We advise, originate, trade, manage and distribute capital for governments, institutions and individuals. As a market leader, the talent and passion of our people is critical to our success. Together, we share a common set of values rooted in integrity, excellence and a strong team ethic. We provide you a superior foundation for building a professional career where you can learn, achieve and grow.

Technology is the key differentiator that ensures we manage our global businesses and serve clients on a market-leading platform that is resilient, safe, efficient, smart, fast and flexible. Technology redefines how we do business in global, complex and dynamic financial markets. We have a large number of award-winning technology platforms that help propel our Firm's businesses to the top of the market. We have built strong techno-functional teams which partner with our offices globally, taking global ownership of systems and products. We have a vibrant and diverse mix of technologists working on different technologies and functional domains. There is a large focus on innovation, inclusion, giving back to the community and sharing knowledge.

The Data Center of Excellence (COE) is a group within the Cyber Data Risk & Resilience Division that focuses on data as a key priority of Brokerage's overall strategy. The Data COE develops common principles for ownership, distribution and consumption of data, tooling and standards for data accessibility, and a framework for governing data, and helps address data architecture and data quality issues for new and existing initiatives at the firm by collaborating heavily with various business units and technology functions.

We are looking for an experienced front-end developer to join the Data COE Tooling fleet as we expand and pursue rapid delivery driven by firmwide and regulatory initiatives. The candidate will be expected to work at a senior level within an Agile squad, planning and implementing changes in our developing set of UI projects implemented predominantly in Angular. The developer will be expected to deliver at all stages of the software development lifecycle: gathering requirements, offering best-practice solutions to rapidly evolving goals, and working closely with other fleet members to ensure deliverables are produced on time and to the highest standard.

Responsibilities: The successful candidate will be a highly motivated team player and a confident self-starter, with development acumen towards solving engineering problems.

Key responsibilities of this role are:
- Developing new components and services in Angular, RxJS, AG Grid and Material; integrating with new server-side microservices and, where required, advising on or implementing server changes
- Performing code reviews and providing guidance for other developers in the fleet; guiding other UI developers in industry best practices
- Building automated unit and end-to-end tests for new and existing features
- Actively participating in code reviews and Agile ceremonies
- Creating prototypes and wireframes for new features in conjunction with business users and stakeholders

Required Skills:
- Strong expertise, with a demonstrable work history, in designing and developing modern web applications in Angular
- Expert-level JavaScript/TypeScript knowledge in a cross-browser environment
- Strong expertise with reactive web development using RxJS
- Knowledge of AG Grid Enterprise features and styling/testing concerns
- Use of component/styling libraries (e.g., Material) and visualization/graphing libraries (e.g., D3)
- Ability to create wireframes and prototypes from complex requirements in order to iterate prototype designs with stakeholders (Balsamiq/Figma)
- Proficiency in writing unit tests with Karma and end-to-end tests using Cypress/Cucumber
- Strong technical analysis and problem-solving skills
- Strong communicator
- Proficiency in Git, Bitbucket, CI/CD pipelines, and build tooling

Desired Skills:
- Previous IB background
- Expertise in server-side development (Java/Spring frameworks)
- Knowledge of NgRx or similar
- Experience of server-side development using Node
- Experience with designing RESTful web services/microservices
- Creation/design of dashboards in Tableau or similar
Posted 3 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Pune
Work from Office
About Atos: Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. €10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is a SE (Societas Europaea) and listed on Euronext Paris. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees, and members of societies at large to live, work and develop sustainably in a safe and secure information space.

Data Streaming Engineer:
- Experience: 4+ years.
- Expertise in the Python language is a MUST.
- SQL (should be able to write complex SQL queries) is a MUST.
- Hands-on experience in Apache Flink Streaming or Spark Streaming is a MUST (illustrated in the sketch below).
- Hands-on expertise with Apache Kafka is a MUST.
- Data lake development experience.
- Orchestration (Apache Airflow is preferred).
- Spark and Hive: optimization of Spark/PySpark and Hive apps.
- Trino/AWS Athena (good to have).
- Snowflake (good to have).
- Data quality (good to have).
- File storage (S3 is good to have).

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment.
- Wellbeing programs and work-life balance - integration and passion-sharing events.
- Attractive salary and company initiative benefits.
- Courses and conferences.
- Hybrid work culture.

Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
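As a rough sketch of the Kafka-plus-Spark-Streaming combination this posting asks for, here is a minimal PySpark Structured Streaming job; the broker address, topic name, and console sink are assumptions for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Requires the spark-sql-kafka connector package on the classpath.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a Kafka topic as an unbounded streaming DataFrame.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
    .option("subscribe", "orders")                     # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast the payload before transforming.
parsed = events.select(F.col("value").cast("string").alias("payload"))

# Console sink for demonstration; a real job would target a data lake
# table or another topic.
query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```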
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Azure Fabric

Good-to-have skills: Azure Synapse, Databricks, or Power BI Service; knowledge of CI/CD pipelines using Azure DevOps or GitHub.

Summary: Skilled in designing, implementing, and managing data solutions using the Microsoft Fabric platform, integrating data engineering, data science, real-time analytics, and business intelligence capabilities within a single SaaS experience. Should have a strong foundation in data modeling, data pipeline creation, and visualization using Power BI, with hands-on expertise in OneLake, Lakehouses, Dataflows Gen2, and Notebooks.

Key Responsibilities:
- Design and build modern data solutions using Microsoft Fabric components: Lakehouse, Data Warehouse, Data Pipelines, Notebooks, and Power BI.
- Ingest and transform data from various sources into OneLake for centralized, governed storage.
- Develop Dataflows Gen2 and Pipelines for ELT/ETL operations using Fabric's native tools.
- Build semantic models and Power BI reports on top of Fabric datasets.
- Collaborate with stakeholders to translate business requirements into data solutions using Fabric's unified architecture.
- Work across Fabric workloads including Data Factory, Data Engineering (Spark), and Data Science.
- Optimize performance, ensure data quality, and implement security and compliance standards.
- Stay updated with the Microsoft Fabric roadmap and emerging features.

Required Skills & Experience:
- Hands-on experience in data engineering or BI development.
- Hands-on experience with Microsoft Fabric or preview environments.
- Strong understanding of Lakehouse architecture and OneLake.
- Proficient in Power BI, DAX, Power Query (M), and data modeling.
- Experience with Apache Spark notebooks (PySpark or Scala).
- Familiarity with SQL-based data warehousing within Fabric.
- Knowledge of data governance, data lineage, and role-based access control.

Preferred Skills:
- Experience with Azure Synapse, Databricks, or Power BI Service.
- Knowledge of CI/CD pipelines using Azure DevOps or GitHub.
- Exposure to ML workloads and data science tooling within Fabric.
Posted 3 weeks ago
0.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
FE Fundinfo is a leading financial data provider, connecting the investment industry across the UK, Europe, and Asia-Pacific through an integrated platform. Our skilled team empowers clients with data-driven insights, helping the industry navigate complexity with confidence.

We're looking for a Quality Analyst to join our Chennai office, where you will be responsible for conducting data quality activities such as identifying, comparing, and evaluating large datasets to ensure accuracy and consistency. This role offers hands-on experience with financial data, client interaction, and cross-functional collaboration. You'll play a key part in ensuring data accuracy and supporting client operations, with opportunities to grow in a dynamic and supportive environment.

Your key responsibilities as a Quality Analyst will include:
- Performing regular data quality checks and resolving inconsistencies.
- Managing client onboarding and ensuring accurate data integration.
- Handling client queries and maintaining service standards.
- Overseeing data validation and precision sampling schedules.
- Supporting client portfolio updates and reporting on them.
- Collaborating with internal teams to resolve data issues.

You will need the following experience and skills to join us as a Quality Analyst:
- A bachelor's degree, preferably in finance.
- Strong analytical and problem-solving skills.
- Detail orientation with high data accuracy.
- Proficiency in Microsoft Excel and Office tools.
- Excellent communication skills.

By joining the team, you will be offered the following:
- 24 days' holiday
- Paid study leave
- Enhanced paternity & maternity leave
- Statutory benefits like PF, Gratuity, etc.
- Support to set up a home office
- Health cover with the option to add family members
- Annual health check-up
- Meal cards
- Full LinkedIn Learning access
Posted 3 weeks ago
8.0 - 15.0 years
7 - 11 Lacs
Hyderabad
Work from Office
1. Stakeholder Collaboration & Business Engagement
- Lead discussions with business and technical stakeholders (Procurement, P2P Finance, Supply Chain) to gather data quality and business rule requirements.
- Engage in architecture discussions for the Data Quality framework.
- Provide detailed business requirements to BODS/LSMW teams for enabling accurate data loads.
- Review data quality rule exceptions, incorporate logic enhancements, and secure stakeholder sign-offs at key stages.
- Identify continuous improvement opportunities in data governance and maintenance processes.

2. Data Quality Rules & Standards Definition
- Define and document business-driven data quality rules aligned with functional specs and compliance standards.
- Translate business rules into technical logic for development and validation in Data Quality tools.
- Support development and testing of rules in data quality platforms; validate and communicate results to stakeholders.

3. Data Profiling, Cleansing & Monitoring
- Analyze supplier data to identify cleansing opportunities through profiling reports.
- Support creation of dashboards/reports to track data quality metrics, errors, and improvements.
- Conduct root cause analysis on data issues and recommend process/system control improvements.
- Execute data cleansing tasks, including pre- and post-validation activities within migration projects.

4. Supplier/Vendor Master Data Management (SAP)
- Maintain and manage vendor master data in SAP (when BODS is not applicable).
- Ensure data integrity: accuracy, completeness, and consistency across all vendor records.
- Lead initiatives on obsolete data management, including defining deactivation and retention criteria.

5. Data Migration, Integration & Tool Support
- Collaborate with IT teams during SAP projects for vendor data migration, cleansing, and validation.
- Translate business issues into actionable data quality or migration plans with a tool-driven approach.
- Recommend and support system/process enhancements to strengthen master data governance.

Keywords: SAP Functional Consultant, SAP Vendor Master, Vendor Management, Master Data Management (MDM), Supplier Management
Posted 3 weeks ago
8.0 - 15.0 years
7 - 11 Lacs
Hyderabad
Work from Office
1. Stakeholder Collaboration & Business Engagement
- Engage with business users from Supply Chain, Manufacturing, and Quality to gather and document data requirements.
- Participate in Data Quality framework discussions with architecture and governance teams.
- Provide clear business input to BODS/LSMW teams to support accurate data loads and integrations.
- Monitor data quality rule exceptions, incorporate improvements, and secure timely stakeholder sign-offs.
- Identify continuous improvement opportunities in data standards, rules, and maintenance processes.

2. Data Quality Rules & Standards Definition
- Define, document, and enforce material master data quality rules aligned with business needs and compliance standards.
- Translate functional business rules into technical logic for implementation in data quality tools.
- Assist in rule development, testing, and validation, ensuring business understanding and approval of outputs.

3. Data Profiling, Cleansing & Monitoring
- Develop dashboards and reports to monitor data quality metrics and issues.
- Review profiling results for material master data to identify cleansing or enrichment needs.
- Support the execution of a robust Data Quality Framework, including root cause analysis and improvement recommendations.
- Conduct pre- and post-validation checks during data cleansing initiatives.

4. Material Master Data Management (SAP)
- Manage creation, maintenance, and validation of SAP material master records (especially where BODS is not used).
- Ensure consistency, accuracy, and completeness of material data across systems.
- Lead reviews of obsolete materials and define deactivation and governance processes.

5. Data Migration, Integration & Tool Support
- Collaborate with IT and functional teams on SAP data migration and enhancement initiatives.
- Translate business challenges into actionable technical solutions for better data quality and governance.
- Recommend enhancements to processes and tools supporting material master data governance and compliance.

Keywords: Material Master Data, Master Data Management (MDM), SAP Functional Consultant, SAP MM
Posted 3 weeks ago
0.0 - 1.0 years
0 Lacs
Bengaluru
Work from Office
About the Role: We are looking for a highly skilled Analytics/AI Engineer to bridge the gap between data engineering and data analysis. You will play a critical role in building scalable data pipelines, transforming raw data into actionable insights, and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design and implement robust, scalable data pipelines using PySpark, Python, Polars and Gen-AI.
- Develop data models and transformation logic to support analytics and business intelligence needs.
- Leverage Python and Object-Oriented Programming (OOP) principles to build reusable and maintainable data tools and workflows.
- Utilize Databricks and cloud-based platforms to process and manage large datasets efficiently.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver clean, trusted data.
- Ensure high data quality and consistency across different systems and reports.

Must-Have Skills:
- Strong knowledge of Python programming.
- Good understanding of Object-Oriented Programming (OOP) concepts.
- Advanced SQL skills for data manipulation and querying.
- Experience with PySpark and Polars for distributed and in-memory data processing (see the sketch below).
- Ability to work independently in a fast-paced, agile environment.
- Problem-solving mindset with eagerness to learn and experiment.
- Good communication and collaboration skills.

Nice to Have:
- Understanding of data architecture, ETL/ELT processes, and data modelling.
- Knowledge of Databricks and the Azure cloud environment.
- Understanding of data structures and algorithms.
- Familiarity with best coding practices, CI/CD and version control (Git).
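For the in-memory processing skill referenced above, here is a minimal Polars sketch; the input file and column names are assumptions for illustration.

```python
import polars as pl

# Placeholder input; in practice this might come from a lakehouse export.
orders = pl.read_csv("orders.csv")

# Filter out non-positive amounts, then aggregate revenue per region.
summary = (
    orders.filter(pl.col("amount") > 0)
    .group_by("region")
    .agg(pl.col("amount").sum().alias("total_amount"))
    .sort("total_amount", descending=True)
)
print(summary)
```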
Posted 3 weeks ago
3.0 - 6.0 years
13 - 15 Lacs
Hyderabad
Work from Office
Data Architecture & Modeling:
- Design and build scalable data pipelines and systems.
- Develop data models that support analytics and reporting.

ETL Development:
- Extract data from various sources, transform it into usable formats, and load it into data storage systems (ETL).
- Ensure data quality, integrity, and consistency.

Database Management:
- Manage and optimize databases (SQL/NoSQL).
- Implement data partitioning, indexing, and tuning for performance.

Data Integration:
- Integrate data from APIs, third-party services, cloud platforms, and internal systems.
- Work with structured and unstructured data.

Automation & Orchestration:
- Use tools like Apache Airflow, AWS Glue, or Azure Data Factory to schedule and manage workflows (see the sketch after this posting).

Collaboration:
- Work with data scientists, analysts, and business teams to understand data needs.
- Ensure the data infrastructure supports business intelligence and machine learning.

What You Know: Databricks, GCP BigQuery (need to be good with SQL), Python; familiarity with data science concepts or implementations.

Education: Bachelor's degree in Computer Science, Information Systems, Engineering, Computer Applications, or a related field.

Benefits: In addition to competitive salaries and benefits packages, Nisum India offers its employees some unique and fun extras:
- Continuous Learning - Year-round training sessions are offered as part of skill-enhancement certifications sponsored by the company on an as-needed basis. We support our team to excel in their field.
- Parental Medical Insurance - Nisum believes our team is the heart of our business and we want to make sure to take care of the heart of theirs. We offer opt-in parental medical insurance in addition to our medical benefits.
- Activities - From the Nisum Premier League's cricket tournaments to hosted hack-a-thons, Nisum employees can participate in a variety of team-building activities such as skits and dance performances, in addition to festival celebrations.
- Free Meals - Free snacks and dinner are provided on a daily basis, in addition to subsidized lunch.
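The orchestration bullet above references this sketch: a minimal Apache Airflow DAG wiring two Python tasks into a daily schedule. The task bodies, IDs, and schedule are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")  # placeholder task body

def load():
    print("write transformed data to the warehouse")  # placeholder task body

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load  # run extract before load
```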
Posted 3 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: DQ Programme Analyst

About Us: Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year in the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry - projects that will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Description:
Role: DQ Programme Analyst
Location: Bangalore / Pune / Hyderabad / Chennai

Responsibilities:
- Strong change and project management skills.
- Stakeholder management, communications, and reporting.
- Data management, data governance, and data quality management domain knowledge.
- Subject matter expertise required in more than one of the following areas: data management, data governance, data quality measurement and reporting, data quality issues management.
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress.
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains.
- Proficiency in MI reporting and visualization is strongly preferred.
- Proficiency in change and project management is strongly preferred.
- Ability to prepare programme update materials and present to senior stakeholders, with prompt response to any issues/escalations.
- Strong communications and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team.
- 8+ years of relevant experience preferred.
Posted 3 weeks ago
0.0 - 2.0 years
2 - 4 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Description: Image, video and text annotators contribute to the development of AI and machine learning systems by providing accurately labeled datasets, enabling the training and evaluation of algorithms for various applications such as object detection, image classification, semantic segmentation, and more.

- Should have two years of experience in labeling or annotating various elements within images/videos or visual data.
- Should have annotated temporal information within videos, such as tracking the movement or trajectory of objects, identifying key frames, or annotating specific actions or events occurring over time.
- Should have annotated images by marking and labeling specific objects, regions, or features within the images. This may involve drawing bounding boxes around objects, highlighting points of interest, or segmenting objects using polygons or masks.
- Ensure the quality and accuracy of annotated data by reviewing and verifying annotations for correctness and consistency.
- Follow and adhere to specific annotation guidelines, instructions, or specifications provided by the project or company. This includes understanding and applying domain-specific knowledge, as well as adapting to evolving requirements.
- Collaborate with team members, project managers, or stakeholders to address any questions, concerns, or issues related to the annotation process. Effective communication is essential for ensuring consistent and accurate annotations.
- Experience working on complex annotation tasks, including 3D lidar labeling.
- Excellent written and verbal communication skills to convey technical challenges and updates.
- Report any data quality issues or tool-related challenges to the project leads in a timely manner.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
We are looking for the right person to add to our growing enterprise data team. We are seeking a dynamic Data Engineer with a specialized focus on business intelligence and a profound understanding of financial concepts. This role is pivotal in bridging the gap between our technical and business teams, ensuring that financial metrics like recurring revenue, adjustments, renewals, and retention are accurately represented and easily understood across the organization. You will be instrumental in driving data-driven decision-making by translating complex financial data into actionable insights.

Location: Hyderabad (Remote)
Shift Timings: 5:00 pm IST (6:30 am EST/7:30 am EDT) - 2:00 am IST (3:30 pm EST/4:30 pm EDT)

Job Responsibilities:
- Collaborate with the business to develop, debug, and maintain a comprehensive financial reporting suite. Requires deep fluency in financial concepts to ensure reporting accuracy and relevance.
- Align initiatives between business teams and technical teams to refine data models that feed into business intelligence tools. Act as a fluent intermediary between these groups, translating complex financial concepts into actionable insights for data-driven decision-making across the organization.
- Implement and innovate processes and systems to monitor data quality, ensuring production data is accurate, timely, and available for key stakeholders (see the sketch after this posting).
- Continually contribute to and enhance data team documentation, focusing on clarity and the explanation of financial metrics and models.
- Perform complex data analysis to troubleshoot and resolve data-related issues. Utilize understanding of financial metrics to provide insights that drive business strategy.
- Work closely with a cross-functional team including frontend and backend engineers, product managers, and analysts. This role involves explaining financial concepts in common terms to ensure all team members understand the impact of data on business outcomes.
- Define and manage company data assets, artifacts, and data models, ensuring they reflect current financial terminologies and practices.

Required Qualifications and Skills:
- 5 years of data engineering experience with a focus on business intelligence
- 5 years in financial reporting, with a strong grasp of financial concepts
- 5 years of experience
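As referenced in the data-quality monitoring bullet above, here is a minimal reconciliation sketch in Python/pandas comparing a financial metric across two systems; the file names, column names, and the 0.5% tolerance are assumptions.

```python
import pandas as pd

# Placeholder extracts from the source system and the reporting layer.
billing = pd.read_csv("billing_arr.csv")      # columns: account_id, arr
warehouse = pd.read_csv("warehouse_arr.csv")  # columns: account_id, arr

merged = billing.merge(warehouse, on="account_id", suffixes=("_src", "_dw"))

# Flag accounts whose recurring-revenue figures diverge beyond tolerance.
merged["pct_diff"] = (merged["arr_dw"] - merged["arr_src"]).abs() / merged["arr_src"]
breaks = merged[merged["pct_diff"] > 0.005]

print(f"{len(breaks)} account(s) exceed the 0.5% reconciliation tolerance")
```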
Posted 3 weeks ago
2.0 - 7.0 years
6 - 9 Lacs
Pune
Work from Office
About the Team: Data is at the foundation of DoorDash's success. The Data Engineering team builds database solutions for various use cases including reporting, product analytics, marketing optimization and financial reporting. By implementing dashboards, data structures, and data warehouse architecture, this team serves as the foundation for decision-making at DoorDash.

About the Role: DoorDash is looking for a BI Engineer to build and scale data models, pipelines, and self-service analytics across the Finance, Legal and Public Relations teams' data needs. In this role, you'll focus on developing and enhancing ETL pipelines, delivering in-depth insights and building reports that meet our growing business needs, enabling teams to access and analyze data independently. This will be a hybrid position.

You're excited about this opportunity because you will:
- Work with business partners and stakeholders to understand a wide range of data requirements
- Work with engineering, product teams and third parties to collect required data
- Design, develop and implement large-scale, high-volume, high-performance data models and pipelines
- Develop and implement data quality checks, conduct QA and implement monitoring routines
- Improve the reliability and scalability of our ETL processes
- Manage a portfolio of data products that deliver high-quality, trustworthy data
- Help onboard and support other engineers as they join the team

We're excited about you because:
- 2+ years of professional experience, excluding internships
- 2+ years of experience working in business intelligence, data engineering, or a similar role
- 2+ years of experience building reporting and dashboarding solutions using tools such as Tableau, Sigma, Looker, Superset
- Proficient in Python
- Expert in database fundamentals, SQL, and performance tuning
- Experience building reporting and dashboarding solutions using Snowflake or a similar ecosystem
- Experience working with Snowflake, Trino, Databricks, PostgreSQL and/or other DBMS platforms
- Excellent communication and documentation skills and experience working with technical and non-technical teams
- Comfortable working in a fast-paced environment; a self-starter and self-organizer
- Ability to think strategically, analyze and interpret market and consumer information
- You are located in or are planning to relocate to Pune, India

Nice to Haves:
- Knowledge of programming languages such as Python

About DoorDash: At DoorDash, our mission to empower local economies shapes how our team members move quickly, learn, and iterate in order to make impactful decisions that display empathy for our range of users, from Dashers to merchant partners to consumers. We are a technology and logistics company that started with door-to-door delivery, and we are looking for team members who can help us go from a company that is known for delivering food to a company that people turn to for any and all goods. DoorDash is growing rapidly and changing constantly, which gives our team members the opportunity to share their unique perspectives, solve new challenges, and own their careers. We're committed to supporting employees' happiness, healthiness, and overall well-being by providing comprehensive benefits and perks.

Our Commitment to Diversity and Inclusion: We're committed to growing and empowering a more inclusive community within our company, industry, and cities. That's why we hire and cultivate diverse teams of people from all backgrounds, experiences, and perspectives. We believe that true innovation happens when everyone has room at the table and the tools, resources, and opportunity to excel. If you need any accommodations, please inform your recruiting contact upon initial connection.

We use Covey as part of our hiring and/or promotional process for jobs in certain locations. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: https://getcovey.com/nyc-local-law-144. To request a reasonable accommodation under applicable law or an alternate selection process, please inform your recruiting contact upon initial connection.
Posted 3 weeks ago
5.0 - 10.0 years
9 - 13 Lacs
Noida
Work from Office
Experience: 5+ years of hands-on experience required; other knowledge, such as Databricks, will be an added advantage.

The Talend Developer role is responsible for designing, developing and maintaining scalable ETL solutions using the Talend platform in order to improve data quality of our CRM ecosystem applications and reduce manual data processing. This role is a key part of the HARMONIA project team while the engagement is active and will fully support the Talend developments and software testing as its main contribution to the team. The role will also be part of the Harmonia Data Quality project and Data Operations scrum teams, contributing to additional activities such as unit testing, integration/business user testing and operational support during the engagement. The position holder must organize and plan her/his work with special consideration for a frictionless information flow within Digital Solutions and the relevant business department, and must guarantee excellent co-operation with all Digital members, business representatives and external experts where applicable.

Job Requirements:
- Responsible for development of Talend jobs and configuration.
- Responsible for delivering tested, validated, deployable jobs to production environments by following Talend best practices and JIRA development-framework practices.
- Translate business requirements into efficient and scalable Talend solutions and assist the Solutions Architect with input/feedback for those requirements wherever deemed necessary, by actively participating in brainstorming sessions arranged by the Project Manager.
- Work closely with the Manager of Data Operations and Quality, project manager, business analysts, data analysts, Talend Solutions Architect, other developers and other subject matter experts to align technical solutions with operational needs.
- Ensure alignment with data governance, security, and compliance standards.
- Ensure that newly produced jobs follow the standard styles already used in the current Talend jobs and flows developed by the current integrator and INSEAD teams.
- Apply best practices in error handling, job orchestration, performance tuning, and logging.
- Reuse components and templates to drive consistency and maintainability across integration processes.
- Monitor production Talend jobs and respond to incidents or failures to ensure operational continuity.
- Collaborate with SQA and data governance teams to support data validation, cleansing, and quality improvement efforts.
- Contribute to sprint planning and agile ceremonies with the Harmonia Project and Data Operations teams.
- Document ETL logic, data mappings, job configurations, and scheduling dependencies.
- Perform unit testing and support user acceptance testing (UAT) activities.
- Actively participate in project-related activities and ensure the SDLC process is followed.
- No budget responsibility.

Pay Range: Based on experience.
Posted 3 weeks ago
2.0 - 3.0 years
4 - 5 Lacs
Noida
Work from Office
- At least 4-6 years of Salesforce development experience.
- Strong understanding of the Salesforce platform and its functionalities.
- Must have 2 to 3 years of strong working experience in Flows.
- Strong experience with all types of Salesforce Flows, including debugging Flows through testing and troubleshooting, and verifying the functionality and accuracy of flow logic.
- Configuring security settings, managing roles, profiles, and permission sets, and implementing sharing rules.
- Ensuring data quality and integrity.
- All aspects of user and license management, including new user setup/deactivation, roles, profiles, permissions, and public groups.
- Excellent communication and interpersonal skills.
- Problem-solving and analytical skills.
- Experience Cloud would be a good skill to add.
Posted 3 weeks ago
9.0 - 14.0 years
15 - 19 Lacs
Gurugram
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Manager

Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Senior Manager - Data Science

Key Responsibilities:
- Collaborate with clients to understand their business needs and provide data-driven solutions.
- Develop and implement machine learning models to solve complex business problems.
- Analyze large datasets to extract actionable insights and drive decision-making.
- Present findings and recommendations to stakeholders in a clear and concise manner.
- Stay updated with the latest trends and advancements in data science and machine learning.

Technical Skills:
- Programming languages: proficiency in Python, R, and SQL for data manipulation, analysis, and model development.
- Machine learning frameworks: extensive experience with TensorFlow, PyTorch, and scikit-learn for building and deploying models.
- Data visualization tools: strong knowledge of Tableau, Power BI, and Matplotlib to create insightful visualizations.
- Cloud platforms: expertise in AWS, Azure, and Google Cloud for scalable and efficient data solutions.
- Database management: proficiency in SQL and NoSQL databases for data storage and retrieval.
- Version control: experience with Git for collaborative development and code management.
- APIs and web services: ability to integrate and utilize APIs for data access and model deployment.

Machine Learning Algorithms:
- Supervised and unsupervised learning
- Regression analysis
- Classification techniques
- Clustering algorithms
- Natural language processing (NLP)
- Time series analysis
- Deep learning
- Reinforcement learning

Value-Added Experience:
- Generative AI (GenAI) experience, including working with models like GPT, BERT, and other transformer-based architectures.
- Ability to leverage GenAI for tasks such as text generation, summarization, and conversational AI.
- Experience in developing and deploying GenAI solutions to enhance business processes and customer experiences.

Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
- 9+ years of relevant experience in data science and machine learning.
- Strong analytical and problem-solving skills.
- Excellent communication and presentation abilities.
- Ability to work independently and as part of a team.

Mandatory skill sets: Data Science
Preferred skill sets: Data Science
Years of experience required: 9+
Education qualification: BE/BTech/MBA/MCA
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering, Master of Business Administration
Required Skills: Data Science; Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 26 more}
Posted 3 weeks ago
3.0 - 8.0 years
2 - 3 Lacs
Mumbai
Work from Office
Primary Job Function: Coordinate and oversee the labelling process (Label Change Request, LCR) across multiple departments, including the Labelling Team, Supply Chain, internal and external plants, and Local Regulatory Affairs. Ensure compliance with regulatory and quality requirements/standards, manage manufacturing and logistic constraints, and implement timely packaging material changes to minimize costs and reduce write-offs.

Core Job Responsibilities:
- Coordinate labelling activities for new product introduction, working in collaboration with Launch Managers to ensure timely product launches.
- Ensure on-time execution of Label Change Requests (LCRs) by aligning regulatory and quality requirements, technical specifications, and implementation dates.
- Guarantee that LCR implementation does not adversely affect other affiliates sharing the same products or packaging.
- Lead efforts to swiftly analyze and resolve bottlenecks in the labelling process, facilitating effective communication and collaboration among stakeholders to ensure smooth and efficient operations.
- Conduct regular performance reviews with stakeholders to evaluate KPIs, monitor priorities and identify opportunities for continuous improvement in the labelling process.
- Ensure quality and regulatory compliance of labelling activities and processes in accordance with departmental procedures and applicable Abbott policies.
- Manage documentation by creating and maintaining Work Instructions and Standard Operating Procedures related to the labelling process.
- Develop and provide training for Labelling Team personnel and other stakeholders involved in the labelling processes.

Supervisory/Management Responsibilities: Direct reports: none. Indirect reports: none.

Minimum Education: A bachelor's degree is required, preferably in a scientific or business discipline, or equivalent.

Minimum Experience/Training Required:
- Minimum of 3 years' experience in the Life Sciences business (Operations, Regulatory, Quality).
- Proficiency in business systems and tools, including Artwork Management Systems, Enterprise Resource Planning, and Project & Portfolio Management solutions.
- Demonstrated expertise and training in GMP/GxP standards for pharmaceutical products.

Desired skills/experience include:
- Business exposure to international markets.
- Excellent verbal and written communication skills in English, including presentation skills.
- Ability to work effectively within a complex organization and collaborate with diverse stakeholders.
- Strong discipline and stress resilience, with proven experience in managing and delivering multiple tasks on time.
- High accuracy and attention to detail, with an understanding of the consequences of poor data quality.
- Strong project management skills, including issue identification, problem analysis and solution development.

JOB FAMILY: Engineering
LOCATION: India > Mumbai: BKC Building
SIGNIFICANT WORK ACTIVITIES: Continuous sitting for prolonged periods (more than 2 consecutive hours in an 8-hour day)
Posted 3 weeks ago