
3333 Informatica Jobs - Page 18

JobPe aggregates listings for convenient access, but applications are submitted directly on the original job portal.

15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will be pivotal in driving innovation and efficiency within the team, while maintaining open lines of communication with stakeholders to keep them informed of progress and developments.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization of applications.
- Familiarity with data warehousing concepts and methodologies.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP BW/4HANA
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful project outcomes. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring adherence to best practices in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and ETL processes.
- Experience with the SAP HANA database and its functionalities.
- Familiarity with reporting tools and dashboard creation.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP BW/4HANA
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and data warehousing concepts.
- Experience with SAP BusinessObjects and reporting tools.
- Familiarity with ETL processes and data integration techniques.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Apache Spark
Good-to-have skills: AWS Glue
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, making data accessible and actionable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design data models that meet business needs.
- Develop and maintain documentation for data processes and workflows to ensure clarity and compliance.

Professional & Technical Skills:
- Must-have: Proficiency in Apache Spark.
- Good to have: Experience with AWS Glue.
- Strong understanding of data processing frameworks and methodologies.
- Experience in building and optimizing data pipelines for performance and scalability.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating seamless data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and design processes.

Professional & Technical Skills:
- Must-have: Proficiency in Data Modeling Techniques and Methodologies.
- Good to have: Experience with data governance frameworks.
- Strong understanding of relational and non-relational database systems.
- Familiarity with data warehousing concepts and ETL processes.
- Experience in using data modeling tools such as Erwin or IBM InfoSphere Data Architect.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based in Pune.
- 15 years of full-time education is required.
- Must-have skill: Snowflake Data Vault 2.0 Modeler.

Posted 6 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers, and data architects to model current and new data.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Apply a learning mindset and take ownership of your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect on, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

ETL Tester Associate - Operate

Job Summary: A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management, and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while you build your skills in exciting new directions. Have a voice at our table to help design, build, and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Minimum Degree Required (BQ): Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science; Management Information Systems
Minimum Year(s) of Experience (BQ): Minimum of 2 years of experience
Certification(s) Preferred: US certifications

Required/Preferred Knowledge and Skills: As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes. You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Key Responsibilities:
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes.
- Ensure compliance with industry standards and best practices in data testing.

Qualifications:
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Qualifications:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
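The SQL-script-driven validation this role describes can be sketched as a small automated check. This is a hedged illustration only: the table and column names (`staging_orders`, `dw_orders`, `order_id`) are hypothetical, and sqlite3 stands in for whatever warehouse the real ETL targets.

```python
import sqlite3

def validate_etl(conn):
    """Run basic post-load checks: row-count parity and NULL-key detection.

    Returns a list of failure messages; an empty list means all checks passed.
    """
    failures = []
    cur = conn.cursor()

    # Check 1: row counts must match between staging and target.
    src = cur.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
    tgt = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    if src != tgt:
        failures.append(f"row count mismatch: staging={src} target={tgt}")

    # Check 2: key columns in the target must not contain NULLs.
    nulls = cur.execute(
        "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL"
    ).fetchone()[0]
    if nulls:
        failures.append(f"{nulls} NULL order_id values in target")

    return failures

if __name__ == "__main__":
    # In-memory fixture standing in for real staging/warehouse tables.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
        CREATE TABLE dw_orders (order_id INTEGER, amount REAL);
        INSERT INTO staging_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO dw_orders VALUES (1, 10.0), (2, 20.0);
    """)
    print(validate_etl(conn))  # an empty list means the load passed
```

In practice, checks like these are typically wrapped in a test framework such as pytest so each validation rule reports independently.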

Posted 6 days ago

Apply

0 years

0 Lacs

India

Remote

ABOUT TIDE

At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, as well as a comprehensive set of connected administrative solutions, from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian, and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin, and Belgrade, Tide employs over 2,000 people. Tide is rapidly growing, expanding into new products and markets, and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money.

ABOUT THE ROLE:
- Working closely with business (Product, Marketing, Legal/Privacy, Risk & Compliance) and technology (Data Engineering, Analytics Engineering, Software Developers) teams across the whole organisation to understand challenges and opportunities related to data.
- Building awareness and managing communication to help Tide sparkle as a data-driven organisation.
- Leading implementation of the Data Governance Policy and Framework.
- Defining the data governance implementation roadmap (with a specific focus on critical data management, data quality, and security) aligned to Tide's OKRs and strategic initiatives, e.g. expansion to other countries, new products, data mesh.
- Facilitating working groups, workshops, and trainings aimed at delivering various initiatives and educating Tideans about data governance best practices.
- Capturing and facilitating resolution of data quality issues.
- Defining data quality rules to allow automated measurement and reporting.
- Contributing to the creation of policies, standards, and procedures.
- Implementing Atlan as a strategic metadata tool (data glossary, data catalogue, lineage, classification), and working with the vendor to define and implement new features and resolve issues.
- Cooperating with Data Stewards to define the data glossary and data classification and to integrate data sources in order to enable value streams, helping data consumers identify and use data for their use cases more efficiently.
- Acting as the de facto "product owner" for technical tools and product features that facilitate compliance with data regulations such as GDPR, overseeing their implementation and maintenance.

WHAT WE ARE LOOKING FOR:
- You are a self-starter, able to work with minimal supervision and translate high-level objectives into actionable tasks.
- You are a very good communicator and easily build rapport with senior stakeholders as well as data engineers.
- You are passionate about working with others to define new standards and processes and improve existing ones.
- You understand data governance concepts, know how to apply them to address the firm's challenges, and have experience in practical implementation.
- You are familiar with Data Catalogue and Data Quality/Observability tools, for example: Acceldata, Alation, Atlan, Elementary, Collibra, Informatica, Monte Carlo, Sifflet, Solidatus, or similar.
- You have previous experience in the financial services industry.
- You have a collaborative style and an agile mindset.
- You are familiar with the technical implementation of data privacy laws, compliance, and risk management frameworks.
- You know the basics of SQL for querying databases (SELECT, JOIN, GROUP BY) and have experience building visualisations using BI tools such as Looker, Power BI, Tableau, or similar.

WHAT YOU'LL GET IN RETURN:
- Self & family health insurance
- Term & life insurance
- OPD benefits
- Mental wellbeing support through Plumm
- Learning & development budget
- WFH setup allowance
- 15 days of privilege leave
- 12 days of casual leave
- 12 days of sick leave
- 3 paid days off for volunteering or L&D activities
- Stock options

TIDEAN WAYS OF WORKING

At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community.

TIDE IS A PLACE FOR EVERYONE

At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity, or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members' diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone's voice is heard.
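The SQL basics called out in the Tide role (SELECT, JOIN, GROUP BY) are the bread-and-butter queries of data governance work. A small self-contained illustration using Python's built-in sqlite3 module; the `members` and `invoices` tables are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members (id INTEGER PRIMARY KEY, country TEXT);
    CREATE TABLE invoices (member_id INTEGER, amount REAL);
    INSERT INTO members VALUES (1, 'UK'), (2, 'IN'), (3, 'UK');
    INSERT INTO invoices VALUES (1, 100.0), (1, 50.0), (2, 75.0), (3, 25.0);
""")

# SELECT + JOIN + GROUP BY: invoice count and total amount per member country.
rows = conn.execute("""
    SELECT m.country, COUNT(*) AS n_invoices, SUM(i.amount) AS total
    FROM invoices AS i
    JOIN members AS m ON m.id = i.member_id
    GROUP BY m.country
    ORDER BY m.country
""").fetchall()

print(rows)  # → [('IN', 1, 75.0), ('UK', 3, 175.0)]
```

The same aggregation pattern (join a fact table to a dimension, then group) underlies most of the data quality metrics and BI dashboards the role mentions.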

Posted 6 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Hi everyone, we have immediate onboarding for the job description given below.

Experience:
- 4-6 years of SAP FICO consulting experience with proven involvement in finance master-data projects.
- Participation in at least one full-cycle SAP MDM/MDG implementation focused on finance objects.
- Prior exposure to healthcare or highly regulated industries preferred.

Certifications:
- SAP FICO or MDG certification desirable.
- SAP Activate or Agile/Scrum certification advantageous.

Technical Skills:
- In-depth configuration knowledge of SAP FICO (GL, AP, AR, AA, CO-CCA/PCA).
- Proficiency in SAP MDM/MDG data modelling, governance workflows, and data-quality rules.
- Experience with SAP BODS, LTMC, LSMW, and SQL/Excel for data analysis.
- Familiarity with data-profiling tools such as SAP Information Steward or Informatica DQ.

Posted 6 days ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Madhwapur

Work from Office

Roles and Responsibilities:
- Collaborate with cross-functional teams to design, develop, and deploy Denodo solutions.
- Develop high-quality code that meets industry standards and best practices.
- Troubleshoot and resolve technical issues efficiently.
- Participate in code reviews and contribute to improving overall code quality.
- Stay updated with the latest trends and technologies in Denodo development.
- Contribute to the development of new features and functionalities.

Job Requirements:
- Proficiency in Denodo development is mandatory.
- Strong understanding of software development principles and methodologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with agile development methodologies and version control systems.

Posted 6 days ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Highlights of the engagement opportunity:
- Nature of role: Full time
- Number of years of experience expected: 5 to 10 years
- Areas of past experience preferred: ETL documentation / data mapping / ETL development / Python / risk data / finance data
- Educational qualification required: BTech / product courses / MBA / data science courses
- Additional qualities preferred: Results-oriented, with a track record of successfully delivering complex data projects
- Preferred geography of previous work experience: India / Europe / APAC / US
- Language requirements: Ability to write and speak fluently in English
- Technology proficiency preferred: MSSQL, PostgreSQL, Python, Informatica, SSIS, cloud platforms, SAS Data Integration Studio, Informatica PowerCenter, Oracle Data Integrator

Acies Technology Implementation practice is seeking a lead data management and ETL expert to handle data mapping, transformation, ETL processes, and client communication across all Acies product implementations. The ideal candidate will be responsible for designing, developing, and implementing efficient ETL processes, ensuring data integrity, and leading a team of ETL developers and data mapping leads. The role requires a deep understanding of data warehousing, ETL tools, and best practices in data integration across the risk and finance domains. If you are passionate about technology and thrive in a dynamic, collaborative environment, we invite you to apply and contribute to our innovative projects.

Key responsibility areas:
- Lead the end-to-end ETL architecting and development process, from requirements gathering to implementation and maintenance.
- Undertake data structure mapping between the product and customer datasets and identify any gaps.
- Implement rigorous data quality checks, validation scripts, and reconciliation processes to guarantee data accuracy and integrity post-migration.
- Work through scenarios, during discussions with clients, related to data quality and validation logic associated with client datasets.
- Develop and maintain comprehensive technical documentation for all data migration processes, including design specifications, ETL mappings, and troubleshooting guides.
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions.
- Manage and mentor a team of ETL developers, providing technical guidance and support.
- Perform data profiling, validation, and quality checks to ensure data accuracy.
- Optimize ETL processes for performance and efficiency.
- Collaborate with stakeholders to define data integration strategies and requirements.
- Collaborate with cross-functional teams to define project scope, goals, and deliverables.
- Develop comprehensive project plans, including timelines, resource allocation, and budgets.
- Ensure effective communication with stakeholders to manage expectations and project progress.
- Identify and mitigate project risks, troubleshooting issues to ensure project success.
- Apply strong scripting skills in languages such as Python and/or shell scripting for data manipulation, automation, and process control.

Selection process: We seek to be transparent during the selection process. While the actual process may vary from the process indicated below, the key steps involved are as follows:
- Personal interviews: There are expected to be at least 2 rounds of online interviews. The number of interview rounds may increase depending on the criticality and seniority of the role involved.
- Final discussion on career and compensation: Post final selection, a separate discussion will be set up to discuss compensation and career growth. You are encouraged to seek any clarifications you have during this discussion.

Preparation required: It is recommended that you prepare on some of the following aspects before the selection process:
- Demonstrate knowledge of data management, ETL, and data pipelines.
- Demonstrate knowledge of SQL databases, Oracle databases, Python, Apache Airflow, and/or Informatica.

For any additional queries, you can send us a LinkedIn InMail, connect with us at https://www.acies.consulting/contact-us.php, or e-mail us at careers@acies.holdings.

How to reach us: Should you wish to apply for this job, please reach out to us directly through LinkedIn or apply on our website career page: https://www.acies.consulting/careers-apply.html
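The post-migration reconciliation the responsibilities mention is often implemented as a checksum comparison between source and target datasets. A minimal sketch under stated assumptions: table and column names are hypothetical, sqlite3 stands in for the real databases, and the XOR-fold shown here is one of several possible fingerprint schemes (note that XOR cancels exact duplicate rows, so a production version would also compare row counts):

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table, columns):
    """Order-insensitive fingerprint of a table: XOR-fold of per-row digests."""
    fp = 0
    cols = ", ".join(columns)
    for row in conn.execute(f"SELECT {cols} FROM {table}"):
        digest = hashlib.sha256(repr(row).encode()).digest()
        fp ^= int.from_bytes(digest[:8], "big")  # fold each digest to 64 bits
    return fp

def reconcile(src_conn, tgt_conn, table, columns):
    """True when source and target hold the same set of rows for the table."""
    return (table_fingerprint(src_conn, table, columns)
            == table_fingerprint(tgt_conn, table, columns))
```

Because the fingerprint is order-insensitive, the check passes even when the migration reorders rows, which is the usual situation after a bulk load.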

Posted 6 days ago

Apply

6.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- Design and implement data models, data flows, and data pipelines to support business intelligence and analytics.
- Develop and maintain large-scale data warehouses and data lakes using technologies such as Hadoop, Spark, and NoSQL databases.
- Collaborate with cross-functional teams to identify business requirements and develop solutions that meet those needs.
- Ensure data quality, integrity, and security by implementing data validation, testing, and monitoring processes.
- Stay up to date with industry trends and emerging technologies to continuously improve the organization's data architecture capabilities.
- Provide technical leadership and guidance on data architecture best practices to junior team members.

Job Requirements:
- Strong understanding of data modeling, data warehousing, and ETL processes.
- Experience with big data technologies such as Hadoop, Spark, and NoSQL databases.
- Excellent problem-solving skills and the ability to analyze complex business problems and develop creative solutions.
- Strong communication and collaboration skills to work effectively with stakeholders at all levels.
- Ability to design and implement scalable, secure, and efficient data architectures.
- Experience working in an agile environment with continuous integration and delivery.

Posted 6 days ago

Apply

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a skilled ETL Developer to join our team in Coimbatore. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes to support data warehousing and business intelligence initiatives. The ETL Developer will work closely with data analysts, database administrators, and business stakeholders to ensure accurate and efficient data integration.

Key Responsibilities:
- Design, develop, and implement ETL workflows to extract, transform, and load data from various sources.
- Optimise ETL processes for performance, reliability, and scalability.
- Collaborate with data analysts and business teams to understand data requirements.
- Maintain and troubleshoot existing ETL jobs and pipelines.
- Ensure data quality, consistency, and security throughout the data lifecycle.
- Document ETL processes, data mappings, and workflows.
- Stay updated with the latest ETL tools and best practices.
- Install and configure Informatica components, including high availability; manage server activations and deactivations for all environments; ensure that all systems and procedures adhere to organizational best practices.
- Handle day-to-day administration of the Informatica suite of services (PowerCenter, IDS, Metadata, Glossary, and Analyst).
- Perform Informatica capacity planning and ongoing monitoring (e.g., CPU, memory) to proactively increase capacity as needed.

Qualifications & Skills:
- Bachelor's degree in computer science, information technology, or a related field.
- 5 to 7 years of experience in ETL development, preferably with tools like Informatica, Talend, SSIS, or similar.
- Strong SQL skills and experience with relational databases (SQL Server, Oracle, etc.).
- Knowledge of data warehousing concepts and architecture.
- Familiarity with scripting languages such as Python or shell scripting is a plus.
- Excellent problem-solving and communication skills.
- Ability to work independently and as part of a team.
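The extract-transform-load workflow this role centers on can be sketched as three composable steps. This is a minimal illustration over in-memory CSV data; the field names and the list standing in for a warehouse table are assumptions for the example, not any specific tool's API.

```python
import csv
import io

# Hypothetical source data; a real extract would read from files or a database.
RAW = "name,revenue\nacme,100\nglobex,250\n"

def extract(text):
    """Extract: parse the raw source into records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize names and cast revenue to int."""
    return [{"name": r["name"].upper(), "revenue": int(r["revenue"])} for r in rows]

def load(rows, target):
    """Load: append records to the target (stand-in for a warehouse insert)."""
    target.extend(rows)

warehouse = []
load(transform(extract(RAW)), warehouse)
```

Keeping the three stages as separate functions makes each one independently testable, which is the property most ETL tools (Informatica, Talend, SSIS) expose through their mapping and workflow abstractions.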

Posted 6 days ago

Apply

5.0 years

4 - 7 Lacs

Hyderābād

Remote

Overview: Shure is a global leader in professional audio electronics with a history of product innovation spanning over 90 years. The product portfolio includes superior hardware and software audio products used in hundreds of audio applications across multiple vertical markets. The success and reputation of the Shure brand have been defined by a continuous commitment to Total Quality. Shure products are sold in over 100 countries, and the company continues to be voted one of Chicago's 101 Best and Brightest Companies to work for. Shure has two manufacturing facilities (China and Mexico) and multiple distribution centers (3PL and Shure-owned) around the globe. Shure has embarked on a multi-year, multi-phased S4 journey, taking a greenfield approach to this transformational effort. The first two releases of S4 have been successfully implemented: Finance and Procure to Pay (Ariba/S4). The goal is to complete the S4 journey and retire ECC over the next 2 years. The application ecosystem leveraged to enable key business processes globally at Shure spans different technologies, including SAP (S4, ECC, CPI, IBP, Hybris), Salesforce, Boomi, ShipERP, Amber Road Customs, Integration Point Customs, Loftware, Data Platform (Informatica, AWS/Redshift), Tableau, SharePoint, ServiceNow, etc. We are looking for a passionate, motivated, high-performing, hands-on Sr. Analyst in the logistics area, including Logistics Execution, Warehouse Management, and Trade Compliance. This individual will be responsible for enabling new capabilities and, in addition, providing application support including enhancements. This role in the Global IT SAP Team will report to the Associate Director, SAP Business Applications, and requires exposure to industry best practices and technology trends, the willingness to learn new technologies, and the ability to collaborate with the global team, wear multiple hats, and balance priorities to continuously drive value.
The SAP Senior Analyst EWM/LE position will collaborate closely with internal IT associates and business users located in the US, Europe, and Asia to successfully build, enhance, and support solutions that maximize business value.

Responsibilities:
- Provide solution expertise and solution design, perform solution build/configuration, and implement integrated end-to-end solutions.
- Collaborate and communicate with business stakeholders globally, understand business requirements, provide deep SAP functional expertise in the related process areas, demonstrate knowledge of key integration points, perform fit/gap analysis, and prototype and frame up solution design options for decision making.
- Adhere to IT guiding principles of leveraging standard business processes, out-of-the-box functionality, low-code/no-code, and minimal customization to drive business value and business outcomes and enable a positive customer experience.
- Continually review evolving SAP technologies, assess potential impact, and propose innovative solutions and/or enhancements to existing business processes. Provide impact analysis and inputs as needed around proposed enhancements to existing solutions or new solutions.
- Provide L1/L2/L3 application support in line with established SLAs; assist with the resolution of support tickets and Problem Management issues as applicable.
- Collaborate with the IT Digital Commerce, Engineering, Salesforce, and Data teams to contribute to solution design, integration with SAP, testing, and implementation.
- Collaborate with and support the SAP development and security teams located in the US and India during development, testing, and implementation.
- Understand the data platform capabilities and support the implementation of the new platform during blueprint, testing, and go-live; support the newly established Data Governance processes.
- Ensure solution design proposals adhere to the established security and data standards.
- Participate in and support all phases of the project, including planning, blueprint, solutioning, development, testing, and support as needed.
- Other duties as assigned.

Qualifications:
- Bachelor's degree in Computer Science or a related field.
- Minimum of 5 years of wide-ranging experience in enterprise systems implementation (including a couple of full life-cycle implementations), solution building, and support of a live SAP environment with responsibility for results, including costs and methods, covering SAP EWM, Logistics Execution (LE), and Trade Compliance.
- Experience with SAP Material Management (SAP MM) and SAP Sales and Distribution (SAP SD) is a plus.
- Deep experience with key integration points across SAP modules and technical components.
- Understanding of SAP technical tools and functionalities (LSMW, IDocs); SAP S/4HANA experience is a plus.
- Experience with proven project management methodologies.
- Process-oriented with high attention to detail in exercising experience-based judgement to determine appropriate methods and actions.
- Excellent written and verbal communication skills, including presentation skills.
- Excellent problem-solving and root cause analysis skills; able to quickly learn new concepts and technology.
- Able to follow processes and operational policies in selecting methods and techniques for obtaining solutions.
- Willingness to travel to remote facilities and to work overtime if projects dictate.

Key Competencies: Adaptability, Critical Thinking, Customer Focus, Decision Quality, Communication, Leadership Skills, Drive for Results, Integrity and Trust, Priority Setting, Relationship Building, Analytical Skills, Teamwork and Collaboration, Influence.

WHO WE ARE: Shure's mission is to be the most trusted audio brand worldwide, and for nearly a century, our Core Values have aligned us to be just that. Founded in 1925, we are a leading global manufacturer of audio equipment known for quality, reliability, and durability.
We engineer microphones, headphones, wireless audio systems, conferencing systems, and more. And quality doesn't stop at our products. Our talented teams strive for perfection and innovate every chance they get. We offer an Associate-first culture, flexible work arrangements, and opportunity for all. Shure is headquartered in the United States. We have more than 35 regional sales offices, engineering hubs, and manufacturing facilities throughout the Americas, EMEA, and Asia. THE MIX MATTERS: Don't check off every box in the job requirements? No problem! We recognize that every professional journey is unique and are committed to providing an equitable candidate experience for all prospective Shure Associates. If you're excited about this role, believe you've got the skills to be successful, and share our passion for creating an inclusive, diverse, equitable, and accessible work environment, then apply!

Posted 6 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Process Manager - GCP Data Engineer | Mumbai/Pune | Full-time (FT) | Technology Services | Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel - NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows.
- Analyse business problems and propose data-driven solutions that meet stakeholder objectives.
- Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure).
- Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS and Azure).
- Design and implement robust data models to efficiently store, organize, and access data for diverse use cases.
- Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources.
- Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional).
- Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency.

Technical and functional skills:
- Bachelor's degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery.
- Good knowledge of at least one database scripting platform (Oracle preferable).
- Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-prod activities.
- 5+ years of work experience, having worked as an individual contributor for 5+ years.
- Direct interaction and deep dives with VPs of deployment.
- Should work with cross-functional teams and stakeholders.
- Participate in backlog grooming and prioritizing tasks.
- Worked with the Scrum methodology.
- GCP certification desired.

About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world.
Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
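The "data quality checks and monitoring systems" responsibility in the posting above can be sketched as dataset-level checks with alert thresholds. The column name, thresholds, and sample rows below are illustrative assumptions, not from any specific platform.

```python
# Hypothetical dataset-level quality checks with alert thresholds.
def null_rate(rows, column):
    """Fraction of rows where the given column is missing."""
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def run_checks(rows):
    """Return a list of alert names for checks that failed."""
    alerts = []
    if len(rows) < 3:                       # expected minimum row count
        alerts.append("row_count_low")
    if null_rate(rows, "customer_id") > 0.1:  # tolerated null fraction
        alerts.append("customer_id_nulls")
    return alerts

rows = [{"customer_id": 1}, {"customer_id": None},
        {"customer_id": 3}, {"customer_id": 4}]
alerts = run_checks(rows)
```

In production, checks like these typically run after each pipeline load, with failing checks pushed to a monitoring or alerting system rather than returned as a list.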

Posted 6 days ago

Apply

0.0 years

6 - 9 Lacs

Hyderābād

On-site

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate, and advance faster than ever.

Responsibilities and Tasks:
- Understand the business problem and the relevant data: maintain an intimate understanding of company and department strategy; translate analysis requirements into data requirements; identify and understand the data sources that are relevant to the business problem; develop conceptual models that capture the relationships within the data; define the data-quality objectives for the solution; be a subject matter expert in data sources and reporting options.
- Architect data management systems: design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements; plan methods for archiving/deletion of information.
- Develop, automate, and orchestrate an ecosystem of ETL processes for varying volumes of data: identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static); determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model; develop processes to efficiently load the transformed data into the data management system.
- Prepare data to meet analysis requirements: work with the data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data); develop and code data extracts; follow standard methodologies to ensure data quality and data integrity; ensure that the data is fit for use in data science applications.

Qualifications and Experience:
- 0-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions.
- Ability to work with multiple operating systems (e.g., Windows, Unix, Linux).
- Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake.
- Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
- Understanding of how distributed systems work.
- Familiarity with software architecture (data structures, data schemas, etc.).
- Strong working knowledge of databases (Oracle, MSSQL, etc.), including SQL and NoSQL.
- Strong mathematics background with analytical, problem-solving, and organizational skills.
- Strong communication skills (written, verbal, and presentation).
- Experience working in a global, multi-functional environment.
- Minimum of 2 years' experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP); one or more data extraction tools (SSIS, Informatica, etc.); software development.
- Ability to travel as needed.

Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics, or a related field of study. M.S. degree preferred.

About Micron Technology, Inc.: We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all.
With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com. Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron. AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification. Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
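The data-preparation tasks listed in the posting above (handling outliers and missing data) can be sketched with two common techniques: median imputation for missing values and an IQR filter for outliers. The 1.5×IQR multiplier and the sample values are illustrative assumptions, not Micron's method.

```python
import statistics

def prepare(values):
    """Fill missing values with the median, then drop IQR outliers."""
    present = [v for v in values if v is not None]
    med = statistics.median(present)
    filled = [med if v is None else v for v in values]
    # Quartiles via the statistics module (default 'exclusive' method).
    q = statistics.quantiles(filled, n=4)
    q1, q3 = q[0], q[2]
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in filled if lo <= v <= hi]

# The None is imputed with the median (12); 500 is dropped as an outlier.
cleaned = prepare([10, 12, None, 11, 13, 500])
```

Whether to impute, drop, or flag such rows depends on the downstream analysis; the point of the sketch is that the cleaning policy should be explicit and testable.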

Posted 6 days ago

Apply

5.0 years

8 - 10 Lacs

Hyderābād

On-site

Job title: Senior Analyst – Data & Process Management Location: Hyderabad % of travel expected: Travel required as per business need, if any Job type: Permanent and full time

Our Team: Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India, set up to centralize processes and activities to support Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi globally.

Main responsibilities: The overall purpose and main responsibilities are listed below. At Sanofi, we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making that will allow us to tackle some of the world's greatest health threats. Within our commercial Insights, Analytics, and Data organization, we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple functional areas such as finance, manufacturing, product development, and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions to bring speed, scale, and shareability to our planning processes. We are seeking a dynamic talent for the role of "Senior Analyst – Data & Process Management" to support our data management team based out of the US. Robust data management is a priority for our businesses, as the product potential has major implications for a wide range of disciplines. It is essential to have someone who understands and aspires to implement innovative techniques to drive our data management strategies across franchises.
He/she will ensure on-time and accurate delivery of data requirements by collaborating with relevant stakeholders, and will ensure data availability, data quality, and data completeness are maintained as per requirements and delivered in a timely manner. This includes ensuring data consistency across sources and downstream teams, proactively identifying data requirements and gaps in the system, developing SOPs for data processes and leading process enhancements, providing training to end users on the usage of data across sources, and building advanced tools to automate or improve processes for analytical and other needs.

People:
- Maintain an effective relationship with the end stakeholders within the allocated GBU and tasks, with an end objective to develop education and communication content as per requirements.
- Actively lead and develop SBO operations associates and ensure new technologies are leveraged.
- Initiate the contracting process and related documents within defined timelines.
- Collaborate with global stakeholders for project planning, setting up timelines, and maintaining the budget.

Performance:
- Ensure data supplied by MDM is used effectively by stakeholders in commercial operations processes (forecasting, targeting, call planning, alignments, field reporting, incentive compensation).
- Administer MDM activities related to the sales operations quarterly cycle.
- Monitor data quality reports and investigate problems.
- Maintain requirements documents, business rules, and metadata.
- Provide first-level support for sales data inquiries and ad-hoc support to US MDM colleagues.
- Work to develop deal tracking analytics and reporting capabilities.
- Collaborate with Digital to enhance data access across various sources and develop tools, technology, and processes to constantly improve quality and productivity.

Process:
- Contribute to overall quality enhancement by ensuring high standards for the output produced by the digital and data management team.
- Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards.
- Manage quarterly operations for customer data on a specific frequency (weekly/monthly/quarterly/annually), along with QC checks for each refresh.
- Manage opt-out compliance for the universe of healthcare professionals.

Stakeholder:
- Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery with complete data availability.

About you
Experience: 5+ years of experience in pharmaceutical product commercial sales and customer datasets, data governance, and data stewardship. In-depth knowledge of common databases like OneKey, DHC, Medpro, DDD, IQVIA, XPO, LAAD, CRM, APLD, etc.

Soft skills: Strong learning agility; ability to manage ambiguous environments and adapt to changing needs of the business; good interpersonal and communication skills; strong presentation skills a must; a team player who is curious, dynamic, result-oriented, and can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment, with the ability to work across time zones; demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.

Technical skills: At least 5+ years of direct experience with pharmaceutical sales data and data management, with an emphasis on syndicated data, Specialty Pharmacy, customer data, patient data, sales data, and vaccines.
- Strong technical background in Reltio, SQL, Python, AWS, Snowflake, Informatica, Dataiku, etc.
- Strong knowledge of pharmaceutical sales and marketing data sources (IQVIA, Veeva, etc.).
- Knowledge of and/or experience in pharmaceutical sales operations; understands how data is applied in a pharmaceutical commercial operations context.
- Ability to translate business needs into data requirements.
- Understands the basic principles of data management and data processing.
- Understands the basic principles of data governance and data stewardship (data quality).
- Strong communication skills, including the ability to communicate the data management subject matter to a non-technical/unfamiliar internal customer.
- Experience using analytical tools like Power BI, VBA, and Alteryx is a plus.
- Proficient in Excel, Word, and PowerPoint.
- An aptitude for problem solving and strategic thinking, ensuring high-quality data output with strong quality assurance.
- Ability to synthesize complex information into clear and actionable insights.
- Proven ability to work effectively across all levels of stakeholders and diverse functions.
- Solid understanding of pharmaceutical development, manufacturing, supply chain, and marketing functions.
- Demonstrated leadership and management in driving innovation and automation leveraging advanced statistical and analytical techniques.
- Ability to collaborate effectively across differing levels of management, functions, and roles.
- Strong decision-making skills: identifying key issues, developing solutions, and gaining commitment.

Education: Bachelor's degree from an accredited four-year college or university; advanced degree in areas such as Management/Statistics/Decision Sciences/Engineering/Life Sciences/Business Analytics or a related field (e.g., PhD/MBA/Masters).

Languages: Excellent knowledge of English and strong communication skills, written and spoken.

Pursue progress, discover extraordinary. Better is out there.
Better medications, better outcomes, better science. But progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. So, let's be those people. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Posted 6 days ago

Apply

7.0 - 10.0 years

4 - 7 Lacs

Hyderābād

On-site

Summary #LI-Onsite: The Data Steward is accountable for the day-to-day management of data. Data Stewards are the subject matter experts who understand and communicate the meaning and use of information, and are responsible for working with the Data and Business Owners to implement data quality standards and processes.

About the Role

Key Responsibilities:
- Execute data stewardship tasks using common methods and tools.
- Collaborate with the business in defining business rules for the data and documenting metadata for various data elements.
- Collaborate with the Data Governance team, providing input for data standards and processes as per insights gained from the data.
- Develop a good understanding of Finance business processes and end-to-end business and data functionality.
- Work closely with the Data Owners, Data Governance and Data Quality teams, and the Global Process Owner (GPO) to ensure execution of data stewardship tasks as per the aligned stewardship process and standards.
- Liaise with the Functional Data Owners, Business Owners, and Data Maintainers to discuss and resolve data quality issues.
- Continuously monitor the progress of data quality KPIs and ensure adherence.
- Ensure continuous and effective communication with relevant team members, stakeholders, and colleagues in relation to stewardship activities.
- Review and approve data exceptions for the data created by the Data Owner/Maintenance team.
- Collaborate effectively with the data community to facilitate shared learning between business users and stewards and to promote active data quality governance through the Finance Master Data Team.
- Adhere to the Novartis Values & Behaviors.
- Ensure exemplary communication with all stakeholders, including internal associates, through regular updates with a focus on accomplishments, KPIs, best practices, change management, key events, etc.
- Implement continuous process improvement projects to improve data quality and productivity.
- Implement the Data Quality Strategy and framework; ensure the quality of master data is maintained throughout the business process.
- Provide guidance and set standards of functional excellence in methodologies, processes, and SOPs to enable enhancement of global and local data operations.

Essential Requirements:
- Bachelor's/MBA/Master's degree from a reputed university in Finance, Pharma, Computers, or IT, or equivalent.
- 7-10 years of experience working as a data steward for key business functions such as Finance, Pharmaceutical, or Healthcare.
- Hands-on experience working in the Data Quality, Data Governance, Master Data, and data management domains.
- Hands-on experience in Collibra, Informatica Data Quality, Informatica Analyst, Ataccama, Alation, or similar tools.
- Familiarity with process set-up, data quality KPIs, and operational issues/management.
- Exposure to tools like Power BI, ServiceNow, Jira, Confluence, Excel, PowerPoint, and SharePoint for analysis and documentation.
- Strong understanding of data models, the data lifecycle, and enterprise systems (e.g., SAP ECC/S/4HANA, SAP EDW).
- Proficiency in the data stewardship process, data quality monitoring, and issue remediation.
- Excellent analytical, communication, presentation, and stakeholder management skills.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting, and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards Division: Finance Business Unit: Corporate Location: India Site: Hyderabad (Office) Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited Functional Area: Audit & Finance Job Type: Full time Employment Type: Regular Shift Work: No Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message. Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Posted 6 days ago

Apply

2.0 years

4 - 8 Lacs

Hyderābād

On-site

Job title: Analyst - Data & Process Management Location: Hyderabad % of travel expected: Travel required as per business need, if any Job type: Permanent and full time

About the job
Our Team: Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India, set up to centralize processes and activities to support Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi globally.

Main responsibilities: The overall purpose and main responsibilities are listed below. At Sanofi, we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making that will allow us to tackle some of the world's greatest health threats. Within our commercial Insights, Analytics, and Data organization, we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple commercial areas such as Analytics, Campaign Ops, and market mix. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions to bring speed, scale, and shareability to our planning processes. We are seeking a dynamic talent for the role of "Analyst – Data & Process Management" to support our data management team based out of the US. Robust data management is a priority for our businesses, as the product potential has major implications for a wide range of disciplines.
It is essential to have someone who understands and aspires to implement innovative techniques to drive our data management strategies across franchises. He/she will ensure on-time and accurate delivery of data requirements by collaborating with relevant stakeholders, and will ensure data availability, data quality, and data completeness are maintained as per requirements and delivered in a timely manner. This includes:
- Ensuring data consistency across the sources and downstream teams.
- Proactively identifying data requirements and gaps in the system.
- Developing SOPs for data processes and leading process enhancements.
- Providing training to end users on usage of data across the sources.
- Building advanced tools and automating or improving processes for analytical and other needs.

People:
- Maintain effective relationships with the end stakeholders within the allocated GBU and tasks, with an end objective to develop education and communication content as per requirement.
- Actively lead and develop SBO operations associates and ensure new technologies are leveraged.
- Initiate the contracting process and related documents within defined timelines.
- Collaborate with global stakeholders for project planning, setting up timelines, and maintaining budget.

Performance:
- Ensure data supplied by CDM is used effectively by stakeholders in commercial operations processes (forecasting, targeting, call planning, alignments, field reporting, incentive compensation).
- Administer CDM activities related to the sales operations quarterly cycle.
- Monitor data quality reports and investigate problems.
- Maintain requirements documents, business rules, and metadata.
- Provide first-level support for sales data inquiries.
- Provide ad-hoc support to US CDM colleagues.
- Work to develop deal tracking analytics and reporting capabilities.
- Collaborate with Digital to enhance data access across various sources and develop tools, technology, and processes to constantly improve quality and productivity.

Process:
- Contribute to overall quality enhancement by ensuring high standards for the output produced by the digital and data management team.
- Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards.
- Refresh reports on a frequency/cycle basis (weekly/monthly/quarterly/annually), with QC checks for each refresh.
- Manage opt-out compliance for the universe of healthcare professionals.

Stakeholder:
- Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery with complete data availability.

About you

Experience: 2+ years of experience with pharmaceutical product commercial omnichannel datasets, data governance, and data stewardship. In-depth knowledge of common databases such as IQVIA, APLD, SFMC, Google Analytics, HCP engagement data, execution data, etc.

Soft skills: Strong learning agility; ability to manage ambiguous environments and adapt to the changing needs of the business; good interpersonal and communication skills; strong presentation skills a must; team player who is curious, dynamic, result-oriented, and can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment, with the ability to work across time zones; demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.

Technical skills: At least 2+ years of direct experience with pharmaceutical sales data and data management, with emphasis on syndicated data, Specialty Pharmacy, and digital/omnichannel data. Strong technical background in AWS, Snowflake, Databricks, SQL, Python, Informatica, Dataiku, etc.
- Strong knowledge of pharmaceutical sales and marketing data sources (IQVIA, Veeva, etc.).
- Knowledge of and/or experience in pharmaceutical sales operations; understands how data is applied in a pharmaceutical commercial operations context.
- Ability to translate business needs into data requirements.
- Understands the basic principles of data management and data processing.
- Understands the basic principles of data governance and data stewardship (data quality).
- Strong communication skills, including the ability to communicate the data management subject matter to a non-technical or unfamiliar internal customer.
- Experience using analytical tools like Power BI, VBA, and Alteryx is a plus.
- Proficient in Excel, Word, and PowerPoint.
- An aptitude for problem solving and strategic thinking, ensuring high-quality data output with strong quality assurance.
- Ability to synthesize complex information into clear and actionable insights.
- Proven ability to work effectively across all levels of stakeholders and diverse functions.
- Solid understanding of pharmaceutical development, manufacturing, supply chain, and marketing functions.
- Demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.
- Effectively collaborates across differing levels of management, functions, and roles.
- Strong decision-making skills: identifying key issues, developing solutions, and gaining commitment.

Education: Advanced degree in areas such as Management/Statistics/Decision Sciences/Engineering/Life Sciences/Business Analytics or a related field (e.g., PhD / MBA / Masters).

Languages: Excellent knowledge of English and strong communication skills, written and spoken.

Pursue Progress, discover Extraordinary. Better is out there. Better medications, better outcomes, better science.
But progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people. At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
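The data availability and completeness duties described in this role can be sketched in code. The following is a minimal, illustrative Python example of a per-field completeness check; the record layout, field names, and values are invented for illustration and do not come from the posting.

```python
# Illustrative data-completeness check: report the fraction of non-missing
# values per required field (record layout and field names are hypothetical).
def completeness_report(records, required_fields):
    """Return {field: fraction of records with a non-missing value}."""
    total = len(records)
    report = {}
    for field in required_fields:
        present = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = present / total if total else 0.0
    return report

records = [
    {"hcp_id": "H1", "channel": "email", "spend": 120.0},
    {"hcp_id": "H2", "channel": "", "spend": 80.0},
    {"hcp_id": "H3", "channel": "web", "spend": None},
]
print(completeness_report(records, ["hcp_id", "channel", "spend"]))
```

A real implementation would run against the governed sources (IQVIA, SFMC, etc.) and feed the results into the data quality reports the role is asked to monitor.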

Posted 6 days ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

Job Summary: We are seeking a QA professional with 3+ years of experience to join our Quality Assurance team for a data migration project. The ideal candidate will have a strong background in ETL testing, data validation, and migration projects, with expertise in creating test cases and test plans, as well as hands-on experience with data migration to cloud platforms like Snowflake. The role requires leadership capabilities to manage testing efforts, including coordinating with both onshore and offshore teams, ensuring seamless collaboration and delivery. Proficiency in ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage is essential, along with a solid understanding of SQL and semi-structured data formats such as JSON and XML.

Key Responsibilities:
- Develop and implement comprehensive test strategies and plans for data migration projects, ensuring full coverage of functional and non-functional requirements.
- Create detailed test cases, test plans, and test scripts for validating data migration processes and transformations.
- Conduct thorough data validation and verification testing, leveraging advanced SQL skills to write and execute complex queries for data accuracy, completeness, and consistency.
- Utilize ETL tools such as Talend, Informatica PowerCenter, or DataStage to design and execute data integration tests, ensuring successful data transformation and loading into target systems like Snowflake.
- Validate semi-structured data formats (JSON, XML), ensuring proper parsing, mapping, and integration within data migration workflows.
- Lead testing efforts for data migration to cloud platforms, ensuring seamless data transfer and integrity.
- Act as the QA Lead to manage and coordinate testing activities with onshore and offshore teams, ensuring alignment, timely communication, and delivery of quality outcomes.
- Document and communicate test results, defects, and issues clearly to the development and project teams, ensuring timely resolutions.
- Collaborate with cross-functional teams to create and maintain automated testing frameworks for ETL processes, improving testing efficiency and coverage.
- Monitor adherence to QA best practices and standards while driving process improvements.
- Stay updated on the latest QA tools, technologies, and methodologies to enhance project outcomes.

Qualifications:
- 3+ years of experience in Quality Assurance, focusing on ETL testing, data validation, and data migration projects.
- Proven experience creating detailed test cases, test plans, and test scripts.
- Hands-on experience with ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage.
- Proficiency in SQL for complex query writing and optimization for data validation and testing.
- Experience with cloud data migration projects, specifically working with databases like Snowflake.
- Strong understanding of semi-structured data formats like JSON and XML, with hands-on testing experience.
- Strong analytical and troubleshooting skills for resolving data quality and testing challenges.

Preferred Skills:
- Experience with automated testing tools and frameworks, particularly for ETL processes.
- Knowledge of data governance and data quality best practices.
- Familiarity with AWS or other cloud-based ecosystems.
- ISTQB or equivalent certification in software testing.
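The data validation and verification testing described above usually starts with source-vs-target reconciliation: comparing row counts and spot-checking values by key. The sketch below uses an in-memory SQLite database as a stand-in for the real source system and Snowflake target; the table names, column names, and data are invented for illustration.

```python
# Migration-test reconciliation sketch: compare row counts between source and
# target, then list keys whose values differ. SQLite stands in for the real
# source and Snowflake target; schema and data are hypothetical.
import sqlite3

def reconcile(conn, source_table, target_table, key_col):
    """Return (source count, target count, keys with mismatched amounts)."""
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    mismatches = cur.execute(f"""
        SELECT s.{key_col}
        FROM {source_table} s
        JOIN {target_table} t ON s.{key_col} = t.{key_col}
        WHERE s.amount <> t.amount
    """).fetchall()
    return src_count, tgt_count, [row[0] for row in mismatches]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src(order_id INTEGER, amount REAL);
    CREATE TABLE tgt(order_id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 99.0), (3, 30.0);
""")
print(reconcile(conn, "src", "tgt", "order_id"))  # → (3, 3, [2])
```

In a real project the same queries would run against the actual source and Snowflake, and semi-structured columns (JSON/XML) would get additional parsing checks.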

Posted 6 days ago

Apply

0 years

5 - 10 Lacs

Gurgaon

On-site

Project description: We've been engaged by a large European bank to provide resources to their Markets Program development team working on a wide range of projects. We require an experienced Senior SSIS Developer to work within the existing team.

Responsibilities
- Have a continuous improvement mindset.
- Take part in software design activities and discussions.
- Write production-quality code.
- Work collaboratively in the team.

Skills

Must have
- 5+ years of SQL Server Integration Services (SSIS) and ETL experience on SQL Server 2016. You should have excellent knowledge of SQL and SSIS; the expectation is to understand the existing complex queries/SSIS packages and improve performance wherever required.
- Software: SQL Server 2012/2016/2017/onwards, SQL Server Integration Services 2008/2012/2016/2017.
- Tools: Jira, TeamCity, Git.
- Programming languages: C#, VB.NET.
- Exceptional understanding of and experience with relational datasets, data warehouses, and ETL techniques.
- Experience developing stored procedures and performance tuning SQL Server databases.
- Extensive knowledge and experience developing stored procedures, views, and functions using Microsoft SQL Server.
- Light database administration skills (indexing, tuning, optimization, etc.).
- Ability to write efficient, complex queries against very large data sets.
- Ability to model, analyze, and manage large volumes of data.
- Create jobs and schedules to automate ETL packages within SSIS.
- Strong analytical and quantitative skills.
- Experience in different phases of the Software Development Life Cycle (SDLC), including planning, design, development, and testing of software applications.
- Experience in cloud computing and Amazon Web Services.
- Working knowledge of GitHub, Rally/Jira, and Artifactory.

Nice to have
- Basic knowledge of QlikView and Control-M is beneficial but not mandatory.
- A track record working as a developer in the Financial Services industry and/or Treasury function, or exposure to loans, interest rates, and balance-sheet-related applications/datasets, is desirable but not mandatory.

Other Languages : English: B1 Intermediate
Seniority : Regular
Location : Gurugram, India
Req. VR-115414
Skills : ETL (Informatica, Ab Initio etc.)
Industry : BCM
Posted : 26/06/2025
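The "light database administration skills (indexing, tuning)" requirement above boils down to reading execution plans and adding indexes where scans hurt. The sketch below demonstrates the idea with SQLite's EXPLAIN QUERY PLAN as a lightweight stand-in for SQL Server plans (the real work would use SSMS execution plans and SET STATISTICS); the table, data, and index name are invented.

```python
# Indexing/tuning sketch: show the query plan changing from a table scan to an
# index search after an index is created. SQLite stands in for SQL Server;
# the trades table and ix_trades_book index are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades(trade_id INTEGER, book TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(i, f"book{i % 10}", i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM trades WHERE book = 'book3'"
before = plan(query)  # full table scan
conn.execute("CREATE INDEX ix_trades_book ON trades(book)")
after = plan(query)   # the index now satisfies the predicate
print(before)
print(after)
```

The same discipline applies to SSIS package sources: a query that scans millions of rows per lookup is usually the first thing to tune.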

Posted 6 days ago

Apply

0 years

2 - 4 Lacs

Noida

Remote

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant – Production Support

As Analyst, IT Operations, you will be involved in 24x7 L2 production support in an onshore-offshore model, receiving, analyzing, and identifying solutions for all types of priority production issues/tickets. You will be the primary technical go-to resource, work in shifts on a rotation basis, and may have to provide on-call support in a 24x7 environment and during non-business hours when needed. You should be able to handle technical issues independently and must be hands-on, proactive, flexible in schedule, and quick to adapt to meet organizational needs.

Specific responsibilities include (but are not limited to) supporting the production support team and their work assignments and service levels, designing and executing process improvements, and undertaking tasks as needed to allow the Production Support Manager to focus on overall service improvements. This resource will be key in identifying remediation tasks and implementing solutions to reduce the number of production incidents, interacting with business users for requirement clarification and resolution of reported incidents, and performing root cause analysis for major incidents.
This resource will also be key in providing workaround solutions for production batch issues, assigning tickets to team members, tracking fixes and verifying break-fixes, performing root cause analysis and arriving at permanent fixes to avoid recurrence, and driving process improvements for system stability and performance. Ensure compliance with corporate standards, policies, and regulations (SOX, PHI, PII, etc.). This role will collaborate with teams spanning multiple business units.

Offshore shifts (all on a rotational basis):
- Mon-Fri: rotation between two 9-hour shifts (6:30 AM – 3:30 PM IST or 12:30 PM – 9:30 PM IST); work from office, India.
- Sat-Sun and US Eastern holidays: rotation among three shifts (6:30 AM – 3:30 PM IST, 12:30 PM – 9:30 PM IST, or 9:15 PM – 6:30 AM IST); work from office, India.

Responsibilities
- Troubleshoot batch events in the technologies listed below.
- Perform analysis and code deployment for recurring problems; provide recommendations and work with outside teams to identify and implement solutions.
- Perform analysis and provide recommendations to enhance support processes, such as documentation and/or monitoring.
- Experience in production support/maintenance in an onshore-offshore model environment.
- Must have proven strong analytical skills and a positive attitude toward analyzing, recovering from, and fixing issues.
- Must have hands-on development experience and be able to deep-dive into issues in technologies including: Unix/Linux shell scripting, PL/SQL, Informatica 10.4 (PowerCenter); additionally: SQL Developer/Toad, PuTTY, Control-M scheduling, Teradata SQL Assistant, Teradata Viewpoint, SAP, etc.
- Release/deployment, operational readiness, and production governance experience is a must.
- Assure quality, security, and compliance requirements are met for supported areas.
- Must be a team player; this position will work closely with other vendors and internal technical partners.
- Analyze performance trends and recommend process improvements to ensure SLAs are met.
- Must be a self-starter and quick learner/adopter of new or required technologies to support day-to-day support activities.
- Ability to understand complex problems, identify root causes, and remain goal-oriented within a dynamic environment.
- Working experience in the Retail/Pharmacy area is a plus.
- Willing to learn new technologies as needed.
- Proven high performer, demonstrated by consistent high-performance reviews and exceptional customer service management.
- Basic understanding of application development/architecture, change management, and incident and problem management is a plus.
- No remote / work from home preferred.

Qualifications we seek in you!

Minimum qualifications/skills:
- Bachelor's degree or foreign equivalent in Computer Science, Information Systems, Engineering, or a related field is required.
- Working experience in the Retail/Pharmacy area is a plus.
- Willing to learn new technologies as needed.

Preferred qualifications/skills:
- Computer competency and literacy in Microsoft Windows applications, especially Excel, Word, PowerPoint, and Outlook, is a must.
- Strong interpersonal and communication skills, including the ability to interact at all levels of the organization (senior leadership, business leadership, technical resources, 3rd-party vendors) in person and over email, is a plus.
- Commitment to the shift schedule is a must.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on X, Facebook, LinkedIn, and YouTube. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job : Principal Consultant
Primary Location : India-Noida
Schedule : Full-time
Education Level : Bachelor's / Graduation / Equivalent
Job Posting : Jun 27, 2025, 5:11:24 AM
Unposting Date : Ongoing
Master Skills List : Consulting
Job Category : Full Time
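First-level batch triage of the kind this role describes typically starts with scanning job logs for failures before assigning tickets. The sketch below shows the idea in Python; the log format, job names, and return codes are invented for illustration and are not the actual Control-M or Informatica log format.

```python
# Hypothetical batch-triage sketch: scan a job log for failed entries and
# return (job, return code) pairs for ticket routing. The log format here is
# invented; real Control-M/Informatica logs differ.
import re

LOG = """\
2025-06-27 01:05:12 JOB wf_load_claims STATUS SUCCEEDED
2025-06-27 01:20:44 JOB wf_load_members STATUS FAILED RC=8
2025-06-27 02:02:03 JOB wf_refresh_marts STATUS FAILED RC=4
"""

def failed_jobs(log_text):
    """Return (job_name, return_code) for every FAILED entry in the log."""
    pattern = re.compile(r"JOB (\S+) STATUS FAILED RC=(\d+)")
    return [(m.group(1), int(m.group(2))) for m in pattern.finditer(log_text)]

print(failed_jobs(LOG))  # → [('wf_load_members', 8), ('wf_refresh_marts', 4)]
```

In practice a script like this would feed the ticket-assignment and root-cause-analysis steps the posting lists, rather than replacing proper monitoring.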

Posted 6 days ago

Apply

0 years

0 Lacs

Noida

On-site

Req ID: 328474. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica / PL/SQL / PowerCenter / IICS developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal.
- ETL code development, unit testing, source code control, technical specification writing, and production implementation.
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux.
- Interact with BAs and other teams for requirement clarification, query resolution, testing, and sign-offs.
- Develop software that conforms to a design model within its constraints.
- Prepare documentation for design, source code, and unit test plans.
- Ability to work as part of a global development team.
- Should have good knowledge of the healthcare domain and data warehousing concepts.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
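The ETL unit-testing duty in this posting means exercising each transformation rule in isolation before deployment. The sketch below shows that workflow in Python; the normalization rule (trimming/uppercasing a code and defaulting a missing region) and the field names are invented purely to illustrate the pattern, not taken from any real mapping.

```python
# Unit-testable transformation sketch: one rule, isolated from the pipeline,
# so it can be verified before production implementation. The rule and the
# field names are hypothetical.
def transform_row(row):
    """Normalize one staging row before load."""
    return {
        "member_code": row["member_code"].strip().upper(),
        "region": row.get("region") or "UNKNOWN",
    }

sample = {"member_code": "  ab12 ", "region": None}
print(transform_row(sample))  # → {'member_code': 'AB12', 'region': 'UNKNOWN'}
```

In an Informatica shop the same logic would live in a mapping expression; keeping a reference implementation like this makes the unit test plan concrete.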

Posted 6 days ago

Apply

7.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Req ID: 329228. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Platform Administrator to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Qualifications
- Excellent oral and written English communication skills; specifically needs to be able to communicate well on phone calls.
- Must be able to thrive under pressure and have a strong sense of ownership and responsibility for the project.
- Minimum of 7-10 years of experience in the administration, architecture, design, and development of ETL programs using Informatica’s data integration tools; Informatica experience must include working with large, mission-critical data systems (multiple terabytes in size, with millions of rows processed per day).
- Must be able to demonstrate mastery of Informatica command line utilities.
- Must be able to demonstrate mastery of quickly debugging and resolving issues that can arise with Informatica programs.
- Must know how to navigate the Informatica metadata repository.
- Expert knowledge of Informatica version 8.5 and above.

Responsibilities: This person will be part of the team that supports a growing Data Warehousing and Business Intelligence environment using Informatica, Oracle 11g/10g RDBMS, SQL Server 2005/2008, Kibana, and Elasticsearch. This individual will be responsible for:
- Administering the Informatica environment on UNIX, with hands-on Elasticsearch administration experience.
- Apply patches, upgrades, and hotfixes for Informatica PowerCenter.
- Elasticsearch administration (cluster setup, Fleet Server, Agent, Logstash, Kibana, data modelling concepts) and the Elastic Stack.
- Proficiency with Elasticsearch DSL for complex query development.
- Create ETL processes using Ingest Pipelines.
- Monitor performance, troubleshoot, and tune ETL processes.
- Will be responsible for 24x7 support.
- Work with the Informatica vendor on any issues that arise.
- Support the Informatica developer team as needed.
- Ensure development teams are following appropriate standards.
- Promote and deploy Informatica code.
- Assist the development team at design time, ensuring code will run at scale.
- Serve as an Informatica developer as needed to get work completed.
- Write and maintain BASH shell scripts for administering the Informatica environment.
- Design and implement appropriate error-handling procedures.
- Develop project, documentation, and ETL standards in conjunction with data architects.
- Work with other team members to ensure standards around monitoring and management of the Informatica environment within the context of existing enterprise monitoring solutions.
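The "create ETL processes using Ingest Pipelines" responsibility refers to Elasticsearch's ingest pipelines, which are defined as a JSON document of processors and registered via PUT _ingest/pipeline/&lt;name&gt;. The sketch below builds such a pipeline body in Python; the pipeline name, field names, and processor choices (the real `lowercase` and `set` ingest processors) are illustrative, not from the posting.

```python
# Sketch of an Elasticsearch ingest-pipeline definition: the JSON body that
# would be PUT to _ingest/pipeline/etl-audit-normalize. The description,
# fields, and processor parameters are hypothetical examples.
import json

pipeline = {
    "description": "Normalize incoming ETL audit events",
    "processors": [
        # lowercase processor: normalize the status field in place
        {"lowercase": {"field": "status"}},
        # set processor: stamp every document with a pipeline version
        {"set": {"field": "pipeline_version", "value": "1"}},
    ],
}

body = json.dumps(pipeline, indent=2)
print(body)
```

Registering it would be a single HTTP PUT of this body (or a call through the official Elasticsearch client); documents indexed with `?pipeline=etl-audit-normalize` then pass through the processors in order.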

Posted 6 days ago

Apply