
2667 Data Quality Jobs - Page 13

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems, and be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines for efficient data processing.
- Implement ETL processes to ensure seamless data migration.
- Collaborate with cross-functional teams to optimize data solutions.
- Conduct data quality assessments and implement improvements.
- Stay updated on industry trends and best practices for data management.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and database design.
- Experience with cloud-based data platforms such as AWS or Azure.
- Hands-on experience with SQL and scripting languages such as Python.
- Knowledge of data governance principles and practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
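For illustration only, here is a minimal sketch of the kind of data quality check such a role might automate against Snowflake, using the snowflake-connector-python package; the connection parameters and the CUSTOMERS table are hypothetical placeholders, not part of the listing.

```python
# Minimal sketch of a data quality check against Snowflake, using
# snowflake-connector-python. Connection parameters and the CUSTOMERS
# table/columns are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

checks = {
    "null_customer_ids": "SELECT COUNT(*) FROM CUSTOMERS WHERE CUSTOMER_ID IS NULL",
    "duplicate_emails":  "SELECT COUNT(*) FROM (SELECT EMAIL FROM CUSTOMERS GROUP BY EMAIL HAVING COUNT(*) > 1)",
}

try:
    cur = conn.cursor()
    for name, sql in checks.items():
        count = cur.execute(sql).fetchone()[0]
        status = "OK" if count == 0 else f"FAILED ({count} rows)"
        print(f"{name}: {status}")
finally:
    conn.close()
```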

Posted 5 days ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Apache Spark
Good-to-have skills: AWS Glue
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and optimize data workflows, ensuring that the data infrastructure effectively supports the organization's analytical needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processing workflows to enhance efficiency and performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Good-to-have skills: Experience with AWS Glue.
- Strong understanding of data pipeline architecture and design.
- Experience with ETL processes and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Spark.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
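As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch that filters out bad records before loading; the S3 paths and column names are hypothetical.

```python
# Minimal PySpark sketch of a pipeline step that validates data quality
# before loading. File paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_quality_check").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path

# Reject rows with missing keys or negative amounts, keep the rest.
bad_rows = orders.filter(F.col("order_id").isNull() | (F.col("amount") < 0))
good_rows = orders.subtract(bad_rows)

print(f"rejected {bad_rows.count()} of {orders.count()} rows")

good_rows.write.mode("overwrite").parquet("s3://example-bucket/clean/orders/")
bad_rows.write.mode("overwrite").parquet("s3://example-bucket/quarantine/orders/")
```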

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- A Databricks resource with Azure cloud experience is required.
- Expected to perform independently and become an SME.
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
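Below is a minimal sketch of a Databricks-style ETL step matching the listing above, assuming a notebook environment where `spark` is pre-provided and Delta Lake is available; the storage paths and table name are hypothetical.

```python
# Sketch of a simple ETL step as it might appear in a Databricks notebook:
# read raw CSV from ADLS, standardize a few columns, and write a Delta table.
# Paths and table names are hypothetical; `spark` is provided by the
# Databricks runtime, which also bundles Delta Lake.
from pyspark.sql import functions as F

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/"))

clean = (raw
         .withColumn("sale_date", F.to_date("sale_date", "yyyy-MM-dd"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["sale_id"]))

(clean.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("analytics.sales_clean"))
```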

Posted 5 days ago

Apply

15.0 - 25.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Project Role: IoT Architect
Project Role Description: Design end-to-end IoT platform architecture solutions, including data ingestion, data processing, and analytics across different vendor platforms for highly interconnected device workloads at scale.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record of providing functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. The candidate should be well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.

Roles & Responsibilities:
1. The Industrial Data Architect will be responsible for developing and overseeing industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
2. Focus on designing, building, and managing the data architecture of industrial systems.
3. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
4. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, and data virtualization.
5. Create scalable and secure data structures, integrate with existing systems, and ensure efficient data flow.

Professional & Technical Skills:
1. Must-have skills: Domain knowledge of Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.
2. Data Modeling and Architecture: Proficiency in data modeling techniques (conceptual, logical, and physical models).
3. Knowledge of database design principles and normalization.
4. Experience with data architecture frameworks and methodologies (e.g., TOGAF).
5. Database Technologies - Relational Databases: Expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
6. NoSQL Databases: Experience with at least one NoSQL database such as MongoDB, Cassandra, or Couchbase for handling unstructured data.
7. Graph Databases: Proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB. Understanding of graph data models, including property graphs and RDF (Resource Description Framework).
8. Query Languages: Experience with at least one query language such as Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop). Familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language). Exposure to semantic web technologies and standards.
9. Data Integration and ETL (Extract, Transform, Load): Proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
10. Experience with data integration tools and techniques to consolidate data from various sources.
11. IoT and Industrial Data Systems: Familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
12. Experience with IoT data platforms such as AWS IoT, Azure IoT Hub, and Google Cloud IoT Core.
13. Experience working with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink.
14. Ability to design and implement real-time data pipelines. Familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
15. Understanding of event-driven design patterns and practices. Experience with message brokers such as RabbitMQ or ActiveMQ.
16. Exposure to edge computing platforms such as AWS IoT Greengrass or Azure IoT Edge.
17. AI/ML and GenAI: Experience preparing data for AI/ML/GenAI applications.
18. Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
19. Cloud Platforms: Experience with cloud data services from at least one provider, such as AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
20. Data Warehousing and BI Tools: Expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
21. Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
22. Data Governance and Security: Understanding of data governance principles, data quality management, and metadata management.
23. Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
24. Big Data Technologies: Experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
25. Understanding of distributed computing and data processing frameworks.

Additional Information:
1. A minimum of 15-18 years of progressive information technology experience is required.
2. This position is based at the Bengaluru location.
3. 15 years of full-time education is required.
4. An AWS Certified Data Engineer - Associate, Microsoft Certified: Azure Data Engineer Associate, or Google Cloud Certified Professional Data Engineer certification is mandatory.
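To illustrate the IIoT ingestion protocols mentioned above (MQTT in particular), here is a minimal subscriber sketch using the paho-mqtt 1.x callback API; the broker address and topic hierarchy are hypothetical.

```python
# Minimal sketch of IIoT data ingestion over MQTT using the paho-mqtt client
# (1.x callback API). Broker address and topic are hypothetical placeholders.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"
TOPIC = "plant1/line3/+/telemetry"   # hypothetical topic hierarchy

def on_connect(client, userdata, flags, rc):
    print(f"connected with result code {rc}")
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```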

Posted 5 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Apache Spark
Good-to-have skills: AWS Glue
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, making data accessible and actionable for stakeholders.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate and contribute in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design data models that meet business needs.
- Develop and maintain documentation for data processes and workflows to ensure clarity and compliance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Good-to-have skills: Experience with AWS Glue.
- Strong understanding of data processing frameworks and methodologies.
- Experience in building and optimizing data pipelines for performance and scalability.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
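As a sketch of the Glue-based work this listing alludes to, here is the skeleton of an AWS Glue job; it only runs inside the Glue environment (where the awsglue libraries are provided), and the catalog database, table, and output path are hypothetical.

```python
# Skeleton of an AWS Glue ETL job. Runs inside the Glue environment, where the
# awsglue libraries are available. Database, table, and output path names are
# hypothetical placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a cataloged source table, drop rows with null keys, and write Parquet.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)
df = dyf.toDF().dropna(subset=["order_id"])

df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```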

Posted 5 days ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Your Role and Responsibilities
The Marketing Database Specialist is a vital role that works closely across the entire Marketing organization to increase marketing database fidelity and growth. This individual will help add potential leads, clean up data, conduct data research, assist with data automation, and follow direction from our database owners as needed.

What we want you to do:
- Upload data lists to Salesforce.com, RingLead, or other customer database tools.
- Clean up incomplete or incorrectly formatted data.
- Do extensive research to track down missing data fields via off-the-shelf tools or other means.
- Assist in data-related automation and ad hoc tasks.
- Provide basic reporting.

Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise (basic qualifications and competencies):
- 3-5 years' experience in a customer marketing/sales database role.
- Proven experience with Salesforce.com.
- Experience with data quality tools (RingLead or others).
- Expert skills in Microsoft Excel.
- Exceptional attention to detail and a bias for action.
- Experience in a corporate environment is preferred, ideally in the B2B SaaS software space.
- Not required but beneficial: experience with PPM (Project Portfolio Management) tools such as Workfront or Wrike, or with project management tools such as Jira, Trello, or Asana.
- Effective communicator who proactively flags issues of quality or timeliness before they become problems.
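As a sketch of the list-cleanup work described above, the snippet below uses pandas to normalize, deduplicate, and flag a lead file before upload; the file and column names are hypothetical, and no specific CRM API is assumed.

```python
# Sketch of the kind of list cleanup done before uploading leads to a CRM:
# normalize formatting, drop obvious duplicates, and flag rows that still
# need research. The input file and column names are hypothetical.
import pandas as pd

leads = pd.read_csv("new_leads.csv")

# Standardize basic formatting.
leads["email"] = leads["email"].str.strip().str.lower()
leads["company"] = leads["company"].str.strip().str.title()
leads["phone"] = leads["phone"].str.replace(r"[^\d+]", "", regex=True)

# Drop duplicates on email, keeping the most recently touched record.
leads = (leads.sort_values("last_updated")
              .drop_duplicates(subset="email", keep="last"))

# Flag rows missing fields that must be researched before upload.
required = ["email", "company", "country"]
leads["needs_research"] = leads[required].isna().any(axis=1)

leads.to_csv("leads_ready_for_upload.csv", index=False)
print(leads["needs_research"].value_counts())
```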

Posted 5 days ago

Apply

15.0 - 20.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Project Role: IoT Architect
Project Role Description: Design end-to-end IoT platform architecture solutions, including data ingestion, data processing, and analytics across different vendor platforms for highly interconnected device workloads at scale.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record of providing functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. The candidate should be well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing.

Roles & Responsibilities:
1. The Industrial Data Architect will be responsible for developing and overseeing industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations.
2. Focus on designing, building, and managing the data architecture of industrial systems.
3. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests.
4. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, and data virtualization.
5. Create scalable and secure data structures, integrate with existing systems, and ensure efficient data flow.

Professional & Technical Skills:
1. Must-have skills: Domain knowledge of Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science.
2. Data Modeling and Architecture: Proficiency in data modeling techniques (conceptual, logical, and physical models).
3. Knowledge of database design principles and normalization.
4. Experience with data architecture frameworks and methodologies (e.g., TOGAF).
5. Database Technologies - Relational Databases: Expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
6. NoSQL Databases: Experience with at least one NoSQL database such as MongoDB, Cassandra, or Couchbase for handling unstructured data.
7. Graph Databases: Proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB. Understanding of graph data models, including property graphs and RDF (Resource Description Framework).
8. Query Languages: Experience with at least one query language such as Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop). Familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language). Exposure to semantic web technologies and standards.
9. Data Integration and ETL (Extract, Transform, Load): Proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
10. Experience with data integration tools and techniques to consolidate data from various sources.
11. IoT and Industrial Data Systems: Familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
12. Experience with IoT data platforms such as AWS IoT, Azure IoT Hub, and Google Cloud IoT Core.
13. Experience working with one or more streaming data platforms such as Apache Kafka, Amazon Kinesis, or Apache Flink.
14. Ability to design and implement real-time data pipelines. Familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
15. Understanding of event-driven design patterns and practices. Experience with message brokers such as RabbitMQ or ActiveMQ.
16. Exposure to edge computing platforms such as AWS IoT Greengrass or Azure IoT Edge.
17. AI/ML and GenAI: Experience preparing data for AI/ML/GenAI applications.
18. Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
19. Cloud Platforms: Experience with cloud data services from at least one provider, such as AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
20. Data Warehousing and BI Tools: Expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
21. Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
22. Data Governance and Security: Understanding of data governance principles, data quality management, and metadata management.
23. Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
24. Big Data Technologies: Experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
25. Understanding of distributed computing and data processing frameworks.

Additional Information:
1. A minimum of 15-18 years of progressive information technology experience is required.
2. This position is based at the Bengaluru location.
3. 15 years of full-time education is required.
4. An AWS Certified Data Engineer - Associate, Microsoft Certified: Azure Data Engineer Associate, or Google Cloud Certified Professional Data Engineer certification is mandatory.
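To complement the streaming platforms named above, here is a minimal sketch of consuming device telemetry from Kafka with the kafka-python package and applying a simple in-stream quality rule; the broker, topic, and payload fields are hypothetical.

```python
# Minimal sketch of consuming device telemetry from a Kafka topic with the
# kafka-python package. Broker address, topic, and payload fields are
# hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "device-telemetry",                       # hypothetical topic
    bootstrap_servers=["kafka.example.com:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
    group_id="quality-monitor",
)

for message in consumer:
    reading = message.value
    # Simple in-stream data quality rule: flag out-of-range temperatures.
    if not (-40 <= reading.get("temperature_c", 0) <= 125):
        print(f"suspect reading from {reading.get('device_id')}: {reading}")
```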

Posted 5 days ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies.
- Good-to-have skills: Experience with data warehousing solutions.
- Strong understanding of ETL processes and tools.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or SQL for data manipulation.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based in Mumbai.
- 15 years of full-time education is required.
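As a toy illustration of the dimensional modeling this role involves, the sketch below creates a small star schema; SQLite is used only so the DDL runs as-is, and the schema itself is hypothetical.

```python
# Toy sketch of a dimensional (star schema) model: one fact table with
# foreign keys to two dimensions. SQLite is used so the DDL is runnable
# as-is; the schema is a hypothetical example.
import sqlite3

ddl = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,
    segment      TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT NOT NULL,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print("star schema created:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```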

Posted 5 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.
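For illustration of Snowflake-oriented modeling work, here is a hedged sketch of a Type 1 slowly changing dimension upsert issued through snowflake-connector-python; the tables, columns, and connection parameters are hypothetical.

```python
# Sketch of a Type 1 slowly changing dimension upsert in Snowflake, issued
# through snowflake-connector-python. Table and column names are hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO dim_product AS tgt
USING stg_product AS src
    ON tgt.product_id = src.product_id
WHEN MATCHED THEN UPDATE SET
    tgt.product_name = src.product_name,
    tgt.category     = src.category
WHEN NOT MATCHED THEN INSERT (product_id, product_name, category)
    VALUES (src.product_id, src.product_name, src.category)
"""

conn = snowflake.connector.connect(
    account="my_account", user="modeler", password="********",
    warehouse="MODEL_WH", database="ANALYTICS", schema="DIM",
)
try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```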

Posted 5 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 5 days ago

Apply

10.0 - 20.0 years

45 - 50 Lacs

Mumbai

Work from Office

Data Analyst, GSA Data Divisional Office - Risk, Finance and Treasury, AVP

Your responsibilities will focus on managing change-the-bank activities to uplift our data capabilities to meet recently revised regulatory and internal policies and standards. This is an exciting opportunity to collaborate with various groups who actively originate and consume data, including business lines, infrastructure functions, and large-scale change programmes. As part of your role, you will gain a thorough understanding of how data is an integral component of our business, with oversight of GCOO data assets and the enterprise data management lifecycle, upholding the expected standards to maintain global regulatory compliance. Deutsche Bank is investing heavily in optimizing our business processes and regulatory outcomes by using data in the best ways possible, and you will be directly shaping the strategy to do so.

Group Strategic Analytics is part of the Group Chief Operating Office (GCOO), which acts as the bridge between the Bank's businesses and infrastructure functions to help deliver the efficiency, control, and transformation goals of the Bank.

Your key responsibilities:
- Assigned to several data strategy projects, including related governance and KPIs.
- Key GCOO DDO data analyst and point of contact for relevant Change the Bank and Run the Bank data priorities, in partnership with divisional and regional stakeholders.
- Monitor data-management-related regulatory changes and perform gap analysis against DB processes.
- Drive implementation of the Data Quality Control Framework to ensure completeness, accuracy, and timeliness for the GCOO-mandated scope of data, ensuring compliance with strategic data KPIs.
- Identify the most critical and strategic data to be brought under governance and facilitate right-sourcing via strategic platforms.
- Support relationships with all relevant internal and external stakeholder groups, representing the GCOO data strategy and DDO function.

Your skills and experience:
- 10+ years of experience in banking, ideally in data-management-related roles.
- Experience managing complex change programmes (ideally cross-divisional and cross-location).
- Data analysis: ability to investigate and present details of lineage, completeness, and transformations performed upon data via flows and processes.
- Ideally, experience with industry-standard data management tools such as Collibra and Solidatus.
- Demonstrable experience in translating core data standards into practical implementation.
- Excellent organizational skills and high attention to detail, with the ability to work under pressure and deliver to tight deadlines.
- Excellent interpersonal skills with a demonstrable ability to engage and influence stakeholders.
- Strong written and verbal communication skills, with the ability to explain complex problems in a clear and structured way.
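Purely as an illustration (not Deutsche Bank's actual framework), completeness and timeliness KPIs of the kind described above could be computed along these lines with pandas; the input file, fields, and thresholds are hypothetical.

```python
# Illustrative sketch of completeness and timeliness KPIs for a dataset,
# computed with pandas. The input file, critical fields, and cutoff rule
# are hypothetical placeholders.
import pandas as pd

trades = pd.read_csv("gcoo_trades_extract.csv", parse_dates=["booking_date"])

critical_fields = ["trade_id", "counterparty_id", "notional", "booking_date"]

# Completeness: share of rows where every critical field is populated.
completeness = trades[critical_fields].notna().all(axis=1).mean()

# Timeliness: share of records booked within the last business day.
cutoff = pd.Timestamp.today().normalize() - pd.tseries.offsets.BDay(1)
timeliness = (trades["booking_date"] >= cutoff).mean()

kpis = {"completeness": round(completeness, 4), "timeliness": round(timeliness, 4)}
print(kpis)
```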

Posted 5 days ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Role and responsibilities:
- Represent eClerx in client pitches and external forums.
- Own platform and expertise through various COE activities and content generation to promote practice and business development.
- Lead continuous research and assessments to explore the best and latest platforms, approaches, and methodologies.
- Contribute to developing the practice area through best practices, ideas, and points of view in the form of white papers and micro articles.
- Lead or partner in multi-discipline assessments and workshops at client sites to identify new opportunities.
- Lead key projects and provide development/technical leadership to junior resources.
- Drive solution design and build to ensure scalability, performance, and reuse.
- Design robust data architectures, considering performance, data quality, scalability, and data latency requirements.
- Recommend and drive consensus around preferred data integration and platform approaches, including Azure and Snowflake.
- Anticipate data bottlenecks (latency, quality, speed) and recommend appropriate remediation strategies.
- This is a hands-on position with a significant development component, and the ideal candidate is expected to lead the technical development and delivery of highly visible and strategic projects.

Technical and functional skills:
- Bachelor's degree and at least 2-3 large-scale cloud implementations within the Retail, Manufacturing, or Technology industries.
- 10+ years of overall experience with data management and cloud engineering.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Snowflake, Teradata, and compatible ETL technologies.
- Strong attention to detail and ability to collaborate with multiple parties, including analysts, data subject matter experts, external labs, etc.

Posted 5 days ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Roles and responsibilities:
- Understand clients' requirements and provide effective and efficient solutions in Snowflake.
- Understand data transformation and translation requirements and which tools to leverage to get the job done.
- Carry out proofs of concept (POCs) in areas that need R&D on cloud technologies.
- Understand data pipelines and modern ways of automating them using cloud-based tooling; test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and functional skills:
- Master's or Bachelor's degree in Engineering, Analytics, or a related field.
- 7+ years of total experience, with roughly 4+ years of relevant hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Time Travel, replication, and zero-copy cloning.
- Strong working knowledge of Python.
- In-depth understanding of data warehouses and ETL tools.
- Experience with Snowflake APIs is mandatory.
- Strong knowledge of scheduling and monitoring using Airflow DAGs.
- Strong experience writing SQL queries, joins, stored procedures, and user-defined functions.
- Sound knowledge of data architecture and design.
- Hands-on experience developing Python scripts for data manipulation.
- Snowflake SnowPro Core certification.
- Experience developing scripts using Unix, Python, etc.
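A minimal Airflow DAG sketch for the scheduling and monitoring described above, with one task running a Snowflake data quality query; the DAG id, schedule, credentials, and query are hypothetical, and the Airflow 2.4+ style `schedule` argument is assumed.

```python
# Minimal Airflow DAG sketch: one task runs a Snowflake data quality query
# via a Python callable. DAG id, schedule, credentials, and query are
# hypothetical; assumes Airflow 2.4+ and the Snowflake connector on workers.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_quality_check():
    import snowflake.connector
    conn = snowflake.connector.connect(
        account="my_account", user="airflow", password="********",
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    try:
        bad = conn.cursor().execute(
            "SELECT COUNT(*) FROM ORDERS WHERE ORDER_ID IS NULL"
        ).fetchone()[0]
        if bad:
            raise ValueError(f"{bad} orders with NULL ORDER_ID")
    finally:
        conn.close()

with DAG(
    dag_id="snowflake_quality_checks",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="orders_null_check", python_callable=run_quality_check)
```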

Posted 5 days ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Pune

Work from Office

Job Title: PM | Location: Mumbai / Pune
Skills: End-to-end KYC, Periodic Review, AML, and Due Diligence
Shift timings: APAC, EMEA, NAM

Roles & Responsibilities
The KYC Lifecycle Team performs detailed KYC checks on new and existing clients to fulfil KYC & AML regulatory requirements in multiple jurisdictions, allowing our clients to engage in a risk-compliant manner. This role requires an understanding of the KYC lifecycle and management of the complete KYC process. You will liaise with our clients and front and middle office staff to obtain KYC documentation of existing customers as part of reviews, interface with internal stakeholders on any KYC-lifecycle-related queries, and conduct research on clients to obtain KYC data from public sources. You will also screen entity names against sanctions, adverse news, and politically exposed persons lists, provide dispositions, and escalate due diligence findings to Financial Crimes Compliance.

Key responsibilities include:
- Maintain working knowledge of various internal processes, including KOPs, local regulations, and guidelines.
- Publish MIS on the status of the KYC Periodic Review to the senior management of the Bank (both regional and local).
- Ensure that there are no critical audit points as a result of regulatory or internal audits.
- Roll out any new KYC policies/regulations and ensure proper understanding of them within the team and among stakeholders.
- Deliver high standards of client service while ensuring that all internal (e.g. Risk) and external (e.g. Compliance) standards and requirements are met in full.
- Ensure appropriate escalation policies exist and are followed.
- Work collectively with management to develop and maintain a motivated and professionally trained staff, ensuring appropriate capacity planning, adherence to and improvement in performance and quality standards, and appropriate career development.
- Provide leadership support, guidance, and coaching to the team.
- Keep key stakeholders informed of progress and challenges, escalating issues where appropriate.
- Work in a high-pressure and time-sensitive environment.
- Perform quality checks to ensure that defined guidelines are adhered to for excellent QA scores.
- Act as the process owner and ensure end-to-end management of all activities associated with the process.
- Ensure adherence to standards and procedures, and identify risk mitigants wherever there is a control issue.

Qualification and skills:
- Bachelor's degree (B.Com, BBA, BBM, BCA) / Master's degree (M.Com, MBA, PGDM).
- 6 to 9 years' experience in AML Compliance & KYC within the financial services industry, with experience in data quality and controls.
- Work closely with other internal teams to ensure top-of-the-line service to clients.
- Prioritize tasks and ensure adherence to timelines for completion of activities.
- Initiate and lead change management initiatives within the team.
- Ensure structured upward and downward communication.
- Responsible for ensuring that cases are managed effectively and consistently in line with the agreed process, that all aspects of delivery are running effectively, and, if necessary, for escalating issues.
- Liaise with multiple internal stakeholders to ensure the smooth delivery of KYC & AML services to clients.
- Support the production of critical metrics and reporting that provide data on department performance, risk quantification, and stratification.
- Review complex KYC cases and ensure appropriate escalation to internal teams such as AML.
- You will be an individual contributor as part of a team with a predetermined, focused scope of work.
- Good experience with MS Office applications such as Excel, Word, PowerPoint, and Outlook.

eClerx Financial Markets offers consulting, technological innovation, and process management expertise to uniquely solve operational challenges for financial organizations worldwide. With nearly two decades of industry experience, complemented by smart automation and robotics, our team of experts delivers holistic solutions across the trade lifecycle, change management, data analytics, compliance, cash securities operations, document digitization and generation, and outreach.

Posted 5 days ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Pune

Work from Office

Job Title: Knowledge Anchor | Designation: A/SA | Program: Support | Process: CO_KM | Shift Details: Night Shift (Rotational)

Detailed job/role description:
- Responsible for conducting onboarding, refresher, and continual training programs for the front-line and managerial workforce.
- Required to adhere to internal training and documentation processes.
- Risk mitigation and risk evaluation by conducting periodic audits.
- Process documentation and updates (BPD, checklists, metadata, training docs, SOPs).
- Change management.
- Responsible for training new-hire employees and conducting refresher training for on-floor employees.
- Responsible for driving the performance of employees within the 0-90 days post-certification vintage.
- Monitor calls/chats on communication, soft skills, process, and compliance parameters.
- Provide coaching and feedback to enhance agent performance.
- Participate in calibrations to ensure a consistent scoring and feedback delivery approach.
- Keep the reps updated on new process changes/updates and improvement initiatives.
- Prepare and implement action plans after analyzing audit data, compliance reports, and communication- and process-related data points.

Essential skill set required:
- Excellent communication skills (both verbal and written).
- Eye for detail.
- Excellent facilitation skills.
- Excellent execution skills.
- Analytical thinking and problem solving.
- Knowledge of MS Excel, MS Word, MS PowerPoint, and MS Visio.
- Willing to take charge and initiative as per business requirements.
- Awareness of various learning methodologies to handle different types of learners.
- Knowledge of methodologies used for designing content.
- Awareness of methodologies to check knowledge retention and evaluate overall training performance.
- Should have undergone and cleared one TTT (Train-the-Trainer) module.

Work experience required: The candidate should possess a minimum of 2 years of experience in the same field.

Formal qualifications: Graduate or above; basic knowledge of computers; working knowledge of Excel, Word, and PowerPoint.

Posted 5 days ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai

Work from Office

Job Title: PM | Location: Mumbai
Skills: End-to-end KYC, Periodic Review, AML, and Due Diligence
Shift timings: APAC, EMEA, NAM

Roles & Responsibilities
The KYC Lifecycle Team performs detailed KYC checks on new and existing clients to fulfil KYC & AML regulatory requirements in multiple jurisdictions, allowing our clients to engage in a risk-compliant manner. This role requires an understanding of the KYC lifecycle and management of the complete KYC process. You will liaise with our clients and front and middle office staff to obtain KYC documentation of existing customers as part of reviews, interface with internal stakeholders on any KYC-lifecycle-related queries, and conduct research on clients to obtain KYC data from public sources. You will also screen entity names against sanctions, adverse news, and politically exposed persons lists, provide dispositions, and escalate due diligence findings to Financial Crimes Compliance.

Key responsibilities include:
- Maintain working knowledge of various internal processes, including KOPs, local regulations, and guidelines.
- Publish MIS on the status of the KYC Periodic Review to the senior management of the Bank (both regional and local).
- Ensure that there are no critical audit points as a result of regulatory or internal audits.
- Roll out any new KYC policies/regulations and ensure proper understanding of them within the team and among stakeholders.
- Deliver high standards of client service while ensuring that all internal (e.g. Risk) and external (e.g. Compliance) standards and requirements are met in full.
- Ensure appropriate escalation policies exist and are followed.
- Work collectively with management to develop and maintain a motivated and professionally trained staff, ensuring appropriate capacity planning, adherence to and improvement in performance and quality standards, and appropriate career development.
- Provide leadership support, guidance, and coaching to the team.
- Keep key stakeholders informed of progress and challenges, escalating issues where appropriate.
- Work in a high-pressure and time-sensitive environment.
- Perform quality checks to ensure that defined guidelines are adhered to for excellent QA scores.
- Act as the process owner and ensure end-to-end management of all activities associated with the process.
- Ensure adherence to standards and procedures, and identify risk mitigants wherever there is a control issue.

Qualification and skills:
- Bachelor's degree (B.Com, BBA, BBM, BCA) / Master's degree (M.Com, MBA, PGDM).
- 6 to 9 years' experience in AML Compliance & KYC within the financial services industry, with experience in data quality and controls.
- Work closely with other internal teams to ensure top-of-the-line service to clients.
- Prioritize tasks and ensure adherence to timelines for completion of activities.
- Initiate and lead change management initiatives within the team.
- Ensure structured upward and downward communication.
- Responsible for ensuring that cases are managed effectively and consistently in line with the agreed process, that all aspects of delivery are running effectively, and, if necessary, for escalating issues.
- Liaise with multiple internal stakeholders to ensure the smooth delivery of KYC & AML services to clients.
- Support the production of critical metrics and reporting that provide data on department performance, risk quantification, and stratification.
- Review complex KYC cases and ensure appropriate escalation to internal teams such as AML.
- You will be an individual contributor as part of a team with a predetermined, focused scope of work.
- Good experience with MS Office applications such as Excel, Word, PowerPoint, and Outlook.

eClerx Financial Markets offers consulting, technological innovation, and process management expertise to uniquely solve operational challenges for financial organizations worldwide. With nearly two decades of industry experience, complemented by smart automation and robotics, our team of experts delivers holistic solutions across the trade lifecycle, change management, data analytics, compliance, cash securities operations, document digitization and generation, and outreach.

Posted 5 days ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. The leader must also demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead in raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Role and responsibilities:
- Represent eClerx in client pitches, external forums, and COE (Center of Excellence) activities to promote cloud engineering expertise.
- Lead research, assessments, and the development of best practices to keep our cloud engineering solutions at the forefront of technology.
- Contribute to the growth of the cloud engineering practice through thought leadership, including the creation of white papers and articles.
- Lead and collaborate on multi-discipline assessments at client sites to identify new cloud-based opportunities.
- Provide technical leadership in the design and development of robust, scalable cloud architectures.
- Drive key cloud engineering projects, ensuring high performance, scalability, and adherence to best practices.
- Design and implement data architectures that address performance, scalability, and data latency requirements.
- Lead the development of cloud-based solutions, ensuring they are scalable, robust, and aligned with business needs.
- Anticipate and mitigate data bottlenecks, proposing strategies to enhance data processing efficiency.
- Provide mentorship and technical guidance to junior team members.

Technical and functional skills:
- Bachelor's degree with 10+ years of experience in data management and cloud engineering.
- Proven experience in at least 2-3 large-scale cloud implementations within industries such as Retail, Manufacturing, or Technology.
- Expertise in Azure Cloud, Azure Data Lake, Databricks, Teradata, and ETL technologies.
- Strong problem-solving skills with a focus on performance optimization and data quality.
- Ability to collaborate effectively with analysts, subject matter experts, and external partners.

Posted 5 days ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Mumbai

Work from Office

Job Title: PM | Location: Mumbai / Pune
Skills: End-to-end KYC, Periodic Review, AML, and Due Diligence
Shift timings: APAC, EMEA, NAM

Roles & Responsibilities
The KYC Lifecycle Team performs detailed KYC checks on new and existing clients to fulfil KYC & AML regulatory requirements in multiple jurisdictions, allowing our clients to engage in a risk-compliant manner. This role requires an understanding of the KYC lifecycle and management of the complete KYC process. You will liaise with our clients and front and middle office staff to obtain KYC documentation of existing customers as part of reviews, interface with internal stakeholders on any KYC-lifecycle-related queries, and conduct research on clients to obtain KYC data from public sources. You will also screen entity names against sanctions, adverse news, and politically exposed persons lists, provide dispositions, and escalate due diligence findings to Financial Crimes Compliance.

Key responsibilities include:
- Maintain working knowledge of various internal processes, including KOPs, local regulations, and guidelines.
- Publish MIS on the status of the KYC Periodic Review to the senior management of the Bank (both regional and local).
- Ensure that there are no critical audit points as a result of regulatory or internal audits.
- Roll out any new KYC policies/regulations and ensure proper understanding of them within the team and among stakeholders.
- Deliver high standards of client service while ensuring that all internal (e.g. Risk) and external (e.g. Compliance) standards and requirements are met in full.
- Ensure appropriate escalation policies exist and are followed.
- Work collectively with management to develop and maintain a motivated and professionally trained staff, ensuring appropriate capacity planning, adherence to and improvement in performance and quality standards, and appropriate career development.
- Provide leadership support, guidance, and coaching to the team.
- Keep key stakeholders informed of progress and challenges, escalating issues where appropriate.
- Work in a high-pressure and time-sensitive environment.
- Perform quality checks to ensure that defined guidelines are adhered to for excellent QA scores.
- Act as the process owner and ensure end-to-end management of all activities associated with the process.
- Ensure adherence to standards and procedures, and identify risk mitigants wherever there is a control issue.

Qualification and skills:
- Bachelor's degree (B.Com, BBA, BBM, BCA) / Master's degree (M.Com, MBA, PGDM).
- 6 to 9 years' experience in AML Compliance & KYC within the financial services industry, with experience in data quality and controls.
- Work closely with other internal teams to ensure top-of-the-line service to clients.
- Prioritize tasks and ensure adherence to timelines for completion of activities.
- Initiate and lead change management initiatives within the team.
- Ensure structured upward and downward communication.
- Responsible for ensuring that cases are managed effectively and consistently in line with the agreed process, that all aspects of delivery are running effectively, and, if necessary, for escalating issues.
- Liaise with multiple internal stakeholders to ensure the smooth delivery of KYC & AML services to clients.
- Support the production of critical metrics and reporting that provide data on department performance, risk quantification, and stratification.
- Review complex KYC cases and ensure appropriate escalation to internal teams such as AML.
- You will be an individual contributor as part of a team with a predetermined, focused scope of work.
- Good experience with MS Office applications such as Excel, Word, PowerPoint, and Outlook.

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Process Manager - GCP Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services | Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel - NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows.
- Analyse business problems and propose data-driven solutions that meet stakeholder objectives.
- Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure).
- Extensive experience in GCP with a strong focus on BigQuery; responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferred over AWS & Azure).
- Design and implement robust data models to efficiently store, organize, and access data for diverse use cases.
- Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources.
- Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional).
- Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency (see the illustrative sketch after this listing).

Technical and functional skills:
- Bachelor's degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery.
- Good knowledge of at least one database scripting platform (Oracle preferable).
- Work involves analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-prod activities.
- 5+ years of work experience, having worked as an individual contributor for 5+ years.
- Direct interaction and deep dives with VPs of deployment.
- Should work with cross-functional teams and stakeholders.
- Participate in backlog grooming and prioritizing tasks.
- Experience working with Scrum methodology.
- GCP certification desired.

About eClerx
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
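For illustration only, the kind of data quality checks mentioned in this role might look something like the following minimal sketch in Python/pandas. The table, column names, keys, and thresholds are assumptions made for the example, not part of the job description, and a real BigQuery pipeline would more likely express the same logic as SQL assertions or scheduled queries.

```python
import pandas as pd

# Hypothetical thresholds and keys; real pipelines would load these from config.
MAX_NULL_RATE = 0.02                       # at most 2% missing values per critical column
KEY_COLUMNS = ["customer_id", "order_id"]  # assumed business keys

def run_quality_checks(df: pd.DataFrame, critical_columns: list[str]) -> dict:
    """Return a simple pass/fail report covering null-rate and duplicate checks."""
    report = {}

    # Null-rate check: flag any critical column exceeding the allowed missing rate.
    null_rates = df[critical_columns].isna().mean()
    report["null_rate_failures"] = null_rates[null_rates > MAX_NULL_RATE].to_dict()

    # Duplicate check: the business keys should uniquely identify a row.
    duplicate_count = int(df.duplicated(subset=KEY_COLUMNS).sum())
    report["duplicate_rows"] = duplicate_count

    report["passed"] = not report["null_rate_failures"] and duplicate_count == 0
    return report

if __name__ == "__main__":
    # Invented sample data used only to demonstrate the checks.
    sample = pd.DataFrame(
        {
            "customer_id": [1, 2, 2, 4],
            "order_id": ["A", "B", "B", None],
            "amount": [100.0, 250.0, 250.0, 90.0],
        }
    )
    print(run_quality_checks(sample, critical_columns=["customer_id", "order_id", "amount"]))
```

In a monitoring system, a report like this would typically be logged or alerted on each pipeline run rather than printed.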

Posted 5 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

The Data Manager will:
- Have a strong command of data wrangling and data storytelling.
- Address and solve data quality issues, either through automated updates or by manually updating information in KYC source systems, or by uplifting the data into designated platforms (a simple illustration follows this listing).
- Source data from a variety of sources, such as core banking systems, and combine, synthesise and analyse it to improve data quality.
- Collaborate with stakeholders and SMEs to ensure that in-scope issues are accurately rectified and meet business requirements.
- Have good knowledge of and experience with tools for data analysis and data remediation (e.g. Excel, KYC platforms, reporting tools such as Power BI, KNIME).
Continuous Improvement & Change - Understands, accepts and supports the need for change, adapts own behaviours to changing circumstances and provides input to change projects.
Problem Solving - Comprehensive understanding of a range of problem-solving techniques.
Understanding depth & breadth of data - Some capability to source, join and derive insights from disparate data sources across several domains.
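As a purely illustrative sketch of the data remediation work described above (the source names, columns, and rules here are hypothetical and not taken from the posting), combining a core-banking extract with a KYC extract and flagging records that need uplift might look like this in Python/pandas:

```python
import pandas as pd

# Hypothetical extracts; in practice these would come from core banking and KYC systems.
core_banking = pd.DataFrame(
    {"customer_id": [101, 102, 103], "legal_name": ["Acme Ltd", "Beta LLP", None]}
)
kyc_records = pd.DataFrame(
    {"customer_id": [101, 102, 104], "risk_rating": ["LOW", None, "HIGH"]}
)

# Combine the two sources on the shared customer key, keeping unmatched records.
combined = core_banking.merge(kyc_records, on="customer_id", how="outer", indicator=True)

# Flag records for remediation: missing from either source, or missing key attributes.
combined["needs_remediation"] = (
    (combined["_merge"] != "both")
    | combined["legal_name"].isna()
    | combined["risk_rating"].isna()
)

# The resulting queue would be fixed via automated updates or manual uplift in the KYC systems.
print(combined[combined["needs_remediation"]])
```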

Posted 5 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Work from Office

SAP BODS with Magnitude knowledge

We are looking for a skilled SAP BODS professional with 6 to 7 years of experience. The ideal candidate will have a strong background in SAP BODS and excellent analytical skills.

Roles and Responsibilities
- Design, develop, and implement SAP BODS solutions to meet business requirements.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain technical documentation for SAP BODS projects.
- Troubleshoot and resolve complex technical issues related to SAP BODS.
- Provide training and support to end-users on SAP BODS applications.
- Ensure data quality and integrity by implementing data validation and testing procedures.

Job Requirements
- Strong knowledge of SAP BODS architecture and design principles.
- Experience with SAP BODS development tools and technologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with industry-standard data modeling techniques and tools.

Posted 5 days ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Immediate Openings on SAP MDG - Hyderabad/Bangalore - Contract
Experience: 6+ years
Skills: SAP MDG
Location: Bangalore and Hyderabad
Notice Period: Immediate
Employment Type: Contract
Mode: Work From Office
Experience Range: 7.5 to 12 years

SAP BW4 - Senior Consultant
- 8-10 years of experience in SAP MDG.
- Minimum of 2 implementations of experience in S/4HANA data migration management.
- Full life cycle experience of the data migration process, from strategy and planning through to execution and go-live.
- Experience in segregation of master, reference & transactional data, data quality, and management of risks associated with data migration, extraction, transformation & migration activities.
- Experience working with legacy system support teams.

Posted 5 days ago

Apply

6.0 - 11.0 years

5 - 8 Lacs

Hyderabad

Hybrid

Experience: 6+ years
Skill: Informatica Power Center, MDM & IDQ Admin
Location: Pan India (anywhere in India; should be flexible to work in a hybrid model)
Notice Period: Immediate
Employment Type: Contract
Shift Time: 3:30pm - 1:30am
Mandatory Skill: Informatica MDM & IDQ

Job Description:
- 5+ years of experience performing administrative activities for Informatica/Informatica MDM.
- Experience in installation, configuration, administration and security of BI tools.
- Experience in software upgrades, implementation of hotfixes, implementation of new software offerings, and coordination of testing activities with project teams.
- Experience in implementation of SSL and different authentication methods.
- Design, installation, configuration, and administration of Informatica Platform v10 or higher (currently on v10.2) on Linux; installation experience with Informatica MDM 10.x preferred.
- Leads software upgrades, implementation of hotfixes, implementation of new software offerings, infrastructure and maintenance work, and coordinates testing activities with project teams.
- Researches and provides recommendations for capacity modifications; collaborates with the PM to document tasks and update status.
- Creates and maintains architecture diagrams; Informatica/Data Integration/Data Quality tools and UNIX troubleshooting; automation of daily tasks.
- Informatica/Data Integration/Data Quality tools security.
- Informatica MDM platform administration and integration support.
- Coordinates patching and other infrastructure-related activities with different teams.
- Monitoring of servers and services.

Posted 5 days ago

Apply

6.0 - 9.0 years

1 - 5 Lacs

Hyderabad

Hybrid

Immediate Openings for Business Analyst / Data Analyst (Fin Crime) - Pune - Contract to hire
Experience: 6+ years
Skill: Business Analyst, Financial Crime
Location: PAN INDIA
Notice Period: Immediate
Employment Mode: Contract to hire
Working Mode: Hybrid

Job Description:
- Financial Crime knowledge is highly desirable but less critical to the role.
- Knowledge of managing customer data across a complex environment with multiple customer masters is critical.
- Understanding of the challenges of a distributed data environment, how to navigate it and how to effectively consolidate it.
- Some experience in implementation planning (via solution design iterations) in relation to new complex data models.
- Ability to communicate well with senior stakeholders, including influencing and driving adoption of ideas.
- Hands-on DA - good with data systems, data quality, and data transformation; able to dig in and investigate.
- Understands lineage, data quality, data criticality and governance.
- Data model implementation strategy.

Posted 5 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Job Title: Data Governance Specialist - Collibra - 5+ Years - Bangalore (Hybrid)

Are you an experienced Data Governance Specialist with Collibra expertise? We are looking for a talented individual to join our team in Bangalore. In this role, you'll be responsible for implementing data governance methodologies, managing metadata, and ensuring data quality across GBS systems. If you have a passion for data governance and are looking to work in a global organization, this is the opportunity for you.

Location: Bangalore, India (Hybrid)
Shift: 3:00 PM - 12:00 AM IST
Cab: Available
Notice Period: 30 to 40 days

Your Future Employer: Be part of a dynamic global organization where innovation meets excellence. You'll be joining a team that thrives on collaboration, creativity, and forward-thinking solutions. With a strong focus on quality and continuous improvement, this is your chance to contribute to transformative projects that make a real-world impact. Embrace an exciting career journey where your skills will not only grow but help shape the future of the industry, all while working with talented professionals from across the globe.

Responsibilities:
- Implementing data governance methodology, prioritizing data elements and defining business definitions.
- Loading metadata into Collibra and maintaining the data catalog.
- Defining and implementing data governance policies, standards, and processes.
- Mentoring team members and stakeholders on Collibra best practices.
- Performing data profiling to identify quality issues and conducting root cause analysis (see the sketch after this listing).
- Documenting technical attributes, business descriptions, and data lineage.
- Collaborating with cross-functional teams to ensure data governance practices are followed.

Requirements:
- Bachelor's degree
- 3+ years of experience working with Collibra
- Proficiency in SQL and data analysis tools
- Strong understanding of data governance, data privacy, and security principles
- Experience with metadata management and foundational source systems
- Agile delivery experience (user stories, sprint planning, UAT)
- Excellent communication and collaboration skills
- Ability to stay updated on industry trends and technologies

What's in it for you:
- Competitive compensation package
- Work in an innovative, quality-driven, and collaborative environment
- Opportunities for career growth and skill development
- Be part of a global organization with a strong presence in multiple countries
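To illustrate the data profiling activity mentioned in the responsibilities above, here is a minimal, hypothetical sketch in Python/pandas. The dataset and column names are invented for the example; in a governance setup like the one described, such profiling results would typically be recorded against the relevant assets in the data catalog.

```python
import pandas as pd

def profile_dataframe(df: pd.DataFrame) -> pd.DataFrame:
    """Produce a simple per-column profile: type, completeness, distinct count, sample value."""
    profile = pd.DataFrame(
        {
            "dtype": df.dtypes.astype(str),
            "non_null_count": df.notna().sum(),
            "null_pct": (df.isna().mean() * 100).round(2),
            "distinct_values": df.nunique(dropna=True),
        }
    )
    # A sample value per column helps reviewers spot obvious format issues.
    profile["sample_value"] = [
        df[col].dropna().iloc[0] if df[col].notna().any() else None for col in df.columns
    ]
    return profile

if __name__ == "__main__":
    # Hypothetical customer extract used only for demonstration.
    customers = pd.DataFrame(
        {
            "customer_id": [1, 2, 3, 4],
            "country_code": ["IN", "in", None, "US"],
            "onboarding_date": ["2021-01-05", "2021-02-30", "2021-03-11", None],
        }
    )
    print(profile_dataframe(customers))
```

Findings such as the mixed-case country codes or the invalid date string in this toy example are the kind of quality issues that would then be traced back to source systems during root cause analysis.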

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies