2339 Data Validation Jobs - Page 16

Set up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 6.0 years

4 - 8 Lacs

Meerut

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Hapur

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Faridabad

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Ghaziabad

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Greater Noida

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Noida

Work from Office

The resource should have at least 4 to 5 years of hands-on development experience with Alteryx, including creating and scheduling workflows. The role covers designing, developing, validating, and troubleshooting ETL workflows that pull data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation is also part of the role. Prior experience maintaining documentation such as design documents, mapping logic, and technical specifications is required.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

15 - 19 Lacs

Bengaluru

Work from Office

We are looking for a skilled professional to join our team in an Operations & Strategy role at Hevo Data, located in [location to be specified]. The ideal candidate will have 2-5 years of experience and a strong background in operations and strategy.

Roles and Responsibility: Develop and implement operational strategies to achieve business objectives. Analyze data to identify trends and areas for improvement. Collaborate with cross-functional teams to drive process improvements. Design and implement new processes and procedures to enhance efficiency. Monitor and report on key performance indicators to stakeholders. Identify and mitigate risks to ensure compliance with regulatory requirements.

Job Requirements: Strong understanding of operations and strategy principles. Excellent analytical and problem-solving skills. Ability to work collaboratively with cross-functional teams. Strong communication and interpersonal skills. Experience with data analysis and interpretation. Ability to adapt to changing priorities and deadlines.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Mumbai

Work from Office

You are a strategic thinker passionate about driving solutions in Data Annotation and Validation. You have found the right team. As a Data Analyst within our Asset Management Data Science team, you will be responsible for setting and improving our organizational objectives and ensuring their consistent accomplishment.

Job Responsibilities: Work on data labeling tools and annotate data for machine learning models. Sift through structured and unstructured data, identify the right content, and annotate it with the right label. Develop comprehensive test plans and strategies for data science projects, including data validation, model testing, and performance evaluation. Collaborate with stakeholders, including data scientists, data engineers, and product managers. Conduct thorough data validation and verification processes to ensure data accuracy and consistency. Design and execute test cases for models, ensuring they meet performance and accuracy standards. Validate model outputs and conduct regression testing to ensure consistent results. Utilize tools like Snorkel, Datasaur, and Apptek for model performance monitoring, data labeling, and speech annotation. Develop and maintain automated testing scripts and tools to streamline the QA process. Implement continuous integration and continuous deployment (CI/CD) practices for data science projects. Transcribe verbatim audio recordings (single and multi-speaker, of varying dialects and accents) and identify relevant keywords and sentiment labels. Build a thorough understanding of data annotation and labeling conventions and develop documentation/guidelines for stakeholders and business partners.

Required qualifications, capabilities, and skills: At least 5 years of hands-on experience in data collection, analysis, or research. Proven experience in data quality assurance, data management, or a similar role. Experience in Python programming. Proficiency in data querying and validation using SQL, with experience in Snowflake. Experience in constructing dashboards to effectively visualize and communicate data insights. Experience with data annotation, labeling, entity disambiguation, and data enrichment. Familiarity with industry-standard annotation and labeling methods and tools like Label Studio, Snorkel, Datasaur, and Apptek. Familiarity with machine learning and AI paradigms such as text classification, entity recognition, and information retrieval. Creative and disruptive; loves embracing the challenge of rigorous testing to uncover vulnerabilities and enhance system robustness. Understanding of data governance principles and practices.

Preferred qualifications, capabilities, and skills: Strong financial knowledge is preferred.
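
As a loose illustration of the SQL/Python data-validation skills called out above (not this team's actual tooling), a minimal sketch of annotation-quality checks on a labeled dataset might look like the following; the column names and allowed label set are assumptions.

```python
# Hypothetical sketch of simple annotation-quality checks; "text", "label",
# and the allowed label set are illustrative assumptions, not the team's schema.
import pandas as pd

def validate_annotations(df: pd.DataFrame, allowed_labels: set) -> dict:
    """Return basic data-validation counts for a batch of labeled records."""
    report = {
        "missing_label": int(df["label"].isna().sum()),
        "unknown_label": int((df["label"].notna() & ~df["label"].isin(allowed_labels)).sum()),
        "duplicate_text": int(df.duplicated(subset=["text"]).sum()),
    }
    report["passed"] = all(count == 0 for count in report.values())
    return report

if __name__ == "__main__":
    sample = pd.DataFrame({
        "text": ["acme corp earnings call", "acme corp earnings call", "fed rate decision"],
        "label": ["equities", "equities", None],
    })
    print(validate_annotations(sample, {"equities", "rates", "fx"}))
```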

Posted 2 weeks ago

Apply

1.0 - 5.0 years

3 - 6 Lacs

Raipur

Work from Office

Roles and Responsibility Develop and implement comprehensive test plans to ensure high-quality software products. Collaborate with cross-functional teams to identify and prioritize testing requirements. Design and execute automated tests using various tools and technologies. Analyze and report defects found during testing, working with development teams to resolve issues. Participate in agile development methodologies, providing feedback on product quality and suggesting improvements. Stay up-to-date with industry trends and emerging technologies in QA engineering. Job Requirements Strong understanding of software testing principles, methodologies, and best practices. Experience with automation testing tools and technologies, such as Selenium or Appium. Excellent problem-solving skills, with the ability to analyze complex issues and provide creative solutions. Strong communication and collaboration skills, with experience working with distributed teams. Ability to work in an agile environment, prioritizing tasks and managing multiple projects simultaneously. Strong attention to detail, with a focus on delivering high-quality results.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Kochi, Thrissur, Kozhikode

Work from Office

Data Gathering and Analysis: Collect and organize data from multiple sources, ensuring data quality and integrity. Analyse large datasets to identify trends, patterns, and insights.
Reporting and Visualization: Prepare reports, dashboards, and data visualizations to present findings and recommendations to stakeholders effectively.
Data Quality Assurance: Implement data validation and cleansing processes to ensure the accuracy and reliability of data.
Collaborate with IT Teams: Work closely with IT professionals to understand data infrastructure and data governance policies, and ensure data security and privacy standards are met.
Problem Solving: Use analytical skills and critical thinking to identify and solve complex business problems through data analysis.
Educational Qualification: A bachelor's degree in a relevant field such as computer science, information technology, mathematics, statistics, or a related discipline is typically required.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Requires substantial experience configuring and managing data archiving processes within SAP environments, including hands-on experience configuring archiving objects, executing data deletion, and optimising archiving processes. Experience with ECC and S/4 HANA. Understanding of data validation and compliance requirements. Basic understanding of ABAP code and debugging experience. Excellent interpersonal, analytical, and problem-solving skills.

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Coimbatore

Work from Office

Python Developer Intern - OneData Software Solutions. Internship: 3 months (first 2 months unpaid, 3rd month stipend based on performance). We are looking for a motivated and detail-oriented Python Developer Intern to join our development team. This internship is ideal for freshers who want to gain real-world experience in Django and REST API development. You'll work on backend systems that involve role-based access control, model relationships, and test-driven development. Responsibilities: Build and maintain RESTful APIs using Django REST Framework. Implement role-based permissions and access control logic. Write unit tests to validate functionality and ensure code quality. Collaborate with team members and follow structured version control using Git and GitHub. Document the project clearly using README and requirements.txt. Requirements: Strong foundation in Python and Django. Experience with Django REST Framework (DRF). Familiarity with database relationships and model design. Understanding of RBAC (Role-Based Access Control). Ability to write and maintain automated test cases. Proficiency with Git, GitHub, GitLab, and clean code practices. Good understanding of API structures, status codes, and data validation. Bachelor's degree in B.E/B.Tech/MCA or a related field. Must be available for a full-time onsite internship in Coimbatore. Eager to learn and grow in a fast-paced environment. Our Culture: We foster a collaborative, inclusive, and innovative culture where employees can thrive. We believe in empowering our team members to take ownership of their work and contribute to the company's success. Great Co-Workers: Work with some of the best talent in the industry and build strong networks with them. Medical Insurance: Enjoy benefits that support your overall well-being. Career Growth: Unlock opportunities for advancement and leadership, and gain knowledge to kickstart your career. At OneData, we're building a team of talented individuals who share our vision of creating innovative solutions that transform industries. We offer a dynamic work environment, opportunities for growth, and the chance to collaborate on
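
Purely as a hedged illustration of the RBAC-style permissions this internship mentions (not OneData's codebase), a minimal Django REST Framework permission class could look like this; the "role" attribute on the user model is an assumed custom field.

```python
# Minimal RBAC sketch for Django REST Framework; the "role" attribute on the
# user model is a hypothetical custom field used only for illustration.
from rest_framework import permissions


class IsManagerOrReadOnly(permissions.BasePermission):
    """Allow write methods (POST/PUT/PATCH/DELETE) only to users with a 'manager' role."""

    def has_permission(self, request, view):
        if request.method in permissions.SAFE_METHODS:
            return True  # GET/HEAD/OPTIONS remain open to any caller of the view
        return getattr(request.user, "role", None) == "manager"
```

A view would then opt in with permission_classes = [IsAuthenticated, IsManagerOrReadOnly], and a unit test can assert that a non-manager receives HTTP 403 on POST.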

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Chennai

Work from Office

5+ years of IT experience in Salesforce Service Cloud and data migration. Key skills: Salesforce Sales Cloud / Service Cloud, data validation, automation (Flow / Approval Process), reports and dashboards, user management, development (Apex / Lightning Components), and data migration.

Posted 2 weeks ago

Apply

10.0 - 11.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Role Overview: As a Specialist Data/AI Engineer QA at AT&T, you will be responsible for ensuring the quality, reliability, and performance of data pipelines, AI models, and analytics solutions. You will design and execute comprehensive testing strategies for data and AI systems, including validation of data integrity, model accuracy, and system scalability. Your role is critical to delivering robust, production-ready AI and data solutions that meet AT&T's high standards.

Key Responsibilities: Develop and implement QA frameworks, test plans, and automated testing scripts for data pipelines and AI/ML models. Validate data quality, consistency, and accuracy across ingestion, transformation, and storage processes. Test AI/ML model performance, including accuracy, bias, robustness, and drift detection. Utilize cloud platforms (AWS, Azure, GCP) and modern data technologies (e.g., Snowflake, Databricks, Kafka) to manage large-scale data workflows. Collaborate with data engineers, data scientists, and product teams to identify test requirements and ensure comprehensive coverage. Perform regression, integration, system, and performance testing on data and AI workflows. Automate testing processes using appropriate tools and frameworks to enable continuous testing in CI/CD pipelines. Monitor production systems to detect issues proactively and support root cause analysis for defects or anomalies. Document test results, defects, and quality metrics, communicating findings to technical and non-technical stakeholders. Advocate for quality best practices and contribute to improving testing methodologies across the CDO. Stay current with industry trends and emerging tools in data engineering, AI, and QA automation.

Qualifications - Required: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field. Experience in quality assurance or testing roles focused on data engineering, AI, or machine learning systems. Proficiency in programming and scripting languages such as Python and SQL, and experience with test automation frameworks. Strong understanding of data pipelines, ETL/ELT processes, and data validation techniques. Familiarity with machine learning concepts and model evaluation metrics. Experience with cloud platforms (AWS, Azure, GCP) and data platforms (Snowflake, Databricks) is preferred. Knowledge of CI/CD tools and integration of automated testing within deployment pipelines. Excellent analytical, problem-solving, and communication skills.

Preferred: Experience with AI/ML model testing frameworks and bias/fairness testing. Familiarity with containerization (Docker) and orchestration (Kubernetes) environments. Understanding of data governance, compliance, and responsible AI principles. Experience with real-time data streaming and testing associated workflows.

#DataEngineering Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City- Adm: Argus Building, Sattva, Knowledge City Job ID R-75360 Date posted 07/18/2025
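
As a rough sketch of the automated data-quality testing this role describes (not AT&T's actual framework), pytest checks over a transformed table might look like the following; the table and column names are assumptions, and in practice the fixture would query Snowflake or Databricks rather than build a frame inline.

```python
# Hypothetical pytest data-quality checks suitable for a CI/CD pipeline;
# the "curated_orders" table and its columns are illustrative assumptions.
import pandas as pd
import pytest


@pytest.fixture
def curated_orders() -> pd.DataFrame:
    # Stand-in for a query against the curated layer after the ETL run.
    return pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [10.0, 25.5, 40.0],
        "event_ts": pd.to_datetime(["2025-07-01", "2025-07-02", "2025-07-03"]),
    })


def test_primary_key_is_unique(curated_orders):
    assert curated_orders["order_id"].is_unique


def test_amounts_are_non_negative(curated_orders):
    assert (curated_orders["amount"] >= 0).all()


def test_timestamps_are_populated(curated_orders):
    assert curated_orders["event_ts"].notna().all()
```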

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 11 Lacs

Pune

Work from Office

Key Responsibilities: Design, develop, and maintain advanced dashboards and reports in Tableau that provide insights into user engagement, product performance, and business metrics. Utilize Databricks to process, clean, and analyze large volumes of data efficiently. Develop complex SQL queries and optimize data pipelines for analytics workflows. Collaborate with cross-functional teams to define key metrics, data requirements, and analytical approaches. Conduct deep-dive analyses and present findings with clear data storytelling to influence product and business strategies. Automate routine reporting and data validation tasks to ensure accuracy and timeliness. Mentor junior analysts and promote best practices in data visualization and analytics. Stay current with the latest trends and tools in data analytics, visualization, and big data platforms.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 5 Lacs

Chennai

Work from Office

Who are Inchcape? At Inchcape, our vision is to have a connected world, in which our customers trade successfully and make better decisions in every port, everywhere. We use technology and our global network to help our partners connect to a smoother, smarter ocean. Inchcape combines its worldwide infrastructure with local expertise through our global network of over 250 proprietary offices across 70 countries and a team of more than 3,000 committed professionals. Our diverse global customer base includes owners and charterers in the oil, cruise, container, and bulk commodity sectors as well as naval, government, and intergovernmental organisations. We have an ambitious growth model, and a career here is certainly going to be a rewarding one that will allow you to bring your skills and experience. We embrace change and are open to new thinking and pushing for positive change in our industry.

What you'll do - Primary Responsibilities: Analyse vendor pricing trends and prepare actionable reports to support procurement cost optimisation. Evaluate vendor rebate structures and create reports to support increased rebate realisation. Monitor port call volumes and revenue data to identify opportunities for commercial improvement and efficiency. Prepare monthly, quarterly, and annual reports for key government service stakeholders, ensuring accuracy and timely submission. Track and report vendor DA (Disbursement Account) submission timelines and generate insights to improve submission compliance. Work with internal stakeholders to gather new reporting requirements and translate them into clear documentation. Prepare and manage Business Requirement Documents (BRDs) with technical and functional details for stakeholder sign-off. Develop and maintain Alteryx workflows to automate regular reporting processes. Conduct quality checks on data inputs and outputs to ensure consistency, accuracy, and alignment with business requirements. Perform User Acceptance Testing (UAT) for newly developed reports and dashboards with internal stakeholders. Build and update Tableau dashboards based on validated data and user feedback. Maintain change trackers and documentation for ongoing and completed reporting projects. Create and maintain Confluence pages with all project-related documents. Coordinate with Data Architects and the IT team for design inputs and data structure improvements. Initiate and manage Change Requests (CRs) post sign-off and oversee report/dashboard migration to production. Regularly update Jira or equivalent tracking tools with tasks and progress updates. Support the management team with prioritised reporting requirements across business segments. Identify and recommend automation opportunities for recurring reports and dashboards.

Additional Responsibilities: Conduct ad hoc analysis and reporting for various functions within Government Services using SQL or Alteryx. Monitor and troubleshoot Alteryx workflow performance; manage scheduling and reruns as needed. Manage data archiving activities and ensure regular updates to cloud storage platforms (e.g., AWS S3). Assist with monthly data updates, such as FX rates, to ensure accuracy in financial and operational reports. Deliver weekly and monthly business-as-usual (BAU) reports for internal stakeholders. Prepare presentations and trackers in MS PowerPoint and MS Excel based on recurring operational KPIs. Support testing and coordination activities related to reporting system upgrades or integrations. Assist the wider reporting and analytics team with cross-functional projects as needed. Manage user access to reporting platforms and oversee job schedules on tools such as Tableau Server.

Skills and Qualifications - Educational Background: Bachelor's or Master's degree in Computer Science, Engineering, Economics, Statistics, or related disciplines. Experience: 2-5 years of experience in data analytics, reporting, or similar roles. Strong written and verbal communication skills with the ability to convey complex data clearly. Proactive approach to identifying inconsistencies, errors, or anomalies in data sets. Problem-solving mindset with a focus on delivering practical insights to improve performance. Ability to manage tasks independently while collaborating effectively with cross-functional teams.

Technical Proficiency: Experience with Tableau and/or other data visualisation platforms. Strong Excel skills, including pivot tables, advanced formulae, and data validation techniques. Proficient in SQL, Alteryx, and data manipulation languages. Basic understanding of scripting languages such as Python or R (desirable). Familiarity with tools such as Jira, Confluence, AWS S3, or similar platforms is a plus.

Key Contributions: Translate data analysis into clear insights that guide commercial and operational decisions. Automate standard reports and reduce manual effort to enhance reporting efficiency. Collaborate with business and functional leads to develop data-driven strategies for cost and performance improvement. Maintain robust documentation to ensure knowledge continuity and compliance. #LI-MB1

Posted 2 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Hyderabad

Work from Office

The Vistex Human Resources Specialist will assist the Human Resources (HR) department with various transactional tasks. These tasks include entering employee information into appropriate systems, verifying employment and unemployment details, and managing the day-to-day operations of the Global HRIS/ERP system. The position operates with close supervision and follows specific directions from senior staff. Responsibilities: Handle employee data changes, including new hires, terminations, transfers, promotions, and compensation adjustments. Experience in employee life cycle management. Stay updated on local employment laws and regulations and ensure compliance in employment contracts, working hours, and leave policies. Collaborate with relevant stakeholders, such as HR business partners and payroll, to ensure timely and accurate resolution of cases and employee grievances. Oversee the accurate and timely processing of HR transactions, ensuring compliance with policies, procedures, and legal requirements. Assist managers with inquiries regarding Employee Status Change Notices. Keep track of various HR records as advised from time to time. Identify process optimization and automation opportunities within HR administration, leveraging HRIS and other available tools. Ensure the accuracy of HRIS systems. Propose and implement improvements to streamline workflows, reduce manual tasks, and enhance the employee and manager experience. Complete special projects as requested. Handle data analytics and data validation. Other miscellaneous HR activities, such as facilitating inductions and supporting employee engagement events.

Posted 2 weeks ago

Apply

5.0 - 6.0 years

7 - 12 Lacs

Chennai

Work from Office

Role Description: SharePoint developer with good experience in Pages and Workflow. Excellent communication skills. Offshore resource who specializes in SharePoint development (i.e., Pages, workflows, etc.). 5-6 years of experience.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Project description: We are seeking an experienced UAT-BVT Tester to support a high-impact project within the financial services domain. The project involves complex system integrations and focuses on customer onboarding, AML/KYC processes, and portal interfaces. The resource will play a key role in ensuring business readiness and quality through end-to-end User Acceptance Testing (UAT) and Business Verification Testing (BVT).

Responsibilities: Prepare detailed business scenarios and test cases aligned with UAT requirements. Execute UAT and coordinate with business stakeholders for timely execution and defect resolution. Prepare UAT documentation, including test plans, daily status updates, and test completion reports. Participate in integration testing and validate workflows across multiple systems. Liaise with product owners, SMEs, and development teams to clarify requirements and ensure coverage. Report defects and track them to closure using standard tools and processes. Support post-UAT verification activities, including production sanity/BVT where applicable.

Skills - Must have: 4+ years of experience in a similar role. Proven hands-on experience in writing business scenarios and test cases for UAT, executing and coordinating UAT cycles with business users, and creating UAT test plans, test summary reports, and closure documentation. Integration testing experience across multi-system environments. Strong communication and stakeholder coordination skills. BVT (Business Verification Testing) experience or familiarity with BVT concepts.

Nice to have: Domain knowledge in financial services, especially within Banking, Insurance, or Superannuation. Experience with onboarding or AML/KYC processes for corporate customers. Familiarity with customer master data and customer portal projects. Exposure to systems like FenX or similar AML/KYC data capture tools. Exposure to test automation techniques/tools for data creation and validation.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

9 - 14 Lacs

Chennai

Work from Office

Position Overview: We are looking for a detail-oriented and experienced Senior Test Engineer with 5 to 7 years of experience in ETL testing. The ideal candidate will have expertise in SQL, functional testing, and a solid understanding of Data Warehouse concepts. If you are passionate about ensuring data quality and integrity, and thrive in a collaborative environment, we would love to hear from you!

Key Responsibilities: Design and execute ETL test plans, test cases, and test scripts to validate data transformations and data quality. Design, develop, and execute functional test cases. Conduct functional testing to ensure that ETL processes and data pipelines meet business requirements. Collaborate with data engineers, developers, and business analysts to understand data requirements and specifications. Utilize SQL to perform data validation and ensure accuracy and completeness of data in the Data Warehouse. Identify, document, and track defects using JIRA, ensuring timely resolution. Create and maintain comprehensive documentation of testing processes, methodologies, and results. Participate in code reviews and provide feedback to ensure best practices in ETL development. Stay updated on industry trends and advancements in ETL testing and Data Warehouse technologies.

Technical Skills and Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 5 to 7 years of experience in ETL testing and data quality assurance. Strong expertise in SQL for data validation and manipulation. Knowledge of Data Warehouse concepts, architectures, and best practices. Experience with functional testing methodologies and tools. Familiarity with JIRA for issue tracking and test case management. Excellent analytical and problem-solving skills. Solid understanding of the financial domain, with experience in testing financial applications. Strong communication skills with the ability to work collaboratively in a team environment. Experience with Agile or DevOps methodologies. Experience with Oracle DB or SQL Server tools. Experience with Azure Synapse or other cloud-based tools for pipeline and ETL testing. Certification in QA or software testing.
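
To make the SQL-based data validation above concrete, here is a hedged sketch (not this employer's actual test suite) of a source-to-target reconciliation check; the table names, the amount column, and the SQLite connection are placeholders for whatever warehouse is in use.

```python
# Hypothetical ETL reconciliation check: compare row counts and a column total
# between a staging table and the warehouse table it loads into.
import sqlite3  # placeholder; in practice this would be an Oracle/SQL Server/Synapse connection

RECON_SQL = """
SELECT
    (SELECT COUNT(*)    FROM stg_orders) AS src_rows,
    (SELECT COUNT(*)    FROM dw_orders)  AS tgt_rows,
    (SELECT SUM(amount) FROM stg_orders) AS src_amount,
    (SELECT SUM(amount) FROM dw_orders)  AS tgt_amount
"""


def reconcile(conn) -> bool:
    """Return True when the target matches the source on row count and amount total."""
    src_rows, tgt_rows, src_amount, tgt_amount = conn.execute(RECON_SQL).fetchone()
    counts_match = src_rows == tgt_rows
    amounts_match = round(src_amount or 0.0, 2) == round(tgt_amount or 0.0, 2)
    return counts_match and amounts_match
```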

Posted 2 weeks ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Role Description: As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities: Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes. Design and implement ETL solutions that are scalable, reliable, and maintainable. Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements. Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner. Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs. Design and implement data integration processes between various systems and data sources. Optimize ETL processes to improve performance, scalability, and reliability. Create and maintain technical documentation, including design documents, coding standards, and best practices.

Technical Skills: Proficiency in programming languages such as Python for writing ETL scripts. Knowledge of data transformation techniques such as filtering, aggregation, and joining. Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica. Understanding of data profiling, data quality, and data validation techniques. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks, and promote a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications: 4-6 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
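
As a minimal sketch of the transformation techniques listed above (filtering, aggregation, joining) combined with a validation gate, here is an illustrative pandas example; the DataFrame and column names are assumptions rather than Incedo's actual pipelines.

```python
# Hypothetical ETL transform step in Python; DataFrame and column names are
# assumptions chosen only to illustrate filter/aggregate/join plus validation.
import pandas as pd


def transform(orders: pd.DataFrame, customers: pd.DataFrame) -> pd.DataFrame:
    valid = orders[orders["amount"] > 0]                                   # filter out bad rows
    totals = valid.groupby("customer_id", as_index=False)["amount"].sum()  # aggregate per customer
    enriched = totals.merge(customers, on="customer_id", how="left")       # join reference data
    if enriched["customer_name"].isna().any():                             # simple referential check
        raise ValueError("orders reference customer_id values missing from the customer table")
    return enriched
```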

Posted 2 weeks ago

Apply

4.0 - 5.0 years

2 - 6 Lacs

Mumbai

Work from Office

1. Factory & Warehouse Hygiene and Pest Control: Ensure strict implementation of hygiene and sanitation protocols across the factory and warehouse. Oversee pest control measures and maintain compliance with food safety standards. Monitor and verify routine cleaning and deep cleaning (ALC) schedules in production and packaging areas.
2. Production & Packaging Quality Assurance: Ensure adherence to quality parameters throughout the production and packaging processes. Conduct regular verification of quality control checks, including for online processes, gifting, and customized orders. Monitor CCP and OPRP to ensure process safety and consistency.
3. Material Verification & Warehouse Monitoring: Monitor warehouse operations to ensure material storage, handling, and FIFO compliance. Perform pre-dispatch inspections and ensure timely clearance of on-hold materials. Oversee raw and packaging material checks and verification.
4. Laboratory Operations & Product Testing: Supervise the preparation and use of chemicals for testing purposes. Conduct and validate physical, chemical, and shelf-life testing as per defined protocols. Coordinate with external laboratories for product and environmental testing (e.g., NI samples, water, air, swab, FSSR).
5. Training, Audits & Documentation: Conduct internal training on personal hygiene, GMP, GHP, food safety, and documentation practices. Prepare and maintain comprehensive quality documentation, including in-process, laboratory, and shift records. Lead internal and external audits in alignment with FSSC 22000 and BRCGS standards. Ensure timely closure of audit non-conformities and implementation of CAPA.
6. Regulatory & Interdepartmental Coordination: Liaise with cross-functional teams to resolve quality-related issues effectively and promptly. Ensure compliance with applicable statutory and regulatory food safety requirements. Address customer quality complaints and drive root cause analysis and CAPA implementation.
7. Personnel Hygiene & Compliance Monitoring: Ensure strict monitoring of personal hygiene practices across all staff and production areas. Validate and monitor hygiene compliance through regular audits.
8. Calibration & Equipment Validation: Ensure timely calibration and maintenance of all quality and lab equipment. Maintain calibration records and ensure accuracy of instruments used in quality and production assessments.
9. Labelling & Artwork Compliance: Ensure all product labels comply with FSSAI regulations (nutritional information, ingredients, allergen declarations, etc.) and other applicable statutory guidelines. Review packaging designs and artwork for accuracy, legal compliance, and brand consistency. Coordinate with internal teams (Regulatory, Marketing, R&D) and external design agencies to finalize artworks. Ensure timely approval and closure of artworks to avoid production delays. Conduct online verification of labelling elements such as batch coding, MRP, expiry date, ingredient declaration, and allergen statements during production.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

Piller Soft Technology is looking for a Lead Data Engineer to join our dynamic team and embark on a rewarding career journey.
Designing and developing data pipelines: Lead data engineers are responsible for designing and developing data pipelines that move data from various sources to storage and processing systems.
Building and maintaining data infrastructure: Lead data engineers are responsible for building and maintaining data infrastructure, such as data warehouses, data lakes, and data marts.
Ensuring data quality and integrity: Lead data engineers are responsible for ensuring data quality and integrity by setting up data validation processes and implementing data quality checks.
Managing data storage and retrieval: Lead data engineers are responsible for managing data storage and retrieval by designing and implementing data storage systems, such as NoSQL databases or Hadoop clusters.
Developing and maintaining data models: Lead data engineers are responsible for developing and maintaining data models, such as data dictionaries and entity-relationship diagrams, to ensure consistency in data architecture.
Managing data security and privacy: Lead data engineers are responsible for managing data security and privacy by implementing security measures, such as access controls and encryption, to protect sensitive data.
Leading and managing a team: Lead data engineers may be responsible for leading and managing a team of data engineers, providing guidance and support for their work.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office

The missions of a senior functional expert are varied and hinge upon the strengthening of regulatory and accounting requirements related to the supervision and monitoring of risk models. In this context, you will be responsible for conducting internal reviews of models (validation of the modeling, backtesting, etc.) that have been developed by the Group's modeling entities. Your main missions will be: End-to-end responsibility for model validation missions, based on the planning and framework. Interact with the modeling entities. Analyze and test methods by using both technical knowledge and critical thinking. Conduct quantitative reviews (statistics). Be vigilant in the analysis of the regulatory compliance, robustness, and performance of these models. Contribute to the composition of a validation report in order to communicate the conclusions of the review mission. Contribute to and present the results of the review at the Models Committee. Ensure adequate documentation and archiving of the analyses carried out. Mentor junior team members. The functional expert works on many different topics such as: retail or wholesale credit risk (PD models, CCF models, LGD models, stress tests), market risk models (VaR/SVaR/FRTB, EEPE, CVA, SIMM, IRC/CRM, etc.), models developed under the IFRS 9 framework, and models developed to comply with US regulatory requirements.

Profile required: The ideal candidate should be well versed in credit risk model development, validation, and maintenance of models (PD, LGD, and EAD) for the wholesale and retail credit portfolio of the bank as per regulatory guidelines. Exposure to the banking book, understanding of trading book products, and knowledge of BASEL/IFRS guidelines are highly desirable. The candidate should have excellent business communication skills. Educational Requirements: Post-graduation degree in a quantitative discipline (Statistics, Economics, Mathematics, Engineering) from Tier I/II colleges. Additional certification in machine learning techniques or estimation of credit risk parameters will be preferred.

Role Responsibility: The ongoing monitoring of the model is a task that must be done in all phases of the model lifecycle (development, implementation, use). In order to track and measure the efficiency and adequacy of models, the model monitor conducts continuous analysis and controls as an early warning, both initially at implementation (for new models) and regularly as part of the model's ongoing monitoring. For the purpose of these tests, the model monitor is responsible to: Backtest and re-calibrate each model designed and developed by the business; hence a thorough understanding of model development under Basel/IFRS norms is critical. Choose adequate model outcome analysis techniques such as: model estimates vs. realized values (e.g., back-testing for some models); stability of model outcomes; benchmarking (model output vs. output generated by comparable models or applications); sensitivity analysis to test robustness. Analyze the model output and the related components (if applicable), the validity of model assumptions and limitations, the results of benchmarking and sensitivity analysis, and the accuracy of model characteristics (ROC/AUC, KS statistic, accuracy ratio, Gini coefficient, etc.). Monitor over time in order to follow up trends and detect deviations. Establish thresholds and an action plan for major deviations. Report this analysis to the different model stakeholders. Implement a governance process to monitor the corrective actions. Furthermore, as part of the model's ongoing monitoring phase, the model monitor should abide by the group standards on ongoing monitoring, which establish guidelines on performance assessment processes, including the type, scope, and range of tests and the appropriateness of responses to any problems that may appear.

Technical Skills: Regulatory risk model (IRB, IFRS 9) validation, monitoring, and development (good to have) using SAS or R. Initiation to machine learning model validation. Functional Skills: Knowledge of global regulatory topics (BASEL II/III, IFRS 9). Understanding of risk management and risk quantification processes. Understanding of forms of risk, viz. credit, market, operational, model, etc. Behavioral Aspects: Result orientation, client focus, contribution to strategy, cooperation, team player.
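
The discriminatory-power statistics named above (ROC/AUC, KS statistic, Gini/accuracy ratio) can be computed in a few lines; the sketch below uses synthetic scores purely for illustration and is not tied to any specific internal model.

```python
# Hedged illustration of standard model-monitoring statistics on a synthetic
# PD-style score: AUC, Kolmogorov-Smirnov (KS) statistic, and Gini coefficient.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)            # 1 = default, 0 = non-default (synthetic)
scores = np.where(y_true == 1,
                  rng.normal(0.6, 0.2, 1000),     # defaulters score higher on average
                  rng.normal(0.4, 0.2, 1000))

auc = roc_auc_score(y_true, scores)
fpr, tpr, _ = roc_curve(y_true, scores)
ks = float(np.max(tpr - fpr))                     # max separation of the cumulative distributions
gini = 2 * auc - 1                                # Gini / accuracy ratio for a binary outcome

print(f"AUC={auc:.3f}  KS={ks:.3f}  Gini={gini:.3f}")
```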

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
