
3436 Data Quality Jobs - Page 19

JobPe aggregates results for easy application access; you apply directly on the job portal.

3.0 - 4.0 years

3 - 8 Lacs

Tumkur

Work from Office

About Quest Alliance: At Quest Alliance, we transform learning ecosystems through education technology, capacity building, and collaboration to build 21st-century skills for learners and facilitators. We are a not-for-profit trust that focuses on research-led innovation and advocacy in the field of teaching and learning. We engage with educators, civil society, government institutions, and corporate organizations to demonstrate and enable scalable and replicable solutions in school education and vocational training. At Quest, you will get the opportunity to apply your skills and contribute to addressing issues around quality education and skills training. The organization gives you the space to learn and grow in a fun and engaging environment. We have an eclectic group of people working at Quest, drawn from diverse disciplines including Education, Technology, Design, Youth Development, and Business.

About the role: We are hiring a Program Associate to support the implementation of our Schools Program in Tumkur. This role will focus on effective program delivery, mentoring master trainers and teachers, facilitating student-led innovation events like Ideathons and Hackathons, and managing district-level stakeholder engagement. The ideal candidate will bring experience in teaching, strong training and communication skills, and a deep understanding of working with government systems and education programs.
Key Responsibilities:
• Mentor and support Master Trainers (MTs) and teachers in implementing Ideathons and Hackathons
• Plan and facilitate school- and district-level innovation events and share-outs
• Support MTs with content queries and technical escalations via chatbot and WhatsApp
• Conduct regular school visits and phone check-ins to observe and improve classroom engagement
• Organize MT review calls and district-level department updates
• Coordinate and support cascade training delivery with MTs, Program Coordinators, and DIETs
• Attend and assist in in-person training across blocks/districts
• Maintain training data and documentation
• Liaise with District and Block Education Officials, Principals, and School Heads
• Conduct orientations and review meetings, and ensure buy-in for program delivery
• Track and report program progress in line with the M&E framework
• Support data quality checks, documentation of best practices, and field impact stories
• Coordinate assessments (baseline/endline) and maintain chatbot engagement records

Requirements:
• Degree in Social Work (MSW/BSW) or equivalent social development background
• 3-4 years of experience in teaching, mentoring, or education program delivery
• Proven experience working with government stakeholders
• Strong skills in training delivery, communication, and report writing
• Willingness to travel frequently within districts
• Fluency in Kannada (spoken and written)

Brownie Points: Prior exposure to or interest in STEM mindset, computational thinking, or critical thinking approaches in education

Benefits
Salary: The pay band for the position starts at Rs. 32,000/- per month (cost to company). The salary offered will be commensurate with the experience and expertise of the candidate.

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Bengaluru

Work from Office

We are seeking an experienced Senior Data Engineer to join our data team. As a Data Engineer at ThoughtSpot, you will be responsible for designing, building, and maintaining the data infrastructure that powers our analytics and drives data-driven decision-making for leadership. You will work closely with business teams to ensure our data systems are robust, scalable, and efficient. We have a rapidly expanding list of happy customers who love our product, and we're growing to serve even more.

What you'll do:
• Design, develop, and maintain scalable data pipelines to process large volumes of data from various sources.
• Work closely with our business teams to process and curate analytics-ready data.
• Ensure data quality and consistency through rigorous testing and validation processes.
• Monitor and troubleshoot data pipeline performance and resolve any issues.

What you bring:
• 6+ years of experience in data engineering, building data infrastructure and pipelines.
• Experience building and maintaining large data pipelines and data infrastructure.
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases/warehouses.
• Experience with ETL tools like Hevo, Matillion, etc.
• Experience with data analytics products; ThoughtSpot experience is good to have.
• Experience with cloud services such as AWS, GCP, Azure, etc.
• Knowledge of data warehousing concepts and experience with an EDW like Databricks, Snowflake, or Redshift.
• Proficiency in programming languages such as Python and data processing libraries such as Pandas.
• Understanding of data governance, data quality, and security best practices.
• Knowledge of good development practices such as testing, code reviews, and git.
• Ability to work independently and coordinate with different stakeholders.
• You love building and leading exceptional teams in a fast-paced, entrepreneurial environment.
• You have a strong bias for action and being resourceful.
• You bring amazing problem-solving skills and an ability to identify, quantify, debug, and remove bottlenecks and functional issues.
• Great communication skills, both verbal and written, and an interest in working with a diverse set of peers and customers.
• Alignment with ThoughtSpot Values.

What makes ThoughtSpot a great place to work? ThoughtSpot is the experience layer of the modern data stack, leading the industry with our AI-powered analytics and natural language search. We hire people with unique identities, backgrounds, and perspectives; this balance-for-the-better philosophy is key to our success. When paired with our culture of Selfless Excellence and our drive for continuous improvement (2% done), ThoughtSpot cultivates a respectful culture that pushes norms to create world-class products. If you're excited by the opportunity to work with some of the brightest minds in the business and make your mark on a truly innovative company, we invite you to read more about our mission and apply to the role that's right for you.

ThoughtSpot for All: Building a diverse and inclusive team isn't just the right thing to do for our people, it's the right thing to do for our business. We know we can't solve complex data problems with a single perspective. It takes many voices, experiences, and areas of expertise to deliver the innovative solutions our customers need. At ThoughtSpot, we continually celebrate the diverse communities that individuals cultivate to empower every Spotter to bring their whole authentic self to work. We're committed to being real and continuously learning when it comes to equality, equity, and creating space for underrepresented groups to thrive. Research shows that in order to apply for a job, women feel they need to meet 100% of the criteria while men usually apply after meeting 60%. Regardless of how you identify, if you believe you can do the job and are a good match, we encourage you to apply.
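The "rigorous testing and validation" this posting asks of pipeline work can be as simple as a row-level quality gate between pipeline stages. Below is a minimal sketch, assuming invented column names and rules (nothing here is a specific employer's implementation):

```python
# Hypothetical row-level data quality gate for a pipeline stage.
# The required fields and the duplicate-id rule are invented for illustration.

def validate_rows(rows, required=("id", "amount")):
    """Split a batch into valid rows and rejected rows with a reason each."""
    seen, valid, rejected = set(), [], []
    for row in rows:
        if any(row.get(col) is None for col in required):
            rejected.append((row, "missing required field"))
        elif row["id"] in seen:
            rejected.append((row, "duplicate id"))
        else:
            seen.add(row["id"])
            valid.append(row)
    return valid, rejected

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate id -> rejected
    {"id": 2, "amount": None},   # missing amount -> rejected
    {"id": 3, "amount": 7.25},
]
valid, rejected = validate_rows(batch)
print(len(valid), len(rejected))  # 2 2
```

In a real pipeline the rejected rows would typically land in a quarantine table with their reasons, so data quality issues can be traced back to their source.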

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Industrial Post-Doctoral Fellow - HPLC Method Development (Protein Biologics)

About Mynvax: Mynvax is a clinical-stage vaccine biotechnology company headquartered in Bangalore, India, developing novel, recombinant, and thermostable vaccines against respiratory viral infections, including influenza and RSV. With a pipeline of promising candidates and multiple ongoing collaborations, Mynvax offers a unique opportunity to work at the cutting edge of vaccine development.

Role:
• Design, develop, and optimize HPLC/UPLC methods (e.g., SEC, RP-HPLC, IEX, HIC) for protein characterization, including purity, aggregation, charge variants, and stability.
• Conduct protein analysis using HPLC and orthogonal techniques for in-process and final DS samples.
• Interpret results, troubleshoot analytical challenges, and ensure data quality and integrity.
• Prepare analytical protocols, reports, and SOPs, and contribute to method development and qualification.
• Collaborate with upstream, downstream, and formulation development teams.
• Support regulatory submissions with high-quality analytical documentation.

Required Qualifications:
• Ph.D. in Biophysics, Biochemistry, Analytical Chemistry, Biotechnology, or a related field.
• Hands-on experience in HPLC method development for proteins (during Ph.D. or postdoc).
• Basic understanding of protein structure, behavior, and physicochemical properties.
• Familiarity with HPLC data acquisition and analysis software (e.g., OpenLab CDS, Empower, ChemStation).

Desirable Skills:
• Knowledge of ICH Q2/WHO guidelines for analytical method validation.
• Experience with protein biologics, vaccine antigens, or biosimilar analytics.
• Familiarity with other techniques such as CE, SDS-PAGE, Western blotting, or ELISA.

What Mynvax offers:
• A stimulating industrial research environment with real-world impact.
• Exposure to state-of-the-art technologies and multidisciplinary collaboration.
• Mentorship and professional development support from experienced scientists.
• Full-time salary and benefits, including health coverage, a generous leave package, and statutory entitlements.

Location: Bangalore, India
Company: Mynvax Private Limited
Position Type: Full-Time | Fixed Term (12-24 months, extendable)
Start Date: Immediate
Compensation: Competitive salary with full-time employee benefits
How to apply: Email your CV and a brief cover letter to careers@mynvax.com with the subject line "Post-Doctoral Fellow-HPLC Application", or apply on LinkedIn.

Posted 1 week ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Job Description: Data Scientist

Job Summary: We are seeking an innovative Data Scientist with 5-8 years of professional experience to join our SmartFM product team. This role will be pivotal in extracting actionable insights from complex operational data, leveraging advanced machine learning, deep learning, agentic workflows, and Large Language Models (LLMs) to optimize building operations. The ideal candidate will transform raw alarms and notifications into predictive models and intelligent recommendations that enhance facility efficiency and decision-making.

Roles and Responsibilities:
• Analyze large, complex datasets from various building devices (alarms, notifications, sensor data) to identify patterns, anomalies, and opportunities for operational optimization.
• Design, develop, and deploy machine learning and deep learning models to predict equipment failures, optimize energy consumption, and identify unusual operational behavior.
• Develop and implement agentic workflows to automate decision-making processes and trigger intelligent actions based on real-time data insights.
• Explore and integrate Large Language Models (LLMs) to interpret unstructured data (e.g., maintenance logs, technician notes) and generate natural language insights or automate reporting.
• Collaborate with Data Engineers to define data requirements, ensure data quality, and optimize data pipelines for machine learning applications.
• Work closely with Software Engineers to integrate developed models and intelligent agents into the React frontend and Node.js backend of the SmartFM platform.
• Evaluate and monitor the performance of deployed models, implementing strategies for continuous improvement and retraining.
• Communicate complex analytical findings and model insights clearly and concisely to technical and non-technical stakeholders.
• Stay abreast of the latest advancements in AI, ML, Deep Learning, Agentic AI, and LLMs, assessing their applicability to facility management challenges and advocating for their adoption.
• Contribute to the strategic roadmap for AI/ML capabilities within the SmartFM product.

Required Technical Skills and Experience:
• 5-8 years of professional experience in Data Science, Machine Learning Engineering, or a related analytical role.
• Strong proficiency in Python and its data science ecosystem (Pandas, NumPy, Scikit-learn, TensorFlow, Keras, PyTorch).
• Proven experience in developing and deploying Machine Learning models for predictive analytics, anomaly detection, and classification problems.
• Hands-on experience with Deep Learning frameworks and architectures for time series analysis, pattern recognition, or natural language processing.
• Demonstrated experience in designing and implementing agentic workflows or intelligent automation solutions.
• Practical experience working with Large Language Models (LLMs), including fine-tuning, prompt engineering, or integrating LLM APIs for specific use cases.
• Solid understanding of statistical modeling, experimental design, and A/B testing.
• Experience with querying and analyzing data from MongoDB and working with streaming data from Kafka.
• Familiarity with data ingestion processes, ideally involving IBM StreamSets.
• Experience with cloud-based ML platforms and services (e.g., AWS SageMaker, Azure ML, Google AI Platform).

Additional Qualifications:
• Proven expertise in written and verbal communication, adept at simplifying complex technical concepts for both technical and non-technical audiences.
• Strong problem-solving and analytical skills with a passion for extracting insights from data.
• Experienced in collaborating and communicating seamlessly with diverse technology roles, including data engineering, software development, and product management.
• Highly motivated to acquire new skills, explore emerging technologies, and stay updated on the latest trends in AI/ML and business needs.
• Domain knowledge in facility management, IoT, or building automation is a significant advantage.

Education Requirements / Experience: Master's or Ph.D. degree in Computer Science, Data Science, Artificial Intelligence, Statistics, Mathematics, or a related quantitative field.
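Anomaly detection on sensor streams like the ones this role describes often starts with a simple statistical baseline before any deep learning is applied. A toy sketch with invented temperature readings (not any platform's actual method):

```python
# Minimal z-score anomaly flagging on a batch of sensor readings.
# Readings and threshold are invented for illustration.
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > threshold * sigma]

readings = [21.0, 21.4, 20.8, 21.1, 35.0, 21.2]  # one obvious spike
print(zscore_anomalies(readings))  # [35.0]
```

A production system would compute the baseline over a rolling window per device and feed flagged readings into the alarm pipeline; the one-shot batch above only illustrates the idea.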

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Job Description: Quality Engineer (Data)

Job Summary: We are seeking a highly skilled Quality Engineer with 5-10 years of professional experience to ensure the integrity, reliability, and performance of our data pipelines and AI/ML solutions within the SmartFM platform. The ideal candidate will be responsible for defining and implementing comprehensive quality assurance strategies for data ingestion, transformation, storage, and the machine learning models that generate insights from alarms and notifications received from various building devices. This role is crucial in delivering high-quality, trustworthy data and intelligent recommendations to optimize facility operations.

Roles and Responsibilities:
• Develop and implement end-to-end quality assurance strategies and test plans for data pipelines, data transformations, and machine learning models within the SmartFM platform.
• Design, develop, and execute test cases for data ingestion processes, ensuring data completeness, consistency, and accuracy from various sources, especially those flowing through IBM StreamSets and Kafka.
• Perform rigorous data validation and quality checks on data stored in MongoDB, including schema validation, data integrity checks, and performance testing of data retrieval.
• Collaborate closely with Data Engineers to ensure the robustness and scalability of data pipelines and to identify and resolve data quality issues at their source.
• Work with Data Scientists to validate the performance, accuracy, fairness, and robustness of Machine Learning, Deep Learning, agentic workflow, and LLM-based models, including testing model predictions, evaluating metrics, and identifying potential biases.
• Implement automated testing frameworks for data quality, pipeline validation, and model performance monitoring.
• Monitor production data pipelines and deployed models for data drift, concept drift, and performance degradation, setting up appropriate alerts and reporting mechanisms.
• Participate in code reviews for data engineering and data science components, ensuring adherence to quality standards and best practices.
• Document testing procedures, test results, and data quality metrics, providing clear and actionable insights to cross-functional teams.
• Stay updated with the latest trends and tools in data quality assurance, big data testing, and MLOps, advocating for continuous improvement in our quality processes.

Required Technical Skills and Experience:
• 5-10 years of professional experience in Quality Assurance, with a significant focus on data quality, big data testing, or ML model testing.
• Strong proficiency in SQL for complex data validation, querying, and analysis across large datasets.
• Hands-on experience with data pipeline technologies like IBM StreamSets and Apache Kafka.
• Proven experience in testing and validating data stored in MongoDB or similar NoSQL databases.
• Proficiency in Python for scripting, test automation, and data validation.
• Familiarity with Machine Learning and Deep Learning concepts, including model evaluation metrics, bias detection, and performance testing.
• Understanding of agentic workflows and LLMs from a testing perspective, including prompt validation and output quality assessment.
• Experience with cloud platforms (Azure, AWS, or GCP) and their data/ML services.
• Knowledge of automated testing frameworks and tools relevant to data and ML (e.g., Pytest, Great Expectations, Deepchecks).
• Familiarity with Node.js and React environments to understand system integration points.

Additional Qualifications:
• Demonstrated expertise in written and verbal communication, adept at simplifying complex technical concepts related to data quality and model performance for diverse audiences.
• Exceptional problem-solving and analytical skills with a keen eye for detail in data.
• Experienced in collaborating seamlessly with Data Engineers, Data Scientists, Software Engineers, and Product Managers.
• Highly motivated to acquire new skills, explore emerging technologies in data quality and AI/ML testing, and stay updated on the latest industry best practices.
• Domain knowledge in facility management, IoT, or building automation is a plus.

Education Requirements / Experience: Bachelor's (BE/BTech) or Master's degree (MS/MTech) in Computer Science, Information Systems, Engineering, Statistics, or a related field.
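The schema validation this role performs against document stores often reduces to field-presence and type assertions that an automated suite (e.g., under Pytest) runs per document. A hand-rolled sketch, with hypothetical field names rather than any real SmartFM schema:

```python
# Hypothetical schema check for alarm documents from a NoSQL store.
# The schema (field names and expected types) is invented for illustration.

SCHEMA = {"device_id": str, "alarm_type": str, "severity": int}

def schema_violations(doc, schema=SCHEMA):
    """Return a list of human-readable violations for one document."""
    problems = []
    for field, expected in schema.items():
        if field not in doc:
            problems.append(f"missing field: {field}")
        elif not isinstance(doc[field], expected):
            problems.append(f"bad type for {field}: {type(doc[field]).__name__}")
    return problems

good = {"device_id": "ahu-7", "alarm_type": "overheat", "severity": 3}
bad = {"device_id": "ahu-7", "severity": "high"}
print(schema_violations(good))  # []
print(schema_violations(bad))   # ['missing field: alarm_type', 'bad type for severity: str']
```

Libraries like Great Expectations generalize this pattern into declarative expectation suites with reporting, but the underlying checks are of this shape.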

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Description: The Data & Analytics Team is seeking a Data Engineer with a hybrid skillset in data integration and application development. This role is crucial for designing, engineering, governing, and improving our entire Data Platform, which serves customers, partners, and employees through self-service access. You'll demonstrate expertise in data and metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.

Requirements:
• 5+ years of experience with system/data integration, development, or implementation of enterprise and/or cloud software.
• Strong experience with Web APIs (RESTful and SOAP).
• Strong experience setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud).
• Demonstrated proficiency with Python.
• Strong experience with data wrangling and query authoring in SQL and NoSQL environments for both structured and unstructured data.
• Experience in a cloud-based computing environment, specifically GCP.
• Expertise in documenting Business Requirement, Functional, and Technical documentation.
• Expertise in writing Unit & Functional Test Cases, Test Scripts, and Run Books.
• Expertise in incident management systems like Jira, ServiceNow, etc.
• Working knowledge of Agile software development methodology.
• Strong organizational and troubleshooting skills with attention to detail.
• Strong analytical ability, judgment, and problem analysis techniques.
• Excellent interpersonal skills with the ability to work effectively in a cross-functional team.

Job Responsibilities:
• Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software.
• Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes.
• Perform extensive data wrangling and author complex queries in both SQL and NoSQL environments for structured and unstructured data.
• Develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP).
• Provide operational support for the data platform and applications, including incident management.
• Create comprehensive Business Requirement, Functional, and Technical documentation.
• Develop Unit & Functional Test Cases, Test Scripts, and Run Books to ensure solution quality.
• Manage incidents effectively using systems like Jira, ServiceNow, etc.
• Prepare change management packages and implementation plans for migrations across different environments.
• Actively participate in Enterprise Risk Management processes.
• Work within an Agile software development methodology, contributing to team success.
• Collaborate effectively within cross-functional teams.

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skills training.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game, plus discounts at popular stores and restaurants!

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Chennai

Work from Office

In This Role, Your Responsibilities Will Be:

1. User Management: One of the primary responsibilities of an SFDC specialist is managing users. This includes creating and maintaining user accounts, assigning roles and permissions, and ensuring that each user has the appropriate level of access to perform their job functions effectively. Remove licenses for users who have left the organization or no longer require a Salesforce license.

2. Enhancement and Validation: The SFDC Specialist is responsible for suggesting system enhancements in the Salesforce platform to meet business needs. This includes validation rules for cases, routing behavior, opportunities, and leads so that the correct details are captured.

3. Data Management: Effective data management is critical to the success of a Salesforce implementation. An SFDC specialist ensures the business maintains data quality by performing regular data cleanups, deduplication, and data imports/exports. They ensure that data is accurate, up-to-date, and properly organized, which is essential for generating reliable reports and analytics.

4. Service Management: Monitor the service and order queues, ensure the system behaves normally, and raise an SFDC ticket if any issues are detected. Take charge of the APAC service and order queue, assigning cases to the right BU/owner. Test any enhancement or system change before rolling it out to team members, and provide constructive feedback to the Salesforce development team.

5. System Maintenance: Monitor system performance and proactively address potential problems before they impact users.

6. Reporting and Analytics: The SFDC Specialist is also responsible for creating and managing reports and dashboards to provide actionable insights to the business. They design customized reports and dashboards that help users track key performance indicators (KPIs), monitor business metrics, and make data-driven decisions. By delivering valuable analytics, they enable the organization to measure success and identify areas for improvement.

7. Training and Support: User adoption is a critical factor in the success of any Salesforce implementation. The SFDC Specialist is responsible for developing and delivering training programs to help new users understand and effectively utilize the platform. They create training materials, user guides, and documentation, and provide ongoing support to address user questions and issues.

Who You Are:
The SFDC specialist needs to be involved in long-term projects that require strategic planning and execution. These projects may include:
• System Upgrades: Plan and implement system upgrades to take advantage of new features and capabilities.
• Integration: Integrate Salesforce with other business systems and applications to create a seamless flow of information.
• Customization Projects: Lead customization projects to tailor the Salesforce platform to evolving business needs (example: Manufacturing Service + Customer 360).

• Minimum of 5-8 years of proven team-handling experience
• Knowledge of various verticals like process, commercial, and oil and gas industries
• End-to-end shipment process knowledge
• Expertise in Fourth Shift or SAP is an added advantage

For This Role, You Will Need:
On a day-to-day basis, SFDC specialists perform various tasks to ensure the smooth operation of the Salesforce platform. These tasks may include:
• User Support: Assisting users with issues they encounter.
• Data Management: Performing data imports and exports, conducting data cleanups, and ensuring data accuracy.
• Service Management: Assigning service and order cases to the BU/owner.
• Opportunity Funnel: Keeping all past-due opportunities updated.
• Customization: Creating and modifying custom objects, fields, and page layouts to meet business requirements.
• System Monitoring: Monitoring system performance, identifying potential issues, and implementing solutions.
• Training: Developing and delivering training sessions and creating user documentation.
• Reporting: Generating and distributing reports and dashboards to provide insights to the business.
• Process Optimization: Evaluating and optimizing business processes to improve efficiency and effectiveness.

Preferred Qualifications That Set You Apart:
• Bachelor's degree or equivalent in Electrical, Electronics, Instrumentation, or Automation Engineering, or any related field
• Proficiency in MS Office suites such as MS Excel, Word, and PowerPoint

Our Culture & Commitment to You:
At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives because we know that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. Whether through mentorship, training, or leadership opportunities, we invest in your success so you can make a lasting impact. We believe diverse teams working together are key to driving growth and delivering business results.

We recognize the importance of employee well-being. We prioritize providing competitive benefits plans, a variety of medical insurance plans, an Employee Assistance Program, employee resource groups, recognition, and much more. Our culture offers flexible time-off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.
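Deduplication, one of the data cleanups named above, is commonly done by keeping the most recently modified record per contact key. A toy sketch with invented records (not Salesforce's own dedup logic):

```python
# Keep the latest record per (name, email) key; records are invented.
from datetime import date

records = [
    {"name": "Asha Rao", "email": "asha@example.com", "modified": date(2024, 1, 5)},
    {"name": "Asha Rao", "email": "asha@example.com", "modified": date(2024, 3, 2)},
    {"name": "Vik Shah", "email": "vik@example.com",  "modified": date(2024, 2, 1)},
]

latest = {}
for rec in records:
    key = (rec["name"].lower(), rec["email"].lower())  # case-insensitive match key
    if key not in latest or rec["modified"] > latest[key]["modified"]:
        latest[key] = rec

deduped = list(latest.values())
print(len(deduped))  # 2 unique contacts
```

Real CRM dedup also has to handle fuzzy matches (typos, alternate emails) and merge field values rather than discard them; the sketch shows only the exact-key case.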

Posted 1 week ago

Apply

2.0 - 4.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Req ID: 331215

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AEP Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

About the Role: We are seeking a highly skilled and experienced Senior Adobe Experience Platform (AEP) Developer to join our growing team. In this role, you will play a critical part in the support, maintenance, design, development, and implementation of innovative customer data solutions within the AEP ecosystem. You will be responsible for building and maintaining robust data pipelines, integrating data from various sources, and developing custom solutions to meet the unique needs of our business.

Key Responsibilities:

AEP Platform Expertise:
• Deep understanding of the AEP suite, Experience Data Model (XDM), Data Science Workspace, and other relevant modules.
• Proficiency with AEP APIs, Web SDKs, and integrations with other MarTech platforms (Adobe Target, CJA, AJO, Adobe Campaign, etc.).
• Experience with AEP data ingestion, transformation, and activation.
• Strong understanding of data modeling principles and best practices within the AEP ecosystem.

Data Engineering & Development:
• Design, develop, and maintain high-quality data pipelines and integrations using AEP and other relevant technologies.
• Develop and implement custom solutions within the AEP environment using scripting languages (e.g., JavaScript, Python) and other relevant tools.
• Troubleshoot and resolve data quality issues and performance bottlenecks.
• Ensure data accuracy, consistency, and security across all stages of the data lifecycle.

Customer Data Solutions:
• Collaborate with cross-functional teams (e.g., marketing, product, data science) to understand issues and support fixes.
• Support and maintain developed data-driven solutions to improve customer experience, personalize marketing campaigns, and drive business growth.
• Analyze data trends and provide insights to inform business decisions.

Project Management & Collaboration:
• Contribute to the planning and execution of AEP projects, ensuring timely delivery and adherence to project timelines and budgets.
• Effectively communicate technical concepts to both technical and non-technical audiences.
• Collaborate with team members and stakeholders to ensure successful project outcomes.

Stay Updated:
• Stay abreast of the latest advancements in AEP and related technologies.
• Continuously learn and expand your knowledge of data management, data science, and customer experience.

Qualifications:

Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Experience: Overall IT experience of 5+ years, with 3-4 years of hands-on experience with Adobe Experience Platform (AEP).

Technical Skills:
• 3+ years of strong proficiency in JavaScript or other relevant programming languages.
• 3 years of experience with RESTful APIs, JSON, and XML.
• 3+ years of experience with data warehousing, data modeling, and data quality best practices.
• 3+ years of experience with a tag management system like Adobe Launch.
• 2+ years of experience working with WebSDK.
• Experience with Adobe Analytics is a plus.
• Knowledge of and experience leveraging Python libraries and tools for data cleaning and analysis is a plus.
• Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.

Soft Skills:
• Excellent analytical and problem-solving skills.
• Strong communication, interpersonal, and collaboration skills.
• Ability to work independently and as part of a team.
• Detail-oriented and results-driven.
• Strong organizational and time-management skills.

Bonus Points:
• Experience with other Adobe Marketing Cloud solutions (e.g., Adobe Analytics, Adobe Target).
• Experience with Agile development methodologies.
Experience with data visualization tools (e.g., Tableau, Power BI). Experience with data governance and compliance (e.g., GDPR, CCPA). Understanding of Real-time Customer Data Platform (RT-CDP)

Posted 1 week ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Technical Skills Required:
ETL Concepts: Strong understanding of Extract, Transform, Load (ETL) processes. Ability to design, develop, and maintain robust ETL pipelines.
Database Fundamentals: Proficiency in working with relational databases (e.g., MySQL, PostgreSQL, Oracle, or MS SQL Server). Knowledge of database design and optimization techniques.
Basic Data Visualization: Ability to create simple dashboards or reports using visualization tools (e.g., Tableau, Power BI, or similar).
Query Optimization: Expertise in writing efficient, optimized queries to handle large datasets.
Testing and Documentation: Experience in validating data accuracy and integrity through rigorous testing. Ability to document data workflows, processes, and technical specifications clearly.
Key Responsibilities:
Data Engineering Tasks: Design, develop, and implement scalable data pipelines to support business needs. Ensure data quality and integrity through testing and monitoring. Optimize ETL processes for performance and reliability.
Database Management: Manage and maintain databases, ensuring high availability and security. Troubleshoot database-related issues and optimize performance.
Collaboration: Work closely with data analysts, data scientists, and other stakeholders to understand and deliver on data requirements. Provide support for data-related technical issues and propose solutions.
Documentation and Reporting: Create and maintain comprehensive documentation for data workflows and technical processes. Develop simple reports or dashboards to visualize key metrics and trends.
Learning and Adapting: Stay updated with new tools, technologies, and methodologies in data engineering. Adapt quickly to new challenges and project requirements.
Additional Requirements: Strong communication skills, both written and verbal. Analytical mindset with the ability to solve complex data problems. Quick learner and willingness to adopt new tools and technologies as needed. Flexibility to work in shifts, if required.
Preferred Skills (Not Mandatory): Experience with cloud platforms (e.g., AWS, Azure, or GCP). Familiarity with big data technologies such as Hadoop or Spark. Basic understanding of machine learning concepts and data science workflows.
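The testing-and-validation duties named in this posting can be sketched in miniature. The following is a hedged Python illustration (not from the posting; all data, field names, and the `reconcile` helper are hypothetical) of a source-to-target reconciliation check that flags missing, extra, and mismatched rows after an ETL load:

```python
def reconcile(source_rows, target_rows, key):
    """Compare two datasets by a key column: report keys missing from the
    target, keys extra in the target, and keys whose row values differ."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    return {
        "missing": sorted(src.keys() - tgt.keys()),
        "extra": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Hypothetical source and post-load target extracts.
source = [{"id": 1, "amt": 100}, {"id": 2, "amt": 250}, {"id": 3, "amt": 75}]
target = [{"id": 1, "amt": 100}, {"id": 2, "amt": 99}]
report = reconcile(source, target, "id")
# report shows id 3 never arrived and id 2 was loaded with a wrong amount.
```

In practice the same comparison would be run as a scheduled test against row counts or checksums rather than full rows, but the structure of the check is the same.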

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Noida

Work from Office

Job Description
We are looking for a seasoned Data Engineer with extensive experience in designing and implementing data pipelines using Medallion Architecture, Azure Databricks, and Snowflake. The ideal candidate will be responsible for building scalable ETL pipelines, optimizing data flows, and ensuring data quality for large-scale data platforms.
Key Responsibilities:
Design, develop, and optimize data pipelines following Medallion Architecture (Bronze, Silver, Gold layers).
Implement and maintain ETL pipelines using Databricks and Python (multi-threading and multi-processing).
Leverage Snowflake for data warehousing, including schema design, data loading, and performance tuning. This also includes experience with Linux, Docker, Anaconda, Pandas, PySpark, Apache Hive and Iceberg, Trino, and Prefect.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver robust data solutions.
Develop data models and manage data lakes for structured and unstructured data.
Implement data governance and security best practices.
Monitor and troubleshoot data pipelines for performance and reliability.
Stay up-to-date with industry trends and best practices in data engineering and cloud technologies.
Minimum Qualification
B.Tech/B.E. (Computer Science/IT/Electronics), MCA, or a computer diploma in development; 3+ years of experience is compulsory.
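The Bronze/Silver/Gold layering the posting refers to can be illustrated with a toy sketch. This is plain Python (not Databricks or PySpark code; all records and helper names are hypothetical) showing only the idea of each layer: raw ingestion, validated/typed records, and a business-level aggregate:

```python
# Bronze: raw ingested records, schema-on-read, may contain bad rows.
bronze = [
    {"order_id": "1", "amount": "10.5", "country": "IN"},
    {"order_id": "2", "amount": "bad", "country": "IN"},   # malformed
    {"order_id": "3", "amount": "4.0", "country": "US"},
]

def to_silver(rows):
    """Silver: validated, typed records; invalid rows are dropped
    (a real pipeline would quarantine them instead)."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"]})
        except ValueError:
            continue
    return out

def to_gold(rows):
    """Gold: business-level aggregate -- here, revenue per country."""
    agg = {}
    for r in rows:
        agg[r["country"]] = agg.get(r["country"], 0.0) + r["amount"]
    return agg

silver = to_silver(bronze)
gold = to_gold(silver)
```

On Databricks the same layering would typically be Delta tables transformed by PySpark jobs, but the contract between layers (raw, cleaned, curated) is the point of the pattern.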

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

The incumbent will join the Fees & Expense Management department, where they will be responsible for managing a team of 10 FTEs analyzing and reconciling various fees related to Brokerage, Clearing, and Exchange services in line with client service-level agreements. The role involves leading the team responsible for understanding and calculating various fees, and handling tasks related to invoice processing for a wide range of over-the-counter (OTC) and Listed Derivatives products. A significant part of the role involves client management, escalation management, and collaborating with external vendors such as brokers, clearing firms, agent banks, and custodians. The role also involves optimization of costs and processes, reducing invoicing errors, improving data quality, and automating manual tasks, which calls for collaboration with a number of internal teams such as Projects, Development, Quality Assurance, and Business Analysts.

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Mumbai

Work from Office

Job Title: Project Manager
Job Code: 9923
Country: IN
City: Mumbai
Skill Category: India CMT
Role description
The candidate will be a member of the international change team responsible for delivery and support of core transformation initiatives. Strong skills are required in business requirements and product coverage analysis, data mapping across platforms, and support for solution design and technical implementation deliverables. The ability to help prototype, visualise, reengineer, and automate processes in cross-functional teams is key. The candidate must have strong analytical and problem-solving skills and be able to work in a dynamic environment to rapidly produce robust solutions. The candidate will have to liaise with key stakeholders across Finance, Middle Office, Risk, and Technology, be able to work independently to see initiatives through from inception to delivery, and have strong attention to detail.
Key objectives critical to success
Strong analytical and communication skills with the initiative to identify and solve problems
Ability to define requirements, solutions, and scope, and work across the end-to-end implementation lifecycle
Ability to proactively reengineer processes and deliver digital solutions to improve department efficiency
Demonstrate confidence in engaging and presenting complex solutions to senior individuals
Manage relationships with stakeholders in Finance, Risk, and Technology functions across all regions
Skills, experience, qualifications and knowledge required
Financial services experience, ideally in a transformational, middle office, collateral, or product control role
Good understanding of data preparation, modelling, and visual data delivery through dashboards
Experience with implementing data management capability: centralisation of data stores, data sourcing, data quality, and controls definition and simplification
An understanding of how to profile current state, define future state, consolidate data insights, and present these succinctly to senior stakeholders
Ability to be proactive and use initiative to improve and reengineer processes and systems and help support legacy solutions, e.g. a centralised adjustment mechanism and op model
Strong communication skills to accurately elicit requirements from stakeholders, present project updates to senior management, and provide training to project end users
Ability to take ownership of end-to-end project management from inception to delivery
Proficient in effectively documenting complex business and technical requirements and workflows using a variety of tools, with the ability to present these to business and technology stakeholders
Experience of Agile processes and tools, e.g. JIRA, Confluence, etc.
Coordinate UAT testing for new process implementations, with an ability to identify risks involved and communicate with respective teams for resolution
Willingness to help others across the department improve their digital skills through training sessions or workshops
Desirable skills and experience
Understanding of collateral management, P&L reporting, balance sheet substantiation, and trade lifecycle controls
Some experience using digital tools that would support key platform data migrations, e.g. Python, Alteryx, VBA
Basic understanding of equities, fixed income, and derivatives products

Posted 1 week ago

Apply

1.0 - 2.0 years

3 - 4 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

REQUIRED SKILLS (need majority; 1-2+ years):
Serve as a customer service contact for data reporting for Wisconsin schools and districts.
Create, oversee, and monitor requests, incidents, and resolutions using ticketing software.
Work with school districts to help them understand how to use WISEdata and WISEdash for data reviews.
Openness to presenting training sessions and materials; experience preferred.
Monitor and support district reporting progress and ensure data errors are resolved.
Assist the WISEdata product owner with business process analysis and improvements.
Conduct data health checks using the WISEdata portal and WISEdash visualizations and download tools.
CONTRACT OVERVIEW: MUST BE A CURRENT WI RESIDENT OR RELOCATING
NICE-TO-HAVE SKILLS:
Understanding of API technologies.
Knowledge of data warehousing and reporting.
Experience working in education software systems at the school/district/state level.
Ability to provide on-site or virtual training.
INTERVIEW PROCESS: Microsoft Teams (video on and audio on)
DESCRIPTION OF ROLE:
The goal of this position is to increase support to Wisconsin's schools and districts for state reporting tasks, as well as provide training and presentations related to several data applications. The position performs direct WISEdata customer support services for school and district clients (LEAs) by responding to CRM tickets, emails, and phone calls. This position will document, track, and monitor support requests to ensure timely resolution. It will also serve as the primary trainer for training sessions/videos, demonstrations, conferences, and workshops organized by the Customer Services team or external stakeholders. The employee will follow the Customer Service Framework and standard practices for effective customer service.
Key Responsibilities:
Customer Service Help Desk Responsibilities (60%):
o Respond to customer requests via CRM, email, and phone.
o Track and document tickets and support issues.
o Support schools/districts in WISEdata data submissions and review processes.
o Contact districts proactively if data support is needed.
o Monitor data quality and review reports before data snapshots.
o Help identify improvements to state reporting through business analysis.
o Provide support to ensure data issues and errors are being resolved.
Training Responsibilities (35%):
o Serve as primary trainer for internal and external WISEdata-related sessions.
o Collaborate with the Technical Writer on documentation and FAQs.
o Create and update tutorials, videos, and training materials.
o Present at in-person or virtual conferences and workshops.
o Provide feedback to leadership on client support issues.
Professional Development & Other Duties (5%):
o Stay current on application changes and industry practices.
o Attend professional learning sessions and vendor user groups.
o Contribute to team meetings and processes.
Other skills that might be valuable to the role:
Strong interpersonal and customer service communication.
Decision-making in complex situations.
Ability to interpret business logic and technical requirements.
Oral and written communication effectiveness.
Ability to work independently and collaboratively.
Familiarity with Microsoft Office, Microsoft Dynamics CRM, and Google Apps.
Knowledge of student information system (SIS) software used in Wisconsin.
Knowledge of DPI tools including WISEdata Portal, WISEid, WISEdash, and the Ed-Fi Credential Application.
Must be able to travel by car to meetings outside Madison.

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Chennai, Bengaluru

Work from Office

Job Summary
What you need to know about the role: This role focuses on building robust data pipelines to deliver reliable insights that drive strategic decisions. You'll enable advanced fraud risk analytics and empower leadership with data-driven solutions.
Meet our team: Join PayPal's Global Fraud Protection team, a dynamic and mission-critical group dedicated to safeguarding our platform and customers from a wide range of risks including identity fraud, account takeovers, stolen financial information, and credit-related threats. This team plays a vital role in protecting PayPal's bottom line, enabling secure global growth, and ensuring a seamless and trustworthy customer experience.
Job Description
Your way to impact: As a Data Engineer, you will design and optimize data systems that power strategic insights and decision-making across the organization. You'll lead initiatives to build scalable pipelines and infrastructure, enabling business stakeholders to access high-quality, actionable data.
Your day to day:
Lead the design and development of complex data pipelines for data collection and processing.
Ensure data quality and consistency through sophisticated validation and cleansing processes.
Implement advanced data transformation techniques to prepare data for analysis.
Collaborate with cross-functional teams to understand data requirements and provide innovative solutions.
Optimize data engineering processes for performance, scalability, and reliability.
What do you need to bring:
Strong command of SQL, including complex queries, optimization, and analytical functions.
Proficiency in Python for data engineering tasks and scripting.
Proven expertise in big data processing frameworks such as Apache Spark, Flink, or Beam.
Deep understanding of data modeling, ETL/ELT frameworks, and data warehousing principles.
Experience with cloud platforms; Google Cloud preferred.
Familiarity with containerization and CI/CD practices (e.g., Docker, Kubernetes, GitHub Actions).
Good understanding of data security and governance.
Exposure to real-time streaming architectures and event-driven systems is a plus.
Ability to mentor junior engineers and lead technical discussions.
We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply.
Subsidiary: PayPal
Travel Percent: 0
For the majority of employees, PayPal's balanced hybrid work model offers 3 days in the office for effective in-person collaboration and 2 days at your choice of either the PayPal office or your home workspace, ensuring that you equally have the benefits and conveniences of both locations.
Our Benefits: We have great benefits including a flexible work environment, employee share options, health and life insurance, and more. To learn more about our benefits please visit https://www.paypalbenefits.com.
Who We Are: Click Here to learn more about our culture and community.
Commitment to Diversity and Inclusion: PayPal provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, pregnancy, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local law. In addition, PayPal will provide reasonable accommodations for qualified individuals with disabilities. If you are unable to submit an application because of incompatible assistive technology or a disability, please contact us at talentaccommodations@paypal.com.
Belonging at PayPal: Our employees are central to advancing our mission, and we strive to create an environment where everyone can do their best work with a sense of purpose and belonging. Belonging at PayPal means creating a workplace with a sense of acceptance and security where all employees feel included and valued. We are proud to have a diverse workforce reflective of the merchants, consumers, and communities that we serve, and we continue to take tangible actions to cultivate inclusivity and belonging at PayPal. For any general requests for consideration of your skills, please Join our Talent Community.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Noida

Work from Office

Position Overview
Here at ShyftLabs, we are searching for an experienced Data Scientist who can drive performance improvement and cost efficiency in our product through a deep understanding of the ML and infrastructure systems, and provide data-driven insights and scientific solutions.
Job Responsibilities:
Research, design, and develop innovative generative AI models and applications.
Collaborate with cross-functional teams to identify opportunities for AI-driven solutions.
Train and fine-tune AI models on large datasets to achieve optimal performance.
Optimize AI models for deployment in production environments.
Stay up-to-date with the latest advancements in AI and machine learning.
Collaborate with data scientists and engineers to ensure data quality and accessibility.
Design, implement, and optimize machine learning algorithms for tasks like classification, prediction, and clustering.
Develop and maintain robust AI infrastructure.
Document technical designs, decisions, and processes, and communicate progress and results to stakeholders.
Work with cross-functional teams to integrate AI/ML models into production-level applications.
Basic Qualifications:
Master's degree in a quantitative discipline or equivalent.
5+ years minimum professional experience.
Distinctive problem-solving skills; good at articulating product questions, pulling data from large datasets, and using statistics to arrive at a recommendation.
Excellent verbal and written communication skills, with the ability to present information and analysis results effectively.
Ability to build positive relationships within ShyftLabs and with our stakeholders, and work effectively with cross-functional partners in a global company.
Statistics: must have strong knowledge and experience in experimental design, hypothesis testing, and various statistical analysis techniques such as regression or linear models.
Machine Learning: must have a deep understanding of ML algorithms (e.g., deep learning, random forest, gradient boosted trees, k-means clustering) and their development, validation, and evaluation.
Programming: experience with Python or other scripting languages, and with database languages (e.g., SQL) or data manipulation tools (e.g., Pandas).
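The regression skill listed in the qualifications can be shown in its simplest form. Here is a minimal, self-contained sketch (toy data; the `fit_line` helper is ours, not from the posting) of ordinary least squares for simple linear regression, computed by hand in pure Python:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept.
    slope = cov(x, y) / var(x); intercept = mean(y) - slope * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Toy, nearly-linear data: y is roughly 2x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.0, 8.1, 9.9]
slope, intercept = fit_line(xs, ys)
```

In day-to-day work this would of course be `statsmodels` or `scikit-learn` with standard errors and diagnostics; the point here is only the underlying estimator.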

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

We are looking for a Manager - Data Intelligence, who will work to deliver data-driven solutions by working with multi-functional domain experts to deeply understand business context and key business opportunities, and collaborate with other teams to aid the analytics process of evaluating data and discovering insights with the purpose of making decisions that improve business outcomes. You will be responsible for managing a team of dedicated business analysts, ensuring high-quality standards, and aligning with the business objectives and vision. You should possess a solid foundation in data science/analytics, program management, and leadership skills.
What you'll do
Lead and mentor a team to ensure customer success through quality in development, project management, and solution delivery.
Collaborate with stakeholders to understand business needs and formulate data-driven strategies.
Translate business challenges into actionable data initiatives aligned with the company strategy.
Effectively convey complex data science/analytics concepts to non-technical stakeholders.
Analyze large datasets, extract relevant information, and identify patterns and trends to support the decision-making process.
Collaborate with other teams to gather and preprocess data from various sources, ensuring data accuracy and integrity.
Generate actionable insights and recommendations based on data analysis to optimize processes, reduce costs, and improve user experience.
Present reports, findings, and recommendations to executive leadership.
Drive data quality, governance, and compliance with organizational security standards.
Build multi-functional relationships with CAO/Controllership and the IT organization.
Stay updated with the latest developments in data science and machine learning techniques.
What you need to succeed
10+ years of industry work experience, including 8+ years in technical roles involving Data Science/Analytics, Machine Learning, or Statistics.
Hands-on experience with large-scale data and developing innovative data solutions.
Ability to lead and inspire a team with positivity and optimism, creating a culture of high expectations and achievement.
Ability to align the efforts of disparate groups to achieve a common goal.
Confident ability to lead, collaborate, triage, and make decisions in a fast-paced, changing environment.
Passion for problem solving and the ability to think through problems creatively and systematically, prioritize them, and identify and evaluate alternative solutions.
Outstanding interpersonal, verbal, and written communication and presentation skills.
Python and Python-based ML and data science frameworks.
Extensive experience with BI platforms (e.g., Power BI).

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

The Group You'll Be A Part Of
In the Global Products Group, we are dedicated to excellence in the design and engineering of Lam's etch and deposition products. We drive innovation to ensure our cutting-edge solutions are helping to solve the biggest challenges in the semiconductor industry.
The Impact You'll Make
Designs, develops, analyzes, troubleshoots, and provides technical skills during research and/or product development.
What You'll Do / Who We're Looking For
Minimum of 5 years of related experience with a Bachelor's degree; or 3 years and a Master's degree; or a PhD without experience; or equivalent work experience.
Develop and manage data pipelines, ensure data quality, and implement data solutions in a hybrid (cloud/on-prem) environment.
Transform, clean, and integrate data; load transformed data from varied sources and in varied forms into storage and reporting structures in destinations including data warehouses, high-speed indexes, real-time reporting systems, and analytics applications.
Experience in Azure data engineering with expertise in SQL, Azure Data Factory, Databricks, Python/Java/Scala, and PySpark.
Hands-on experience with data processing frameworks like Apache Spark, Apache Kafka, etc.
Understanding of data warehouse/lakehouse concepts and dimensional modelling.
The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.).
Preferred Qualifications
Bachelor's degree; or 3 years and a Master's degree; or a PhD without experience; or equivalent work experience.
Our Commitment
We believe it is important for every person to feel valued, included, and empowered to achieve their full potential. By bringing unique individuals and viewpoints together, we achieve extraordinary results. Lam Research ("Lam" or the "Company") is an equal opportunity employer. Lam is committed to and reaffirms support of equal opportunity in employment and non-discrimination in employment policies, practices and procedures on the basis of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex (including pregnancy, childbirth and related medical conditions), gender, gender identity, gender expression, age, sexual orientation, or military and veteran status or any other category protected by applicable federal, state, or local laws. It is the Company's intention to comply with all applicable laws and regulations. Company policy prohibits unlawful discrimination against applicants or employees.
Lam offers a variety of work location models based on the needs of each role. Our hybrid roles combine the benefits of on-site collaboration with colleagues and the flexibility to work remotely, and fall into two categories: On-site Flex and Virtual Flex. On-site Flex: you'll work 3+ days per week on-site at a Lam or customer/supplier location, with the opportunity to work remotely for the balance of the week. Virtual Flex: you'll work 1-2 days per week on-site at a Lam or customer/supplier location, and remotely the rest of the time.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Design and create compelling data visualizations, dashboards, and reports that provide actionable insights to support decision-making. Hands-on experience in writing complex SQL queries and creating stored procedures for SSRS paginated reporting. Good understanding of reporting. Work closely with data engineers and data analysts. Continuously optimize existing reports, ensuring performance, accuracy, and responsiveness, and addressing any data quality issues.
Skills
Minimum experience: 3 to 8 years
T-SQL/PL-SQL, SSRS paginated reports, SAP BI, Power BI
Good communication
Strong aptitude
Qualifications
Power BI: key features of Power BI; data integration techniques; data refresh in Power BI; data governance and security in Power BI; active and inactive relationships; filters in Power BI; functions in Power BI.
DAX: CALCULATE, SUMX, AVERAGEX; time intelligence functions (e.g., YTD); filter functions; text functions; logical functions (e.g., AND); date and time functions (e.g., MONTH).
SSRS: key components of SSRS paginated reports; creating parameters in SSRS; best practices for SSRS paginated reports; best practices for parameters; changing the sequence of report parameters in SSRS.
SQL: DML and DDL; primary key, unique key, foreign key; types of joins; date and aggregate functions; string functions; set operators; window functions; CTEs; local and global temp tables; performance tuning; sample queries based on the above topics; a query to identify and remove data redundancy in a table; number of records resulting from joins.
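Several of the SQL interview topics in this posting (CTEs, window functions, identifying and removing redundant rows) fit in one small worked example. The sketch below uses Python's built-in sqlite3 purely as a convenient SQL engine; the table and data are hypothetical, and the same `ROW_NUMBER()` pattern works in T-SQL and PL/SQL:

```python
import sqlite3

# In-memory table with a duplicate row, then a CTE + ROW_NUMBER() window
# function to delete every row after the first within each (name, city) group.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
con.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "Asha", "Chennai"),
    (2, "Asha", "Chennai"),   # duplicate of id 1
    (3, "Ravi", "Bengaluru"),
])
con.execute("""
    DELETE FROM customers WHERE id IN (
        WITH ranked AS (
            SELECT id,
                   ROW_NUMBER() OVER (PARTITION BY name, city
                                      ORDER BY id) AS rn
            FROM customers
        )
        SELECT id FROM ranked WHERE rn > 1
    )
""")
remaining = [row[0] for row in con.execute("SELECT id FROM customers ORDER BY id")]
```

The `PARTITION BY` clause defines the notion of "redundant" (same name and city), and `rn > 1` selects every copy beyond the first for deletion.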

Posted 1 week ago

Apply

3.0 - 10.0 years

5 - 12 Lacs

Kolkata

Work from Office

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. Those in artificial intelligence and machine learning at PwC will focus on developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes. Your work will involve designing and optimising algorithms, models, and systems to enable intelligent decision-making and automation.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Responsibilities: Position responsibilities and expectations Designing and building analytical /DL/ ML algorithms using Python, R and other statistical tools. Strong data representation and lucid presentation (of analysis/modelling output) using Python, R Markdown, Power Point, Excel etc. Ability to learn new scripting language or analytics platform. Technical Skills required (must have) HandsOn Exposure to Generative AI (Design, development of GenAI application in production) Strong understanding of RAG, Vector Database, Lang Chain and multimodal AI applications. Strong understanding of deploying and optimizing AI application in production. Strong knowledge of statistical and data mining techniques like Linear & Logistic Regression analysis, Decision trees, Bagging, Boosting, Time Series and Non-parametric analysis. Strong knowledge of DL & Neural Network Architectures (CNN, RNN, LSTM, Transformers etc.) Strong knowledge of SQL and R/Python and experience with distribute data/computing tools/IDEs. Experience in advanced Text Analytics (NLP, NLU, NLG). Strong hands-on experience of end-to-end statistical model development and implementation Understanding of LLMOps, ML Ops for scalable ML development. Basic understanding of DevOps and deployment of models into production (PyTorch, TensorFlow etc.). Expert level proficiency algorithm building languages like SQL, R and Python and data visualization tools like Shiny, Qlik, Power BI etc. Exposure to Cloud Platform (Azure or AWS or GCP) technologies and services like Azure AI/ Sage maker/Vertex AI, Auto ML, Azure Index, Azure Functions, OCR, OpenAI, storage, scaling etc. 
Technical Skills required (Any one or more) Experience in video/ image analytics (Computer Vision) Experience in IoT/ machine logs data analysis Exposure to data analytics platforms like Domino Data Lab, c3.ai, H2O, Alteryx or KNIME Expertise in Cloud analytics platforms (Azure, AWS or Google) Experience in Process Mining with expertise in Celonis or other tools Proven capability in using Generative AI services like OpenAI, Google (Gemini) Understanding of Agentic AI Framework (Lang Graph, Auto gen etc.) Understanding of fine-tuning for pre-trained models like GPT, LLaMA, Claude etc. using LoRA, QLoRA and PEFT technique. Proven capability in building customized models from open-source distributions like Llama, Stable Diffusion Mandatory skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Preferred skill sets: AI chatbots, Data structures, GenAI object-oriented programming, IDE, API, LLM Prompts, Streamlit Years of experience required: 3-10 Years Education qualification: BE, B. Tech, M. Tech, M. Stat, Ph.D., M.Sc. (Stats / Maths) Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Generative AI Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, AI Implementation, Analytical Thinking, C++ Programming Language, Communication, Complex Data Analysis, Creativity, Data Analysis, Data Infrastructure, Data Integration, Data Modeling, Data Pipeline, Data Quality, Deep Learning, Embracing Change, Emotional Regulation, Empathy, GPU Programming, Inclusion, Intellectual Curiosity, Java (Programming Language), Learning Agility, Machine Learning {+ 25 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? 
No
Government Clearance Required? No
Job Posting End Date
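The listing above asks for a strong understanding of RAG and vector databases. As an illustrative sketch only (not any particular framework's API): retrieval-augmented generation ranks documents by similarity to the query, then grounds the LLM prompt in the retrieved context. The corpus and hand-rolled 3-dimensional "embeddings" below are toy stand-ins for a real embedding model and vector database.

```python
import math

# Toy corpus with hand-rolled "embeddings" (a real system would use an
# embedding model plus a vector database).
corpus = {
    "Invoices are archived after 90 days.": [0.9, 0.1, 0.0],
    "Refunds are processed within 5 business days.": [0.1, 0.9, 0.2],
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=1):
    # Rank documents by similarity to the query vector (the vector-search step).
    ranked = sorted(corpus, key=lambda d: cosine(corpus[d], query_vec), reverse=True)
    return ranked[:k]

def build_prompt(question, query_vec):
    # Ground the LLM prompt in retrieved context (the "augmented" step);
    # the prompt would then be sent to a generation model.
    context = "\n".join(retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How long until a refund lands?", [0.0, 1.0, 0.1])
```

In production the same pattern is typically wired up with a framework such as LangChain and a managed vector store, but the retrieve-then-prompt shape is unchanged.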

Posted 1 week ago

Apply

2.0 - 5.0 years

8 - 14 Lacs

Chennai

Work from Office

ETL Test Automation Engineer - Chennai (WFO) - Immediate Joiner ONLY
Role: ETL Test Automation Engineer to design, implement, and maintain test automation frameworks for real-time streaming ETL pipelines in a cloud-based environment.
Required Experience: 6-7 years mandatory in ETL Automation Testing
Job Type: Work From Office (WFO) - Hybrid Mode
Job Location: Chennai - Work From Office
Working Hours: Normal working hours (Monday to Friday)
Additional Information:
Start Date: Immediate - Immediate Joiners Only
Background Verification: Mandatory, through a third-party verification process.
Please Note: We are looking for Immediate Joiners Only.
Skills:
- Strong knowledge of ETL testing with a focus on streaming data pipelines (Apache Flink, Kafka).
- Experience with database testing and validation (PostgreSQL, NoSQL).
- Proficiency in Java for writing test automation scripts.
- Hands-on experience with automation scripting for ETL workflows, data validation, and transformation checks.
- Experience in performance testing and optimizing ETL jobs to handle large-scale data streams.
- Proficient in writing and executing complex database queries for data validation and reconciliation.
- Experience with CI/CD tools (Jenkins, GitHub Actions, Azure DevOps) for integrating test automation.
- Familiarity with cloud-based ETL testing in Azure.
- Ability to troubleshoot data flow and transformation issues in real time.
Nice to Have:
- Experience with Karate DSL, NiFi, and CLI-based testing tools.
- Experience in developing self-healing mechanisms for data integrity issues.
- Hands-on experience with automated data drift detection.
- Knowledge of data observability tools.
- Familiarity with containerization (Docker, Kubernetes) and Infrastructure as Code (Terraform, Ansible) for ETL deployment.
- Experience with testing message queues (Kafka, Pulsar, RabbitMQ, etc.).
- Domain knowledge of banking and financial institutions and/or large enterprise IT environments will be considered a strong asset.
Next Steps: If you're excited about this opportunity and meet the requirements, we'd love to hear from you! To apply, please reply with the following details along with your updated resume:
1. Total experience
2. Current salary
3. Expected salary
4. Notice period - how soon can you join after selection?
5. Are you an Immediate Joiner? (Yes/No)
6. Current location
7. Are you available for the hybrid work setup mentioned in Chennai? (Yes/No)
8. Do you have a minimum of 6-7 years of hands-on experience in ETL Automation Testing? (Yes/No)
9. How much experience do you have in ETL Automation Testing?
10. Are you open to working a day shift (IST) at the mentioned job location? (Yes/No)
11. Are you open to Work From Office from the Chennai location? (Yes/No)
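The skills list above names "data validation and reconciliation" between source and target. A minimal sketch of that check, in Python rather than the Java the role asks for, with in-memory lists of dicts standing in for the source system and the loaded target table:

```python
def reconcile(source_rows, target_rows, key="id"):
    # Basic ETL reconciliation: compare row counts, then report keys that
    # went missing during the load and keys that appear only in the target.
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "extra_in_target": sorted(tgt_keys - src_keys),
    }

# Toy example: row id=2 was dropped somewhere in the pipeline.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
report = reconcile(source, target)
# report flags the count mismatch and the missing key 2
```

In a streaming setup (Flink/Kafka) the same comparison runs over windowed snapshots rather than full tables, but the key-set difference logic is the core of the check.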

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Details of the role: 8 to 10 years experience as Informatica Admin (IICS)
Key responsibilities:
- Understand the program's service catalog and document the list of tasks to be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows; ensure that documentation is kept up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability.
- Maintain detailed documentation of data engineering processes and solutions.
Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.
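The responsibilities above include implementing "data quality checks and validation processes" before loading. In Informatica these are configured as data quality rules; as a hedged, tool-agnostic sketch, the same idea in plain Python — null checks on required columns plus a duplicate check on a key column:

```python
def quality_checks(rows, required_cols, unique_col):
    # Null-check required columns and flag duplicate key values, mirroring
    # the validation step a pipeline runs before loading to the warehouse.
    issues = []
    for i, row in enumerate(rows):
        for col in required_cols:
            if row.get(col) in (None, ""):
                issues.append(f"row {i}: null in '{col}'")
    seen, dups = set(), set()
    for row in rows:
        val = row.get(unique_col)
        if val in seen:
            dups.add(val)
        else:
            seen.add(val)
    for val in sorted(dups):
        issues.append(f"duplicate {unique_col}: {val}")
    return issues

# Toy batch: second row has an empty name and a repeated id.
rows = [
    {"id": 1, "name": "a"},
    {"id": 1, "name": ""},
]
problems = quality_checks(rows, required_cols=["name"], unique_col="id")
```

A real deployment would route failing rows to a quarantine table and surface the issue list through monitoring rather than returning it inline.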

Posted 1 week ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced Senior Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical and problem-solving skills.
Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and implement process improvements to increase efficiency and productivity.
- Analyze complex data sets to inform business decisions and drive growth.
- Provide expert-level support for data analysis and reporting.
- Identify and mitigate risks associated with data quality and integrity.
- Develop and maintain technical documentation for processes and procedures.
Job Requirements:
- Strong understanding of IT Services & Consulting industry trends and technologies.
- Excellent analytical and problem-solving skills, with attention to detail.
- Ability to work collaboratively in a fast-paced environment.
- Strong communication and interpersonal skills, with the ability to present complex ideas simply.
- Experience with data analysis and reporting tools, such as Excel or SQL.
- Ability to adapt to changing priorities and deadlines in a dynamic environment.
About Company: eClerx Services Ltd. is a leading provider of IT Services & Consulting solutions, committed to delivering exceptional results and exceeding client expectations.

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Mumbai

Work from Office

We are looking for a highly skilled Data Scientist with 7 to 10 years of experience to join our team as a Senior Process Manager at eClerx Services Ltd., located in Mumbai.
Roles and Responsibilities:
- Develop and implement data-driven solutions to enhance business processes.
- Collaborate with cross-functional teams to identify areas for improvement and optimize workflows.
- Design and maintain databases and data systems to support business intelligence initiatives.
- Analyze complex data sets to inform strategic decisions and drive business growth.
- Develop and maintain reports and dashboards to track key performance indicators.
- Ensure data quality and integrity by implementing data validation and testing procedures.
Job Requirements:
- Strong understanding of data analysis, machine learning, and statistical modeling techniques.
- Experience with data visualization tools such as Tableau or Power BI.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Strong communication and collaboration skills to work effectively with stakeholders.
- Ability to design and implement process improvements to increase efficiency and productivity.
- Strong knowledge of data management principles and practices.

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Chennai

Work from Office

Oracle Master Data Management
Role Purpose: The Device Launch Readiness Team in the client's Xfinity Mobile, under Technology & Product - Wireless Technologies & New Business, manages master data for mobile devices, mobile device accessories, packaging, and Xfinity home products. The person in this position will support the SKU Lifecycle Management process.
Core Responsibilities:
- Manage device and accessories master data.
- Write complex SQL to query large data platforms for analysis.
- Perform queries and create new datasets.
- Analyze and package data to create/update records.
- Clean and parse data, and make it available for groups of users.
- Deep dive into data to understand business drivers/problems.
- Update Jira for completed activities and report to users/manager.
- Support during prod and stage migration.
General Skillsets:
- 3-5 years of experience in RDBMS.
- Working experience in the mobile device/service domain.
- Knowledge of mobile business acronyms.
- Advanced Excel skills, including macros, VLOOKUP, and formula accuracy.
Other Expectations:
- Understand our Comcast Operating Principles; make them the guidelines for how you do your job.
- Know your stuff - be enthusiastic learners, users, and advocates of our game-changing technology, products, and services, especially our digital tools and experiences.
- Win as a team - make big things happen by working together and being open to new ideas.
- Drive results and growth.
Mandatory Skills: Oracle Master Data Management - MDM. Experience: 3-5 years.
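The responsibilities above center on querying SKU master data with SQL to analyze records and create new datasets. A small sketch of that workflow using Python's built-in sqlite3 as a stand-in for the real data platform; the table, column names, and SKU values are invented for illustration:

```python
import sqlite3

# In-memory SQLite stands in for the production master-data platform.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE devices (sku TEXT, status TEXT, launch_qtr TEXT)")
conn.executemany(
    "INSERT INTO devices VALUES (?, ?, ?)",
    [
        ("PHN-100", "active", "Q1"),
        ("ACC-200", "retired", "Q1"),
        ("PHN-300", "active", "Q2"),
    ],
)

# Analysis query: count active SKUs per launch quarter, the kind of
# aggregation used to create a new dataset for stakeholders.
rows = conn.execute(
    "SELECT launch_qtr, COUNT(*) FROM devices "
    "WHERE status = 'active' GROUP BY launch_qtr ORDER BY launch_qtr"
).fetchall()
```

Against the real platform, the same query shape runs through the database's own client; only the connection layer changes.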

Posted 1 week ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Work with Us. Change the World. At AECOM, we're delivering a better world. Whether improving your commute, keeping the lights on, providing access to clean water, or transforming skylines, our work helps people and communities thrive. We are the world's trusted infrastructure consulting firm, partnering with clients to solve the world’s most complex challenges and build legacies for future generations. There has never been a better time to be at AECOM. With accelerating infrastructure investment worldwide, our services are in great demand. We invite you to bring your bold ideas and big dreams and become part of a global team of over 50,000 planners, designers, engineers, scientists, digital innovators, program and construction managers, and other professionals delivering projects that create a positive and tangible impact around the world. We're one global team driven by our common purpose to deliver a better world. Join us. AECOM is seeking a Graduate Environmental Data Specialist with 2+ years of experience to support our enterprise environmental data management system (EarthSoft EQuIS). The ideal candidate will have a strong understanding of environmental data and terminology, good communication skills, and the ability to collaborate with both technical and non-technical stakeholders. This position offers a hybrid work arrangement, including both office and remote work schedules, and will be based in our office in Bengaluru, India. This role includes, but is not limited to, the following activities.
Role and Responsibilities: The ideal candidate will be able to understand requests from environmental subject matter experts, be a good communicator able to share new functions and features with users, and have a good understanding of environmental data and environmental data terminology.
- Works on issues of diverse scope where analysis of situations or data requires evaluation of a variety of factors, including an understanding of current business trends.
- Prepare and update environment-related reports, grounded in a sound understanding of environmental data; transform and analyze large and diversified environmental datasets.
- Ability to translate environmental problems into digital and data solutions.
- Commitment to data quality at all levels and scales.
- Experience in developing custom reports and user-requested queries and views on various platforms of the desired skill set.
- Responsive to client (user) requests.
- Excellent communication skills.
- Provide technical support to field sampling teams and act as a liaison between project staff, the analytical laboratory, the data validator, and GIS analysts.
- Research state and federal regulations necessary to manage action levels or clean-up criteria.
Professional Qualification & Experience Desired: Bachelor's degree in environmental/civil/chemical engineering or science in a related discipline (or similar subject), desirably with a focus on environmental data, and 2+ years of experience working in the environmental domain, preferably with relevant experience with environmental data.
Skills Required:
- Ability to understand data management, using excellent computer skills to perform transformations in spreadsheets and databases.
- Expertise and experience with environmental data and database systems (MS SQL Server, MS Access).
- Expertise with relational databases such as EarthSoft's Environmental Quality Information System (EQuIS™)/EIM/ESdat.
- Ability to continually analyze data at all stages for problems, logic, and consistency concerning field data collection and analytical reporting; expertise with EQuIS sub-tools (Collect, EDGE, ArcGIS) is highly desirable but not essential.
- Assist projects globally, delivering tasks with high quality and within deadlines.
- Managing data (geological, field data, chemical laboratory data) for technical report writing and interpretation as required by the team.
- Maintaining and updating various project dashboards using the web-based EQuIS Enterprise™ system, and preparing report-ready data tables, charts, and figures for internal review and external client reports.
- Use of visualization tools like Power BI to help management make effective decisions for the environmental domain is desirable but not essential.
- Programming and/or coding experience (e.g., Python, R) a plus.
- Data engineering, AI/ML, and data science understanding is highly desirable but not essential; this can come from either academic or work experience.
- Intermediate to expert-level understanding of Office 365, Excel, Power Query, and Power Automate.
- Strong attention to detail with excellent analytical, judgment, and problem-solving capabilities.
- Comfortable running meetings and presentations.
- Strong written and oral communication skills.
Preferred: Master's degree in environmental/civil/chemical engineering or science in a related discipline (or similar subject), desirably with a focus on environmental data, and a minimum of 2-5 years of experience working in the environmental domain, preferably with relevant experience with environmental data.
Additional Information
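The responsibilities above include researching regulatory action levels and analyzing lab results against them. As a hedged sketch of that screening step (the analytes, threshold values, and sample IDs below are hypothetical; real action levels vary by jurisdiction and program):

```python
# Hypothetical action levels in µg/L; real thresholds come from the
# applicable state/federal regulation, not from this table.
ACTION_LEVELS = {"lead_ug_l": 15.0, "arsenic_ug_l": 10.0}

def flag_exceedances(samples):
    # Compare each lab result against its action level and collect
    # exceedances, the kind of screening an EQuIS workflow reports.
    flags = []
    for s in samples:
        for analyte, limit in ACTION_LEVELS.items():
            value = s.get(analyte)
            if value is not None and value > limit:
                flags.append((s["sample_id"], analyte, value))
    return flags

# Toy monitoring-well results; MW-02 has no arsenic result reported.
samples = [
    {"sample_id": "MW-01", "lead_ug_l": 22.4, "arsenic_ug_l": 3.1},
    {"sample_id": "MW-02", "lead_ug_l": 4.0},
]
exceedances = flag_exceedances(samples)
```

In practice this check runs inside the data management system against validated lab deliverables, with the flagged rows feeding the report-ready exceedance tables mentioned above.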

Posted 1 week ago

Apply