
709 Normalization Jobs - Page 2

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description
Role: Team Leader - Service Desk
Location: Pune/Bangalore
Job Summary: Candidates must have a minimum of 6 years of Service Desk experience, including at least 2 years in a front-line leadership/management role. We are looking for candidates with domain expertise in End User Support Services who are skilled in technical troubleshooting and delivery operations management. A passport is mandatory; a US business visa (B1) is an advantage.
Years of experience needed: 5-8 years
Technical Skills: Analytical skills; effective business communication; coaching skills; operations management; SLA management; MS Office; operational knowledge of contact center platform and ITSM tool; performance management skills; conflict management skills; capacity management; presentation skills; training need identification.
Skill Levels: Client Technical Service Awareness – Intermediate; Technical Troubleshooting - Account Management/Password Reset – Advanced; Technical Troubleshooting - OS – Advanced; Technical Troubleshooting - End Devices – Advanced; Ticketing Tool – Advanced; MS Office – Intermediate; Contact center platform operating skills – Intermediate; Contact center platform reports – Intermediate; Networking concepts – Intermediate; Client Process Knowledge – Advanced; DMAIC Methodology – Intermediate; Client Business Awareness – Advanced; Telephone etiquette – Expert; Email etiquette – Expert; Customer service skills – Expert; Knowledge Base Navigation Skills – Advanced; Analytical skills – Intermediate; Operations Management – Advanced; SLA Management – Intermediate; Effective Business Communication – Advanced; Decision Making Skills – Advanced; Measuring Performance/Performance Management Skills – Advanced; Coaching for Success – Advanced; Motivating Others – Advanced; Conflict Management Skills – Advanced; Patience – Advanced; Managing Stress – Advanced; Positive attitude to change – Advanced; Attitude to feedback/willingness to learn – Advanced; Relating to Others – Advanced; Influencing Others – Advanced; Team Player – Advanced; Insight into the Customer's Mindset – Advanced; Solution-Based Approach – Advanced; Follow-Through – Advanced; Personal Credibility – Advanced; Self-Development – Intermediate; Result Focus – Intermediate; Drive to Win – Intermediate; Recognize Efforts – Advanced; Approachability – Advanced; Dealing with Fairness – Expert; Fostering Teamwork – Advanced.
Management Skills: Supervise and review Service Desk activities. Review and ensure compliance with standards such as PCI, ISO, ISMS and BCMS by facilitating audits by internal and external teams. Place hiring requests and conduct interviews. Work with HR and support groups to improve employee retention and satisfaction. Provide in-person feedback to reporting agents on a daily basis regarding ticket hygiene and operational/procedural hygiene. Perform root cause analysis, tracking and reporting of escalations and SLA misses. Attend change meetings and analyze potential impact to Service Desk operations. Conduct performance appraisals and normalization. Participate in calibration and collaboration meetings with support function leads. Conduct new-hire technical and account-specific training based on requirements. Create, maintain, and update the account training plan. Provide hands-on assistance to team members in case of issues, both through direct intervention and mentoring. Prepare scorecards and discuss and share feedback around improvement areas. Identify top performers and nominate them for rewards, recognition and appreciation. Monitor ticket-ageing reports and drive team members to work on ageing tickets. Perform FCR analysis to identify controllable resolution errors that could have been resolved at L1.
Behavioral Skills: Good communication; positive energy; positive attitude; self-learner.
Qualification: Any graduate.
Certification: ITIL certified.
About Mphasis: Mphasis applies next-generation technology to help enterprises transform businesses globally. Customer centricity is foundational to Mphasis and is reflected in the Mphasis Front2Back™ Transformation approach. Front2Back™ uses the exponential power of cloud and cognitive to provide hyper-personalized (C=X2C2TM=1) digital experience to clients and their end customers. Mphasis' Service Transformation approach helps 'shrink the core' through the application of digital technologies across legacy environments within an enterprise, enabling businesses to stay ahead in a changing world. Mphasis' core reference architectures and tools, speed and innovation with domain expertise and specialization are key to building strong relationships with marquee clients.

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Source: LinkedIn

Role Overview: We are looking for a highly skilled and experienced ServiceNow professional (7+ years) to join our freelance technical interview panel . As a Panelist, you’ll play a critical role in assessing candidates for ServiceNow Developer, Admin, and Architect roles by conducting deep technical interviews and evaluating hands-on expertise, problem-solving skills, and platform knowledge. This is an excellent opportunity for technically strong freelancers who enjoy sharing their expertise, influencing hiring decisions, and working flexible hours remotely. Key Responsibilities: Conduct live technical interviews and evaluations over video calls (aligned to EST hours) Assess candidates’ practical expertise in: Core ServiceNow modules (ITSM, CMDB, Discovery, Incident/Change/Problem) Custom application development & configuration Client/Server-side scripting (JavaScript, Business Rules, UI Policies, Script Includes) Integrations (REST/SOAP APIs, Integration Hub) Flow Designer, Service Portal, ACLs, ATF, and CI/CD practices Review coding tasks and scenario-based architecture questions Provide detailed, structured feedback and recommendations to the hiring team Collaborate on refining technical evaluation criteria if needed Required Skills & Experience (Advanced Technical Expertise): 10+ years of extensive hands-on experience with the ServiceNow platform in enterprise-grade environments Strong command over ServiceNow Core Modules : ITSM, ITOM, CMDB, Asset & Discovery, Incident/Change/Problem/Knowledge Management Proven expertise in custom application development using scoped apps, App Engine Studio, and Now Experience UI Framework Deep proficiency in ServiceNow scripting , including: Server-side : Business Rules, Script Includes, Scheduled Jobs, GlideRecord, GlideAggregate Client-side : UI Policies, Client Scripts, UI Actions, GlideForm/GlideUser APIs Middleware logic for cross-platform communication and custom handlers Experience implementing Access Control Lists (ACLs) with dynamic filters and condition-based restrictions Expert in Service Portal customization using AngularJS widgets, Bootstrap, and custom REST endpoints Proficient in Integration Hub , Custom REST/SOAP APIs , OAuth 2.0 authentication, MID Server integrations, external system integration (e.g., SAP, Azure, Jira, Dynatrace, etc.) Hands-on with Flow Designer , Orchestration , and Event Management Expertise in ServiceNow CMDB , CI Class modeling, reconciliation rules, identification/normalization strategies, and dependency mappings Familiarity with ServiceNow Performance Tuning : Scheduled Jobs optimization, lazy loading, database indexing, client/server execution efficiency Working knowledge of Automated Test Framework (ATF) and integration with CI/CD pipelines (Jenkins, Git, Azure DevOps) Understanding of ServiceNow DevOps , version control, scoped app publishing, and update set migration best practices Knowledge of Security Operations (SecOps) and Governance, Risk & Compliance (GRC) is a plus Experience guiding architectural decisions, governance models, and platform upgrade strategies Prior experience conducting technical interviews, design evaluations , or acting as a technical SME/panelist Excellent communication and feedback documentation skills — able to clearly explain technical rationale and candidate assessments Comfortable working independently and engaging with global stakeholders during USA EST hours (after 8 PM IST)
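For context on the kind of hands-on integration work such a panel typically probes, below is a minimal, illustrative Python sketch that queries the ServiceNow Table API over REST. The instance URL, credentials, and query are placeholders, not details from this posting.

```python
import requests

# Minimal sketch: query open incidents through the ServiceNow Table API.
# The instance name and credentials below are placeholders, not a real environment.
INSTANCE = "https://example.service-now.com"
AUTH = ("api.user", "api.password")  # basic auth; OAuth 2.0 is also supported

def fetch_open_incidents(limit=10):
    """Return a list of open incident records as dictionaries."""
    resp = requests.get(
        f"{INSTANCE}/api/now/table/incident",
        auth=AUTH,
        headers={"Accept": "application/json"},
        params={
            "sysparm_query": "active=true^state=1",  # encoded query: active records in state New
            "sysparm_fields": "number,short_description,priority",
            "sysparm_limit": limit,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    for incident in fetch_open_incidents():
        print(incident["number"], incident["short_description"])
```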

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description Job summary Amazon.com’s Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites. As such, BRP designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com. The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will need to collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions and proactively present new insights in a concise and effective manner. The candidate should be an effective communicator capable of independently driving issues to resolution and communicating insights to non-technical audiences. This is a high-impact role with goals that directly impact the bottom line of the business. Key job responsibilities Key job responsibilities Analyze terabytes of data to define and deliver on complex analytical deep dives to unlock insights and build scalable solutions through Data Science to ensure security of Amazon’s platform and transactions Build Machine Learning and/or statistical models that evaluate the transaction legitimacy and track impact over time Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to Product and Tech teams Develop efficient data querying infrastructure for both offline and online use cases Collaborate with cross-functional teams from multidisciplinary science, engineering and business backgrounds to enhance current automation processes Learn and understand a broad range of Amazon’s data resources and know when, how, and which to use and which not to use. Maintain technical document and communicate results to diverse audiences with effective writing, visualizations, and presentations Basic Qualifications 3+ years of data querying languages (e.g. SQL), scripting languages (e.g. Python) or statistical/mathematical software (e.g. R, SAS, Matlab, etc.) 
experience. 2+ years of data scientist experience. 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance. Experience applying theoretical models in an applied environment. Preferred Qualifications Experience in Python, Perl, or another scripting language Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2998274
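As a small, illustrative sketch of the normalization step mentioned in the data-processing responsibilities above, the following Python snippet applies min-max scaling to a couple of hypothetical transaction features with pandas; the column names and values are invented for the example.

```python
import pandas as pd

# Hypothetical transaction features; column names and values are illustrative only.
df = pd.DataFrame({
    "order_amount": [12.0, 250.0, 18.5, 990.0],
    "account_age_days": [3, 1200, 45, 730],
})

def min_max_normalize(frame: pd.DataFrame) -> pd.DataFrame:
    """Scale every numeric column into [0, 1] so features are comparable."""
    return (frame - frame.min()) / (frame.max() - frame.min())

normalized = min_max_normalize(df)
print(normalized.round(3))
```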

Posted 2 days ago

Apply

5.0 years

5 - 9 Lacs

Hyderābād

On-site

Source: Glassdoor

Hyderabad, Telangana | Job ID 30162733 | Job Category: Digital Technology
Job Title: Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time
Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.
Role Description: You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software development, including coding, troubleshooting, deployment, and ongoing maintenance.
Role Responsibilities: Design, develop, and implement SQL Server databases based on business requirements and best practices. Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access. Ensure data integrity, security, and performance through proper database design and normalization techniques. Analyze query execution plans and performance metrics to identify and address performance bottlenecks. Implement indexing strategies and database optimizations to improve query performance. Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases. Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference. Provide training and support to end users on SQL Server best practices, database performance optimization techniques, and Power BI usage.
Minimum Requirements: BTech degree in Computer Science or a related discipline; MTech degree preferred. Assertive communication and strong analytical, problem-solving, debugging, and leadership skills. Experience with source control tools such as Bitbucket and/or Git. Good hands-on experience diagnosing performance bottlenecks, wait stats, SQL query monitoring, review, and optimization strategies. Ability to create normalized and highly scalable logical and physical database designs and switch between database technologies such as Oracle, SQL Server, and Elastic databases. 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization. 5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions. Excellent understanding of Snowflake and other data warehouses. Experience in designing and hands-on development of cloud-based analytics solutions. Understanding of AWS storage services and AWS Cloud Infrastructure offerings. Experience designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.
Benefits: We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave Policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
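As an illustrative sketch of the schema design, normalization, and indexing responsibilities listed above, the following Python snippet builds a small normalized model with an index; SQLite is used here purely as a lightweight stand-in for SQL Server, and all table and column names are hypothetical.

```python
import sqlite3

# Sketch of a normalized order model with an index on the lookup column.
# SQLite stands in for SQL Server; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT NOT NULL UNIQUE
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        order_date  TEXT NOT NULL,
        total       REAL NOT NULL
    );
    -- Index the foreign key used by the most common join/filter.
    CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.execute("INSERT INTO customers VALUES (1, 'Asha', 'asha@example.com')")
conn.execute("INSERT INTO orders VALUES (100, 1, '2025-06-25', 4200.0)")
rows = conn.execute("""
    SELECT c.name, o.order_id, o.total
    FROM orders AS o JOIN customers AS c USING (customer_id)
    WHERE o.customer_id = 1
""").fetchall()
print(rows)
```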

Posted 2 days ago

Apply

2.0 years

0 Lacs

India

On-site

Source: Glassdoor

Job Title: DBMS Trainer Location: Hyderabad, Telangana Experience Required: Minimum 2 years in database development or training Employment Type: Full-Time, Onsite Job Summary: We are seeking a dynamic and experienced DBMS Trainer to join our team in Hyderabad. The ideal candidate will have a strong background in database systems, both relational and NoSQL, and a passion for mentoring and training aspiring database professionals. You will be responsible for delivering engaging, interactive, and industry-relevant training sessions on core database concepts, administration, optimization, and real-world applications. Key Responsibilities: Curriculum Development: Design, develop, and maintain comprehensive training modules covering SQL (MySQL, PostgreSQL, Oracle) , NoSQL (MongoDB, Cassandra) , database design, normalization, indexing, backup/recovery strategies, and data modeling. Training Delivery: Conduct engaging, in-person classroom and lab sessions on SQL querying, stored procedures, transactions, optimization, security best practices, and cloud DBMS concepts. Hands-On Workshops: Facilitate practical, real-world exercises including schema design , performance tuning , backup/recovery , and managing unstructured data scenarios. Mentorship & Assessment: Evaluate learners through quizzes, assignments, and capstone projects. Provide continuous feedback, interview preparation, and career counseling support. Content Updating: Regularly update course content to reflect industry advancements , including cloud databases, big data integrations, and emerging DBMS technologies. Lab & Tool Management: Set up, manage, and troubleshoot training environments (both on-premises and cloud-based), and work closely with technical teams to ensure seamless training delivery. Required Qualifications: Bachelor's degree in Computer Science, IT, ECE , or a related field. Minimum 2 years of hands-on experience in database development, administration, or technical training roles. Technical Skills: SQL Databases: MySQL, PostgreSQL, Oracle (queries, joins, transactions, stored procedures) NoSQL Databases: MongoDB, Cassandra (document modeling, indexing) Database Design & Administration: ER modeling, normalization, indexing, backup & recovery, security management Performance Tuning: Query optimization, indexing strategies, monitoring and logging tools Data Modeling: Relational and unstructured/NoSQL data structures Basic Cloud DBMS: Familiarity with AWS RDS, Azure SQL, Firebase/Firestore Version Control & Scripting: Git, basic shell/SQL scripts for automation Communication & Mentoring: Strong presentation, troubleshooting, and feedback skills Preferred Extras: Certifications such as Oracle OCA , AWS/Azure database certifications , MongoDB Certified Developer Experience with big data tools (Hive, Spark SQL) or cloud-native data platforms Experience using Learning Management Systems (LMS) and e-learning platforms
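A classroom-style sketch of the indexing topic in this curriculum might look like the following; SQLite is used as a neutral stand-in so the example runs anywhere, and the schema is invented for illustration. It shows how adding an index changes the query plan for an equality lookup.

```python
import sqlite3

# Training-style sketch: show how adding an index changes the query plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, email TEXT, batch TEXT)")
conn.executemany(
    "INSERT INTO students (email, batch) VALUES (?, ?)",
    [(f"user{i}@example.com", f"batch-{i % 10}") for i in range(1000)],
)

def plan(sql):
    """Return the query plan rows for the given SELECT statement."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()

query = "SELECT id FROM students WHERE email = 'user500@example.com'"
print("Before index:", plan(query))   # full table scan
conn.execute("CREATE INDEX idx_students_email ON students(email)")
print("After index: ", plan(query))   # search using the index
```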

Posted 2 days ago

Apply

2.0 - 3.0 years

15 Lacs

India

Remote

Source: Glassdoor

We are seeking a skilled and detail-oriented PostgreSQL Database Developer & Designer to join our team. The ideal candidate will be responsible for designing, developing, optimizing, and maintaining scalable and secure PostgreSQL databases that support our application and business needs. Key Responsibilities: Design and develop efficient and scalable database schemas, tables, views, indexes, and stored procedures Develop and optimize complex SQL queries , functions, and triggers in PostgreSQL Perform data modeling and create ER diagrams to support business logic and performance Work closely with application developers to design and implement data access patterns Monitor database performance and tune queries for high availability and efficiency Maintain data integrity, quality, and security across all environments Develop and manage ETL processes, migrations, and backup strategies Assist in database version control and deployment automation Troubleshoot and resolve database-related issues in development and production Required Skills & Qualifications: Minimum 2–3 years of experience in PostgreSQL database development and design Strong understanding of relational database design principles , normalization, and indexing Proficient in writing complex SQL queries , functions, stored procedures, and performance tuning Experience with data modeling tools (e.g., pgModeler, dbdiagram.io, ER/Studio) Familiarity with database version control (e.g., Liquibase, Flyway) Solid understanding of PostgreSQL internals , query planner, and performance optimization techniques Knowledge of data security , encryption, and compliance standards Strong problem-solving skills and attention to detail Nice to Have (Pluses): Experience with cloud databases (e.g., Amazon RDS for PostgreSQL, Google Cloud SQL, Azure Database for PostgreSQL) Familiarity with NoSQL or hybrid data architectures Exposure to Kafka , RabbitMQ , or other message brokers Experience working in Agile/Scrum teams Knowledge of CI/CD pipelines for database deployments Understanding of data warehousing and analytics/reporting workflows What We Offer: Competitive compensation package Opportunity to work on high-impact systems and large-scale databases Collaborative team environment with growth and learning opportunities Remote-friendly and flexible work schedule Job Type: Full-time Pay: ₹1,500,000.00 per year Benefits: Health insurance Schedule: Day shift Experience: PostgreSQL: 5 years (Required) SQL: 5 years (Required) Work Location: In person Application Deadline: 05/07/2025 Expected Start Date: 01/08/2025
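As an illustrative sketch of this kind of PostgreSQL development work, the snippet below creates a table with a composite index and inspects the planner's choice for a typical query; the connection string, schema, and query are placeholders and assume a reachable PostgreSQL instance with psycopg2 installed.

```python
import psycopg2

# Sketch only: the DSN is a placeholder and assumes a reachable PostgreSQL instance.
conn = psycopg2.connect("dbname=appdb user=app password=secret host=localhost")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events (
            event_id   BIGSERIAL PRIMARY KEY,
            user_id    BIGINT      NOT NULL,
            event_type TEXT        NOT NULL,
            created_at TIMESTAMPTZ NOT NULL DEFAULT now()
        )
    """)
    # Composite index chosen for the most common access pattern.
    cur.execute(
        "CREATE INDEX IF NOT EXISTS idx_events_user_time "
        "ON events (user_id, created_at DESC)"
    )
    # Inspect the planner's choice for a typical query.
    cur.execute(
        "EXPLAIN SELECT * FROM events WHERE user_id = %s "
        "ORDER BY created_at DESC LIMIT 20",
        (42,),
    )
    for line in cur.fetchall():
        print(line[0])

conn.close()
```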

Posted 2 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description: We are seeking a skilled Data Engineer with strong experience in Python, Snowflake, and AWS. The ideal candidate will be responsible for building and optimizing scalable data pipelines, integrating diverse data sources, and supporting analytics and business intelligence solutions in a cloud environment. A key focus will include designing and managing AWS Glue Jobs and enabling efficient, serverless ETL workflows. Key Responsibilities: Design and implement robust data pipelines using AWS Glue, Lambda, and Python. Work extensively with Snowflake for data warehousing, modelling, and analytics support. Manage ETL/ELT jobs using AWS Glue and ensure end-to-end data reliability. Migrate data between CRM systems, especially from Snowflake to Salesforce, following defined business rules and ensuring data accuracy. Optimize SQL/SOQL queries, handle large volumes of data and maintain high levels of performance. Implement data normalization and data quality checks to ensure accurate, consistent, and deduplicated records. Required Skills: Strong programming skills in Python. Hands-on experience with Snowflake Data Warehouse. Proficiency in AWS services: Glue, S3, Lambda, Redshift, CloudWatch. Experience with ETL/ELT pipelines and data integration using AWS Glue Jobs. Proficient in SQL and SOQL for data extraction and transformation. Understanding of data modelling, normalization, and performance optimization. Nice to Have: Familiarity with Salesforce Data Loader, ETL mapping, and metadata-driven migration. Experience with CI/CD tools, DevOps, and version control (e.g., Git). Worked in Agile/Scrum environments.
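A hedged sketch of triggering and monitoring an AWS Glue job from Python with boto3 is shown below; the job name, arguments, and region are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import time
import boto3

# Sketch: start a Glue ETL job and poll its status.
# "snowflake-to-salesforce-sync" is a placeholder job name; the region and
# job arguments are illustrative, and AWS credentials come from the environment.
glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(
    JobName="snowflake-to-salesforce-sync",
    Arguments={"--target_env": "staging"},
)
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(
        JobName="snowflake-to-salesforce-sync", RunId=run_id
    )["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```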

Posted 2 days ago

Apply

2.0 years

3 - 5 Lacs

Noida

On-site

Source: Glassdoor

Position: Web Developer We are looking for a highly skilled Web Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks. Key Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements Develop and maintain high-quality, efficient, and well-documented code Troubleshoot and resolve technical issues Implement Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Work with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Develop and maintain database PL/SQL, stored procedures, and triggers Requirements: 2+ years of experience in web-based project development using PHP Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana Strong knowledge of Object-Oriented PHP, Curl, Ajax, Prototype.Js, JQuery, Web services, Design Patterns, MVC architecture, and Object-Oriented Methodologies Experience with RDBMS design, normalization, Data modelling, Transactions, and distributed databases Well-versed with RDBMS MySQL (can work with other SQL flavors too) Job Type: Full-time Pay: ₹25,000.00 - ₹45,000.00 per month Work Location: In person

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Job Description Position and Department Details Role: Asst.Manager (Onroll) Department: Operations Hub, Data & Capability (GIX-Intelligence). Job Role: Lean Activities: Operational Excellence: Identify and implement improvement opportunities to enhance quality, reduce latency, optimize costs, and mitigate risks, driving overall efficiency and effectiveness of output. Process Governance and Compliance: Oversee and ensure that all processes are accurately documented, up-to-date, and aligned with standard operating procedures (SOPs), guaranteeing consistency and adherence to established protocols. Process Optimization and Analysis: Conduct thorough analyses of existing processes, leveraging tools such as Value Stream Mapping (VSM) and Failure Mode Effect Analysis (FMEA) to identify areas for improvement and inform data-driven decision-making. Capability Development and Training: Design and deliver training programs for employees on Lean methodologies and tools, including Root Cause Analysis (RCA), FMEA, and other relevant techniques, to enhance skills and knowledge, and foster a culture of continuous improvement. DQM Activities: Data Quality Monitoring: Develop and implement data quality monitoring processes to identify and track data quality issues, including data validation. Data Quality Reporting: Create and maintain data quality reports to track and analyze data quality metrics, including data accuracy, completeness, and consistency. Data Quality Issue Resolution: Collaborate with stakeholders to identify and resolve data quality issues, including root cause analysis and implementation of corrective actions. Data Quality Process Development: Develop and maintain data quality processes and procedures, including data validation rules, data cleansing procedures, and data normalization standards. Stakeholder Management: Communicate data quality issues and resolutions to stakeholders, including business users, data analysts, and IT teams. Process Improvement: Continuously monitor and improve data quality processes and procedures to ensure they are efficient, effective, and aligned with business needs. Compliance: Ensure data quality processes and procedures comply with regulatory requirements, including data privacy and data security regulations. Training and Development: Provide training and development opportunities to data quality team members to ensure they have the necessary skills and knowledge to perform their jobs effectively. Special Projects: Participate in special projects, including data quality assessments, data quality audits, and data quality improvement initiatives Basic Qualification: Graduate/Masters (preferably business/commerce background) with at least 4 to 6 years of experience in lean practice. Excellent working knowledge of advanced MS Excel, MS Word and MS PowerPoint, MS outlook. Good communications skills and experience in handling senior stakeholders. Certification: Lean Six Sigma Green Belt Certification is must. Preferable: Lean Six Sigma Black Belt Certified. Expectations: The individual should be a quick learner, diligent and efficient in timely completion of tasks assigned The individual should be able to think independently, logically, and critically assess the requirement and ensure troubleshooting and solutions The individual should be able to multi-task and handle multiple activities at a time The individual should have attention to detail and should be solution oriented.
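As a small illustration of the completeness and consistency metrics described in the DQM activities above, the following Python sketch computes a simple data-quality report with pandas; the dataset and validation rules are hypothetical.

```python
import pandas as pd

# Hypothetical records; the validation rules below are illustrative only.
records = pd.DataFrame({
    "customer_id": [101, 102, None, 104],
    "country":     ["IN", "in", "US", None],
    "amount":      [1200.0, -50.0, 300.0, 80.0],
})

def data_quality_report(df: pd.DataFrame) -> dict:
    """Return simple completeness, validity, and consistency metrics."""
    return {
        "completeness_customer_id": df["customer_id"].notna().mean(),
        "completeness_country": df["country"].notna().mean(),
        "valid_amount_ratio": (df["amount"] >= 0).mean(),
        "distinct_countries_after_normalization": df["country"].dropna().str.upper().nunique(),
        "row_count": len(df),
    }

print(data_quality_report(records))
```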

Posted 2 days ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Work from Office

Source: Naukri

We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data. Key Responsibilities: Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks. Develop and manage data models and data warehouse architecture within Snowflake . Create and maintain DBT models for transformation, lineage tracking, and documentation. Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation. Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements. Ensure data integrity, consistency, and governance across all stages of the data lifecycle. Monitor pipeline performance and implement optimization strategies for queries and storage. Follow best practices for data engineering including version control (Git), testing, and CI/CD integration. Required Skills and Qualifications: 8+ years of experience in Data Engineering or related roles. Deep expertise in Snowflake : schema design, performance tuning, security, and access controls. Proficiency in Python , particularly for scripting, data transformation, and workflow automation. Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization). Proven experience with DBT for building modular, tested, and documented data pipelines. Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect . Advanced SQL skills with experience handling large and complex data sets. Exposure to cloud platforms such as AWS , Azure , or GCP and their data services. Preferred Qualifications: Experience implementing data quality checks and governance frameworks. Understanding of modern data stack and CI/CD pipelines for data workflows. Contributions to data engineering best practices, open-source projects, or thought leadership.
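As an illustrative sketch of working with Snowflake from Python (for example, a post-load validation a pipeline might run after a dbt build), the snippet below uses the snowflake-connector-python package; the account, credentials, and object names are placeholders.

```python
import snowflake.connector

# Sketch only: account, credentials, and object names are placeholders.
conn = snowflake.connector.connect(
    account="xy12345.us-east-1",
    user="DATA_ENGINEER",
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    # Typical post-load validation a pipeline might run after a dbt build.
    cur.execute(
        "SELECT COUNT(*) FROM fct_orders "
        "WHERE order_date >= DATEADD(day, -1, CURRENT_DATE)"
    )
    (recent_rows,) = cur.fetchone()
    print("Rows loaded in the last day:", recent_rows)
finally:
    conn.close()
```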

Posted 2 days ago

Apply

8.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

As Associate Manager, Data Engineering, you will: Lead the team of Data Engineers and develop innovative approaches to performance optimization and automation. Analyze enterprise specifics to understand the current-state data schema and data model, and contribute to defining the future-state data schema, data normalization, and schema integration as required by the project. Apply coding expertise, best practices, and guidance in Python, SQL, Informatica, and cloud data platform development to members of the team. Collaborate with clients to harden, scale, and parameterize code so it is scalable across brands and regions. Understand business objectives and develop business intelligence applications that help to monitor and improve critical business metrics. Monitor project timelines, ensuring deliverables are being met by team members. Communicate frequently with stakeholders on project requirements, statuses, and risks. Manage the monitoring of productionized processes to ensure pipelines are executed successfully every day, communicating delays as required to stakeholders. Contribute to the design of scalable data integration frameworks to move and transform a variety of large data sets. Develop robust work products by following best practices through all stages of development, testing, and deployment.
Skills and Qualifications: BTech/master's degree in a quantitative field (statistics, business analytics, computer science). Team management experience is a must. 8-10 years of experience (with at least 2-4 years of experience managing a team). Vast background in all things data related. Intermediate level of proficiency with Python and data-related libraries (PySpark, Pandas, etc.). High level of proficiency with SQL (Snowflake a big plus); Snowflake is REQUIRED, as we need someone with a high level of Snowflake experience, and certification is a big plus. AWS data platform development experience. High level of proficiency with data warehousing and data modeling. Experience with ETL tools (Informatica, Talend, DataStage) required; Informatica is our tool and is required, with IICS or PowerCenter accepted. Ability to coach team members, setting them up for success in their roles. Capable of connecting with team members and inspiring them to be their best.
The Yum! Brands story is simple. We have the four distinctive, relevant and easy global brands – KFC, Pizza Hut, Taco Bell and The Habit Burger Grill – born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world’s largest restaurant company we have a clear and compelling mission: to build the world’s most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We’re looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, and click-and-collect services and consumer data analytics, creating unique customer dining experiences – and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role.
Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world and KFC’s still use its 75-year-old finger lickin’ good recipe including secret herbs and spices to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL, Louisville KY, Irvine, CA, Plano, TX and other markets around the world. We don’t just say we are a great place to work – our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of “believe in all people” is lived every day, enjoying benefits including but not limited to: 4 weeks’ vacation PLUS holidays, sick leave and 2 paid days to volunteer at the cause of their choice and a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance as well as a 6% 401k match – all encompassed in Yum!’s world-famous recognition culture.
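As a brief, illustrative sketch of the PySpark-based normalization and deduplication work implied by the qualifications above, the snippet below standardizes and deduplicates hypothetical sales records; the input path and column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

# Sketch: standardize and deduplicate raw sales records.
# The S3 paths and column names are hypothetical.
spark = SparkSession.builder.appName("sales-normalization").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/sales/")

clean = (
    raw
    .withColumn("brand", F.upper(F.trim(F.col("brand"))))          # normalize text casing
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .dropDuplicates(["order_id"])                                   # remove duplicate orders
    .filter(F.col("order_amount") > 0)
)

clean.write.mode("overwrite").parquet("s3://example-bucket/curated/sales/")
```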

Posted 2 days ago

Apply

7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Source: LinkedIn

Project Role : Full Stack Engineer Project Role Description : Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud first and agile mindset. Must have skills : Java Full Stack Development, Node.js Good to have skills : NA Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification : BE Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Develop and engineer end-to-end features of a system. - Deliver innovative solutions to improve client services. - Utilize development skills to solve challenging business problems. - Stay updated with new technologies and apply them to projects. Professional & Technical Skills: - Must To Have Skills: Proficiency in Java Full Stack Development, Apache Kafka. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development. - This position is based at our Bengaluru office. - A BE degree is required.
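Since Apache Kafka appears among the must-have skills, here is a minimal, illustrative Python producer using the confluent-kafka client; the broker address, topic, and event payload are placeholders, and the role itself is Java-focused, so treat this only as a sketch of the messaging pattern.

```python
import json
from confluent_kafka import Producer

# Sketch: publish an order event to Kafka. Broker, topic, and payload are placeholders.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    """Log whether the broker acknowledged the message."""
    if err is not None:
        print("Delivery failed:", err)
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

event = {"order_id": 42, "status": "CREATED"}
producer.produce("orders", value=json.dumps(event).encode("utf-8"), callback=delivery_report)
producer.flush()  # block until outstanding messages are delivered
```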

Posted 2 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Line of Service Advisory Industry/Sector Not Applicable Specialism SAP Management Level Manager Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within…. A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities - Experience in designing and implementing the ELT architecture to build data warehouse including source-to-staging, staging-to-target mapping design - Experience in Configuring Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Mappings, Scenarios, Load plans, and Metadata. 
- Experience in creating database connections, physical and logical schema using the Topology Manager - Experience in creation of packages, construction of data warehouse and data marts, and synchronization using ODI - Experience in architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, Performance tuning and identifying bottlenecks in the process flow. - Experience using Dimensional Data modeling, Star Schema modeling, Snow-Flake modeling, - Experience using Normalization, Fact and Dimensions Tables, Physical and Logical Data Modeling. - Having Good Knowledge in Oracle cloud services and Database options. - Strong Oracle SQL expertise using tools such as SQL Developer - Understanding ERP modules is good to have Mandatory Skill Sets ODI, OAC Preferred Skill Sets ODI, OAC Years of experience required: 7 - 12 Education Qualification B.Tech / M.Tech / MBA / MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Oracle Data Integrator (ODI) Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
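As a generic illustration of the staging-to-target and star-schema concepts listed above (not ODI itself, which is configured through its Designer and Topology tools), the following Python sketch loads a tiny dimension and fact table from a staging table; SQLite and all object names are stand-ins.

```python
import sqlite3

# Generic staging-to-target sketch with a tiny star schema.
# Illustrates the mapping pattern only; all names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (sale_date TEXT, product_name TEXT, amount REAL);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT UNIQUE);
    CREATE TABLE fct_sales (
        sale_date  TEXT,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount     REAL
    );
""")
conn.executemany("INSERT INTO stg_sales VALUES (?, ?, ?)",
                 [("2025-06-25", "Widget", 120.0), ("2025-06-25", "Gadget", 75.5)])
# Staging-to-target: load the dimension first, then resolve keys into the fact table.
conn.execute("INSERT OR IGNORE INTO dim_product (product_name) "
             "SELECT DISTINCT product_name FROM stg_sales")
conn.execute("""
    INSERT INTO fct_sales (sale_date, product_id, amount)
    SELECT s.sale_date, d.product_id, s.amount
    FROM stg_sales s JOIN dim_product d USING (product_name)
""")
print(conn.execute("SELECT * FROM fct_sales").fetchall())
```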

Posted 2 days ago

Apply

0.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Source: Indeed

Senior Scrum Master Hyderabad, India Information Technology 316176 Job Description About The Role: Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team: The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s in it for you: Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities and Impact: Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For: Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications: Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. 
Proven ability to lead and influence teams in a dynamic environment. Familiarity with software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. 
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India

Posted 2 days ago

Apply

0.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

About the Role: Grade Level (for internal use): 10 The Role: Senior Scrum Master The Team: The team is focused on agile product development offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution. The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances the overall team performance. What’s in it for you: Opportunity to lead and drive Agile transformation within a leading global organization. Engage with a dynamic team committed to delivering high-quality solutions. Access to professional development and growth opportunities within S&P Global. Work in a collaborative and innovative environment that values continuous improvement. Responsibilities and Impact: Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews. Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery. Manage scope changes, risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions. Support the team in defining and achieving sprint goals and objectives. Foster a culture of collaboration and transparency within the team and across stakeholders. Encourage and support the development of team members, mentoring them in Agile best practices. Conduct data analysis and create and interpret metrics for team performance tracking and improvement. Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs. Collaborate with stakeholders to help translate business requirements into technical specifications. Ensure adherence to Agile best practices and participate in Scrum events. Lead initiatives to improve team efficiency and effectiveness in project delivery. What We’re Looking For: Basic Required Qualifications: Bachelor's degree in a relevant field or equivalent work experience. Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team. Strong understanding of Agile methodologies, particularly Scrum and Kanban. Excellent communication and interpersonal skills. Proficiency in business analysis: Experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions. Requirement gathering expertise: Ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance. Basic understanding of SQL queries: Ability to comprehend and analyze existing SQL queries to identify areas for performance improvement. Fundamental understanding of database structure: Awareness of database concepts including normalization, indexing, and schema design to assess query performance. Additional Preferred Qualifications: Certified Scrum Master (CSM) or similar Agile certification. Experience with Agile tools such as Azure DevOps, JIRA, or Trello. Proven ability to lead and influence teams in a dynamic environment. 
Familiarity with software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud. Experience in project management and stakeholder engagement. Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. 
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 316176 Posted On: 2025-06-25 Location: Hyderabad, Telangana, India
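
The posting above asks only for a basic ability to read SQL queries and an awareness of normalization, indexing, and schema design as they affect query performance. A minimal sketch of that idea, using Python's built-in sqlite3 module with table and column names that are illustrative assumptions, shows how adding an index moves a lookup from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, ticker TEXT, price REAL)")
conn.executemany(
    "INSERT INTO trades (ticker, price) VALUES (?, ?)",
    [("SPGI", 100.0 + i) for i in range(1000)],
)

query = "SELECT price FROM trades WHERE ticker = ?"

# Before indexing: SQLite reports a full scan of the table.
print(conn.execute("EXPLAIN QUERY PLAN " + query, ("SPGI",)).fetchall())

# After indexing the filter column, the plan switches to an index search.
conn.execute("CREATE INDEX idx_trades_ticker ON trades (ticker)")
print(conn.execute("EXPLAIN QUERY PLAN " + query, ("SPGI",)).fetchall())
```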

Posted 2 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description Role & Responsibilities: Database Development and Optimization: Design, develop, and optimize SQL databases, tables, views, and stored procedures to meet business requirements and performance goals. Data Retrieval and Analysis: Write efficient and high-performing SQL queries to retrieve, manipulate, and analyze data. Data Integrity and Security: Ensure data integrity, accuracy, and security through regular monitoring, backups, and data cleansing activities. Performance Tuning: Identify and resolve database performance bottlenecks, optimizing queries and database configurations. Error Resolution: Investigate and resolve database-related issues, including errors, connectivity problems, and data inconsistencies. Cross-Functional Collaboration: Collaborate with cross-functional teams, including Data Analysts, Software Developers, and Business Analysts, to support data-driven decision-making. Maintain comprehensive documentation of database schemas, processes, and procedures. Implement and maintain security measures to protect sensitive data and ensure compliance with data protection regulations. Assist in planning and executing database upgrades and migrations. To be considered for this role, you should have: Relevant work experience as a SQL Developer or in a similar role. Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Technical Skills Proficiency in SQL, including T-SQL for Microsoft SQL Server or PL/SQL for Oracle. Strong knowledge of database design principles, normalization, and indexing. Experience with database performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Ability to work independently and manage multiple tasks simultaneously. Desirable Skills Database Management Certifications: Certifications in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) are a plus. Data Warehousing Knowledge: Understanding of data warehousing concepts is a plus. (ref:hirist.tech)
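
The posting above leans on database design principles, normalization, and indexing. As a minimal sketch (SQLite via Python's standard library; the table and column names are illustrative assumptions), this is the kind of restructuring normalization implies: customer attributes move out of a repeating orders table into their own table, referenced by key and supported by an index on the foreign key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Denormalized shape: customer details repeat on every order row.
conn.execute("""CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_email TEXT, amount REAL)""")

# Normalized shape: customers are stored once; orders reference them by key.
conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name  TEXT NOT NULL,
    email TEXT NOT NULL UNIQUE
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
CREATE INDEX idx_orders_customer ON orders (customer_id);
""")
```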

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Title – Data Engineer (SQL Server, Python, AWS, ETL) Preferred Location: Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Role Description You will work with high-performance software engineering and Analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier’s software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software development, including coding, troubleshooting, deployment, and ongoing maintenance. Role Responsibilities Design, develop, and implement SQL Server databases based on business requirements and best practices. Create database schema, tables, views, stored procedures, and functions to support application functionality and data access. Ensure data integrity, security, and performance through proper database design and normalization techniques. Analyze query execution plans and performance metrics to identify and address performance bottlenecks. Implement indexing strategies and database optimizations to improve query performance. Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases. Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference. Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage. Minimum Requirements BTech degree in Computer Science or related discipline, MTech degree preferred. Assertive communication, strong analytical, problem-solving, debugging, and leadership skills. Experience with source control tools like Bitbucket and/or Git. Good hands-on experience diagnosing performance bottlenecks, wait stats, SQL query monitoring, review and optimization strategies. Ability to create normalized and highly scalable logical and physical database designs and switch between different database technologies like Oracle, SQL Server, and Elastic databases. 5+ years of overall experience building and maintaining SQL Server and data engineering solutions for the organization. 5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions. Excellent understanding of Snowflake and other data warehouses. Experience in designing and hands-on development of cloud-based analytics solutions. Understanding of AWS storage services and AWS Cloud Infrastructure offerings. Experience designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave Policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice
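
The Carrier posting above centres on ETL: extracting data from various sources, transforming it, and loading it into SQL Server. A minimal sketch of that flow, using only the Python standard library with SQLite standing in for the target database (the CSV fields, cleansing rules, and table name are all assumptions):

```python
import csv
import io
import sqlite3

# Extract: a small in-memory CSV stands in for a source system export.
raw = io.StringIO("order_id,amount,currency\n1, 100.50 ,usd\n2,,usd\n3,250,USD\n")
rows = list(csv.DictReader(raw))

# Transform: drop rows with missing amounts, trim whitespace, normalize currency codes.
clean = [
    (int(r["order_id"]), float(r["amount"].strip()), r["currency"].strip().upper())
    for r in rows
    if r["amount"].strip()
]

# Load: insert the cleansed rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
print(conn.execute("SELECT * FROM orders").fetchall())
```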

Posted 3 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Develop reusable, typed frontend components using hooks and modern state management patterns. Ensure responsive UI/UX and cross-browser compatibility. Design RESTful or GraphQL APIs using Express and TypeScript. Model relational schemas and write optimized SQL queries and stored procedures. Optimize database performance using indexes, partitions, and EXPLAIN plans. Write unit and integration tests using Jest and React Testing Library. Participate actively in code reviews and maintain coding standards. Qualifications Required Skills React.js with TypeScript (React 16+ with functional components and hooks) Node.js with TypeScript and Express MySQL (schema design, normalization, indexing, query optimization, stored procedures) HTML5, CSS3/Sass, ECMAScript 6+ Git, npm/yarn, Webpack/Vite, ESLint/Prettier, Swagger/OpenAPI Jest, React Testing Library
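
The role above expects optimized SQL queries backed by sensible indexing. One common optimization that the "indexes, partitions, and EXPLAIN plans" line hints at is replacing deep OFFSET pagination with keyset pagination, which seeks on an indexed key instead of scanning and discarding skipped rows. A sketch in Python with SQLite (the production stack in the posting is TypeScript and MySQL, so this only illustrates the query pattern; table and column names are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO posts (title) VALUES (?)",
    [(f"post {i}",) for i in range(10_000)],
)

# OFFSET pagination: the engine still walks past all 9,000 skipped rows.
offset_page = conn.execute(
    "SELECT id, title FROM posts ORDER BY id LIMIT 20 OFFSET 9000"
).fetchall()

# Keyset pagination: seek on the primary key index from the last id already shown
# (9000 here is an assumed bookmark carried over from the previous page).
keyset_page = conn.execute(
    "SELECT id, title FROM posts WHERE id > ? ORDER BY id LIMIT 20",
    (9000,),
).fetchall()
```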

Posted 3 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Position Overview Job Title: Senior Engineer – Data SQL Engineer, AVP Location: Pune, India Role Description The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT Platform/Infrastructure including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle Ensuring maintainability and reusability of engineering solutions Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow Reviewing engineering plans and quality to drive re-use and improve engineering capability Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy: Best in class leave policy. Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complimentary Health screening for 35 yrs. and above Your Key Responsibilities Your Role - What You’ll Do As a SQL Engineer, you would be responsible for the design, development and optimization of complex database systems. You would write efficient SQL queries and stored procedures, and possess expertise in data modeling, performance optimization and working with large scale relational databases. Key Responsibilities: Design, develop and optimize complex SQL queries, stored procedures, views and functions. Work with large datasets to perform data extraction, transformation and loading (ETL). Develop and maintain scalable database schemas and models Troubleshoot and resolve database-related issues including performance bottlenecks and data quality concerns Maintain data security and compliance with data governance policy. Your Skills And Experience Skills You’ll Need: Must Have: 8+ years of hands-on experience with SQL in relational databases – SQL Server, Oracle, MySQL, PostgreSQL. Strong working experience with PL/SQL and T-SQL. Strong understanding of data modelling, normalization and relational DB design. Desirable skills that will help you excel: Ability to write highly performant, heavily resilient queries in Oracle / PostgreSQL / MSSQL Working knowledge of database modelling techniques like Star Schema, Fact-Dimension Models and Data Vault. Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles etc. Hands-on experience with ETL tools - Pentaho/Informatica/StreamSets. Good experience in performance tuning, query optimization and indexing. Hands-on experience with object storage and scheduling tools Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms. Educational Qualifications Bachelor’s degree in Computer Science/Engineering or relevant technology & science Technology certifications from any industry leading cloud providers How We’ll Support You Training and development to help you excel in your career.
Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
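
Among the desirable skills, the posting above names Star Schema and Fact-Dimension modelling. A minimal sketch of that layout with SQLite via Python's standard library (the dimension and fact names, columns, and the sample query are illustrative assumptions): descriptive attributes live in dimension tables, measures and foreign keys live in the fact table, and reports join the two.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables hold descriptive attributes.
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);

-- The fact table holds measures plus foreign keys into the dimensions.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
CREATE INDEX idx_fact_sales_date ON fact_sales (date_key);
""")

# A typical analytical query joins the fact to its dimensions and aggregates.
report = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
""").fetchall()
```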

Posted 3 days ago

Apply

3.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Voice Processing Specialist Location: Remote / Jaipur Job Type: Full-time / Contract Experience: 3+ years of expertise in voice cloning, transformation, and synthesis technologies Job Summary We are seeking a talented and motivated Voice Processing Specialist to join our team and lead the development of innovative voice technologies. The ideal candidate will have a deep understanding of speech synthesis, voice cloning, and transformation techniques. You will play a critical role in designing, implementing, and deploying state-of-the-art voice models that enhance the naturalness, personalization, and flexibility of speech in AI-powered applications. This role is perfect for someone passionate about advancing human-computer voice interaction and creating lifelike, adaptive voice systems. Key Responsibilities Design, develop, and optimize advanced deep learning models for voice cloning, text-to-speech (TTS), voice conversion, and real-time voice transformation. Implement speaker embedding and voice identity preservation techniques to support accurate and high-fidelity voice replication. Work with large-scale and diverse audio datasets, including preprocessing, segmentation, normalization, and data augmentation to improve model generalization and robustness. Collaborate closely with data scientists, ML engineers, and product teams to integrate developed voice models into production pipelines. Fine-tune neural vocoders and synthesis architectures for better voice naturalness and emotional range. Stay current with the latest advancements in speech processing, AI voice synthesis, and deep generative models through academic literature and open-source projects. Contribute to the development of tools and APIs for deploying models on cloud and edge environments with high efficiency and low latency. Required Skills Strong understanding of speech signal processing, speech synthesis, and automatic speech recognition (ASR) systems. Hands-on experience with voice cloning frameworks such as Descript Overdub, Coqui TTS, SV2TTS, Tacotron, FastSpeech, or similar. Proficiency in Python and deep learning frameworks like PyTorch or TensorFlow. Experience working with speech libraries and toolkits such as ESPnet, Kaldi, Librosa, or SpeechBrain. In-depth knowledge of mel spectrograms, vocoder architectures (e.g., WaveNet, HiFi-GAN, WaveGlow), and their role in speech synthesis. Familiarity with REST APIs, model deployment, and cloud-based inference systems using platforms like AWS, Azure, or GCP. Ability to optimize models for performance in real-time or low-latency environments. Preferred Qualifications Experience in real-time voice transformation, including pitch shifting, timing modification, or emotion modulation. Exposure to emotion-aware speech synthesis, multilingual voice models, or prosody modeling. Background in audio DSP (Digital Signal Processing) and speech analysis techniques. Previous contributions to open-source speech AI projects or publications in relevant domains. Why Join Us You will be part of a fast-moving, collaborative team working at the forefront of voice AI innovation. This role offers the opportunity to make a significant impact on products that reach millions of users, helping to shape the future of interactive voice experiences.
Skills: automatic speech recognition (ASR), vocoder architectures, voice cloning, voice processing, data, real-time voice transformation, speech synthesis, PyTorch, TensorFlow, voice conversion, speech signal processing, audio DSP, REST APIs, Python, cloud deployment, transformation, mel spectrograms, deep learning
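
The preprocessing and normalization work described above usually starts with getting every clip onto a comparable amplitude scale before features such as mel spectrograms are computed. A minimal NumPy sketch of two common choices, peak and RMS normalization (the synthetic waveform, sample rate, and target level are assumptions standing in for a loaded recording):

```python
import numpy as np

# Synthetic one-second clip standing in for audio loaded from disk.
sr = 16_000
t = np.linspace(0, 1, sr, endpoint=False)
y = 0.3 * np.sin(2 * np.pi * 220 * t)

# Peak normalization: scale so the loudest sample has magnitude 1.
peak_normalized = y / (np.max(np.abs(y)) + 1e-9)

# RMS normalization: scale toward a target loudness level.
target_rms = 0.1
rms = np.sqrt(np.mean(y ** 2))
rms_normalized = y * (target_rms / (rms + 1e-9))
```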

Posted 3 days ago

Apply

3.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: AI Image Processing Specialist Location: Remote / Jaipur Job Type: Full-time / Contract Experience: 3+ years in computer vision, with medical imaging a plus Job Summary We are seeking a highly skilled and detail-oriented AI Image Processing Specialist to join our team, with a strong focus on medical imaging, computer vision, and deep learning. In this role, you will be responsible for developing and optimizing scalable image processing pipelines tailored for diagnostic, radiological, and clinical applications. Your work will directly contribute to advancing AI capabilities in healthcare by enabling accurate, efficient, and compliant medical data analysis. You will collaborate with data scientists, software engineers, and healthcare professionals to build cutting-edge AI solutions with real-world impact. Key Responsibilities Design, develop, and maintain robust image preprocessing pipelines to handle various medical imaging formats such as DICOM, NIfTI, and JPEG2000. Build automated, containerized, and scalable computer vision workflows suitable for high-throughput medical imaging analysis. Implement and fine-tune models for core vision tasks, including image segmentation, classification, object detection, and landmark detection using deep learning techniques. Ensure that all data handling, processing, and model training pipelines adhere to regulatory guidelines such as HIPAA, GDPR, and FDA/CE requirements. Optimize performance across pipeline stages — including data augmentation, normalization, contrast adjustment, and image registration — to ensure consistent model accuracy. Integrate annotation workflows using tools such as CVAT, Labelbox, or SuperAnnotate and implement strategies for active learning and semi-supervised annotation. Manage reproducibility and version control across datasets and model artifacts using tools like DVC, MLFlow, and Airflow. Required Skills Strong experience with Python and image processing libraries such as OpenCV, scikit-image, and SimpleITK. Proficiency in deep learning frameworks like TensorFlow or PyTorch, including experience with model architectures like U-Net, ResNet, or YOLO adapted for medical applications. Deep understanding of medical imaging formats, preprocessing techniques (e.g., windowing, denoising, bias field correction), and challenges specific to healthcare datasets. Experience working with computer vision tasks such as semantic segmentation, instance segmentation, object localization, and detection. Familiarity with annotation platforms, data curation workflows, and techniques for managing large annotated datasets. Experience with pipeline orchestration, containerization (Docker), and reproducibility tools such as Airflow, DVC, or MLFlow. Preferred Qualifications Experience with domain-specific imaging datasets in radiology, pathology, dermatology, or ophthalmology. Understanding of clinical compliance frameworks such as FDA clearance for software as a medical device (SaMD) or CE marking in the EU. Exposure to multi-modal data fusion, combining imaging with EHR, genomics, or lab data for holistic model development. Why Join Us Be part of a forward-thinking team shaping the future of AI in healthcare.
You’ll work on impactful projects that improve patient outcomes, streamline diagnostics, and enhance clinical decision-making. We offer a collaborative environment, opportunities for innovation, and a chance to work at the cutting edge of AI-driven healthcare. Skills: Docker, U-Net, MLFlow, containerization, image segmentation, SimpleITK, YOLO, image processing, computer vision, medical imaging, object detection, TensorFlow, OpenCV, PyTorch, image preprocessing, ResNet, Python, DVC, Airflow, scikit-image, annotation workflows
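
The pipeline stages the posting above lists (windowing, normalization, contrast adjustment) often reduce to a few array operations once the pixel data is decoded. A minimal NumPy sketch of intensity windowing followed by z-score normalization; the synthetic array, the Hounsfield-style value range, and the window centre and width are assumptions standing in for a real DICOM slice:

```python
import numpy as np

# Synthetic slice standing in for a decoded DICOM pixel array.
img = np.random.randint(-1000, 2000, size=(256, 256)).astype(np.float32)

# Intensity windowing: clip to a window and rescale to [0, 1] so every study
# enters the model on the same scale.
center, width = 40.0, 400.0
lo, hi = center - width / 2, center + width / 2
windowed = (np.clip(img, lo, hi) - lo) / (hi - lo)

# Z-score normalization: zero mean, unit variance per image.
zscored = (img - img.mean()) / (img.std() + 1e-8)
```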

Posted 3 days ago

Apply

2.0 years

3 - 4 Lacs

Noida

On-site

GlassDoor logo

We are looking for a highly skilled Sr. Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks. Key Responsibilities: Collaborate with cross-functional teams to identify and prioritize project requirements Develop and maintain high-quality, efficient, and well-documented code Troubleshoot and resolve technical issues Implement Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Work with RDBMS design, normalization, data modelling, transactions, and distributed databases Develop and maintain database PL/SQL, stored procedures, and triggers Requirements: 2+ years of experience in web-based project development using PHP Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana Strong knowledge of Object-Oriented PHP, cURL, Ajax, Prototype.js, jQuery, Web services, Design Patterns, MVC architecture, and Object-Oriented Methodologies Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases Well-versed with RDBMS MySQL (can work with other SQL flavors too) Experience with Social Networks Integration, Payment Gateways Integration, and Web 2.0 in web-based projects Job Type: Full-time Pay: ₹25,000.00 - ₹40,000.00 per month Benefits: Health insurance Provident Fund Schedule: Day shift Morning shift Education: Bachelor's (Required) Experience: Total: 2 years (Required) WordPress: 2 years (Required) PHP: 2 years (Required) Laravel: 2 years (Required) Location: Noida, Uttar Pradesh (Required) Work Location: In person
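
The posting above asks for hands-on work with stored procedures and triggers alongside normalized RDBMS design. As a small sketch of the trigger idea only (SQLite via Python's standard library rather than the MySQL/PL-SQL stack named in the ad; table and column names are assumptions), an AFTER UPDATE trigger can keep an audit trail of price changes automatically:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL);
CREATE TABLE price_audit (
    product_id INTEGER,
    old_price  REAL,
    new_price  REAL,
    changed_at TEXT DEFAULT CURRENT_TIMESTAMP
);

-- Record every price change without the application having to remember to do it.
CREATE TRIGGER trg_price_audit
AFTER UPDATE OF price ON products
BEGIN
    INSERT INTO price_audit (product_id, old_price, new_price)
    VALUES (OLD.id, OLD.price, NEW.price);
END;
""")

conn.execute("INSERT INTO products (name, price) VALUES ('widget', 10.0)")
conn.execute("UPDATE products SET price = 12.5 WHERE name = 'widget'")
print(conn.execute("SELECT product_id, old_price, new_price FROM price_audit").fetchall())
```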

Posted 3 days ago

Apply

0 years

5 - 8 Lacs

Calcutta

On-site

GlassDoor logo

Job requisition ID: 82238 Date: Jun 23, 2025 Location: Kolkata Designation: Associate Director Entity: Associate Director | SAP QM | Kolkata | SAP Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters. The team SAP is about much more than just the numbers. It’s about attesting to accomplishments and challenges and helping to assure strong foundations for future aspirations. Deloitte exemplifies the what, how, and why of change so you’re always ready to act ahead. Your work profile: As a Manager in our SAP Team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. An SAP QM professional should have: End-to-end project implementation experience in SAP SD in at least 12-15 projects (excluding support projects). Must-Have Skills: Proficiency in SAP Quality Management (QM) Strong understanding of statistical analysis and machine learning algorithms Experience with data visualization tools such as Tableau or Power BI Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Qualifications Graduate degree (Science or Engineering) from premier institutes. Strong communication skills (written & verbal). Willingness to travel for short and long term durations. Your role as a leader At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society and make an impact that matters. Actively focuses on developing effective communication and relationship-building skills Builds own understanding of our purpose and values; explores opportunities for impact Understands expectations and demonstrates personal accountability for keeping performance on track Understands how their daily work contributes to the priorities of the team and the business Demonstrates strong commitment to personal learning and development; acts as a brand ambassador to help attract top talent How you’ll grow At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Our purpose Deloitte is led by a purpose: To make an impact that matters . Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the Communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world
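
The Deloitte posting above calls out data munging: cleaning, transformation, and normalization ahead of modelling. A minimal pandas sketch of that sequence (the column names, the median fill strategy, and min-max scaling are assumptions chosen for illustration):

```python
import pandas as pd

# Toy quality data with the usual problems: gaps and columns on very different scales.
df = pd.DataFrame({
    "batch":    ["B1", "B2", "B3", "B4"],
    "defects":  [3, None, 7, 1],
    "weight_g": [1020.0, 998.0, None, 1005.0],
})

numeric = ["defects", "weight_g"]

# Cleaning: fill missing values with each column's median.
df[numeric] = df[numeric].fillna(df[numeric].median())

# Normalization: rescale each numeric column to the [0, 1] range.
for col in numeric:
    df[col] = (df[col] - df[col].min()) / (df[col].max() - df[col].min())

print(df)
```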

Posted 3 days ago

Apply

0 years

15 - 21 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

We are inviting applications for the role of Lead Consultant – Java Kafka . The ideal candidate will have strong hands-on experience in Java-based microservices development using Kafka and Postgres. You will also work on Microsoft Access-based database solutions ensuring data normalization and process integrity. Primary Skills (Must-Have) Core Java (v1.8 or higher) Spring Boot & Spring Framework (Core, AOP, Batch) Apache Kafka PostgreSQL Secondary Skills (Good To Have) Google Cloud Platform (GCP) CI/CD Tools – CircleCI preferred GitHub – for version control and collaboration Monitoring Tools – Splunk, Grafana Key Responsibilities Develop and maintain enterprise-level applications using Java, Spring Boot, Kafka, and Postgres. Design, build, and maintain Microsoft Access Databases with proper normalization and referential integrity. Implement and maintain microservices architectures for high-volume applications. Participate in code reviews, unit testing, and integration testing. Manage version control with GitHub and contribute to DevOps pipelines with CI/CD tools like CircleCI. Collaborate with cross-functional teams for application development and deployment on cloud-based infrastructure (preferably GCP). Monitor system performance using Splunk and Grafana and recommend improvements. Qualifications Minimum Educational Qualifications BE / B.Tech / M.Tech / MCA in Computer Science, Information Technology, or a related field Preferred Qualifications Experience with Oracle PL/SQL, SOAP/REST Web Services Familiarity with MVC frameworks such as Struts, JSF Hands-on experience with cloud-based infrastructure, preferably GCP Skills: kafka,ci/cd tools – circleci,spring framework (core, aop, batch),monitoring tools – splunk,apache kafka,monitoring tools – grafana,spring boot,java,google cloud platform (gcp),postgresql,springboot,github,core java (v1.8 or higher)
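
The role above is built around producing to and consuming from Kafka topics inside Java/Spring microservices. Purely as an illustration of that produce/consume pattern, here is a sketch in Python using the kafka-python client (not the Spring stack named in the posting); the broker address, topic name, and consumer group are assumptions, and a local Kafka broker must be running for it to execute:

```python
import json

from kafka import KafkaConsumer, KafkaProducer

# Producer: publish a JSON-encoded event to the assumed "orders" topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 1, "amount": 99.5})
producer.flush()

# Consumer: read events from the same topic as part of a consumer group.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="order-processors",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # e.g. persist the event to Postgres here
    break
```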

Posted 3 days ago

Apply

5.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Job Title: ServiceNow CMDB Functional Consultant Skills: CMDB, CSDM, ITOM, Discovery & Event Management Experience: 5-12 years Locations: Greater Noida, Pune & Bengaluru Responsible for designing, implementing & maintaining a CMDB system for an organization. Collecting & organizing information about hardware, software, and other IT assets, as well as their relationships and dependencies. Must have a strong understanding of IT infrastructure & configuration management principles, as well as excellent communication and problem-solving skills. Analyzing the organization's current IT infrastructure & identifying areas for improvement in terms of configuration management. Developing a strategy & roadmap for implementing a CMDB system, including identifying the necessary data sources and integration points. Collaborating with various IT teams, such as network, server & application teams, to gather and validate configuration data. Defining & documenting the data model and taxonomy for the CMDB, ensuring it aligns with industry best practices. Configuring & customizing the CMDB tool to meet the specific needs of the organization. Conducting data quality checks & implementing processes for ongoing data maintenance and governance. Providing training & support to end-users on how to use the CMDB system effectively. Collaborating with IT teams to ensure accurate & timely updates to the CMDB as changes are made to the IT infrastructure. Conducting regular audits of the CMDB to ensure data accuracy & completeness. Monitoring & reporting on key performance indicators (KPIs) related to the CMDB, such as data quality, compliance, and usage. Staying updated on industry trends & best practices in CMDB management and making recommendations for improvement. Working with external vendors & consultants as needed to support the CMDB system. Preferred Qualifications: Strong knowledge of ITOM Modules & CMDB. Should have experience with CMDB Class Manager, Class Hierarchy & CMDB Manager policies. Strong knowledge of Identification, Normalization & Reconciliation rules. Configuration of CMDB classes and attributes & the ability to provide guidance to clients and other team members on ITOM best practices. Good knowledge of TBM Taxonomy and its relationship with CMDB.
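
The Identification, Normalization & Reconciliation rules mentioned above boil down to mapping raw discovered attribute values onto canonical ones so that records from different sources resolve to the same CI. A conceptual sketch in plain Python (this is not the ServiceNow API; the lookup table, field names, and sources are assumptions):

```python
# Canonical manufacturer names keyed by the lowercase variants discovery tools report.
CANONICAL_MANUFACTURERS = {
    "dell inc.": "Dell",
    "dell": "Dell",
    "hewlett-packard": "HPE",
    "hp enterprise": "HPE",
}

def normalize_ci(raw: dict) -> dict:
    """Normalize a discovered CI record before reconciliation."""
    vendor = raw.get("manufacturer", "").strip().lower()
    return {
        "serial_number": raw.get("serial_number", "").strip().upper(),  # identification key
        "manufacturer": CANONICAL_MANUFACTURERS.get(vendor, raw.get("manufacturer", "")),
        "source": raw.get("source"),
    }

discovered = [
    {"serial_number": "abc123", "manufacturer": "Dell Inc.", "source": "Discovery"},
    {"serial_number": "ABC123", "manufacturer": "dell", "source": "Asset import"},
]

# After normalization both records share the same identity and can be reconciled into one CI.
normalized = [normalize_ci(ci) for ci in discovered]
```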

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies