
680 Normalization Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Description

Job summary:
Amazon.com's Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites; it designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com.

The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions, proactively present new insights in a concise and effective manner, and communicate insights to non-technical audiences while independently driving issues to resolution. This is a high-impact role with goals that directly affect the bottom line of the business.

Key job responsibilities:
- Analyze terabytes of data to define and deliver complex analytical deep dives, unlocking insights and building scalable data science solutions that secure Amazon's platform and transactions
- Build machine learning and/or statistical models that evaluate transaction legitimacy and track impact over time (a minimal illustrative sketch follows this listing)
- Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping
- Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to Product and Tech teams
- Develop efficient data querying infrastructure for both offline and online use cases
- Collaborate with cross-functional teams from multidisciplinary science, engineering, and business backgrounds to enhance current automation processes
- Learn and understand a broad range of Amazon's data resources and know when, how, and which to use (and which not to)
- Maintain technical documentation and communicate results to diverse audiences with effective writing, visualizations, and presentations

Basic Qualifications:
- 3+ years of experience with data querying languages (e.g. SQL), scripting languages (e.g. Python), or statistical/mathematical software (e.g. R, SAS, Matlab)
- 2+ years of experience as a data scientist
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment

Preferred Qualifications:
- Experience in Python, Perl, or another scripting language

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: Amazon Dev Center India - Hyderabad
Job ID: A2998274
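A minimal, self-contained sketch of the kind of transaction-risk scoring model this listing describes, trained on synthetic data with scikit-learn. The features, coefficients, and library choice are illustrative assumptions, not Amazon's actual method.

```python
# Illustrative only: a toy transaction-legitimacy classifier on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000
# Hypothetical risk features: order value, account age (days), prior chargebacks.
X = np.column_stack([
    rng.lognormal(3.0, 1.0, n),    # order_value
    rng.exponential(400.0, n),     # account_age_days
    rng.poisson(0.05, n),          # prior_chargebacks
])
# Synthetic label: young accounts with chargebacks are riskier by construction.
logit = 0.4 * X[:, 2] - 0.002 * X[:, 1] + 0.001 * X[:, 0] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]   # per-transaction risk score
print(f"AUC: {roc_auc_score(y_test, scores):.3f}")
```

In practice such a score would feed downstream decisioning (queues for investigators, automatic holds), with impact tracked over time as the listing notes.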

Posted 9 hours ago

Apply

5.0 years

5 - 9 Lacs

Hyderābād

On-site


Hyderabad, Telangana
Job ID: 30162733
Job Category: Digital Technology

Job Title: Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Description:
You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software delivery, including coding, troubleshooting, deployment, and ongoing maintenance.

Role Responsibilities:
- Design, develop, and implement SQL Server databases based on business requirements and best practices.
- Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access.
- Ensure data integrity, security, and performance through proper database design and normalization techniques.
- Analyze query execution plans and performance metrics to identify and address performance bottlenecks.
- Implement indexing strategies and database optimizations to improve query performance.
- Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases (a brief sketch follows this listing).
- Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
- Provide training and support to end users on SQL Server best practices, database performance optimization techniques, and Power BI usage.

Minimum Requirements:
- BTech degree in Computer Science or a related discipline; MTech degree preferred.
- Assertive communication and strong analytical, problem-solving, debugging, and leadership skills.
- Experience with source control tools such as Bitbucket and/or Git.
- Hands-on experience diagnosing performance bottlenecks and wait stats, plus SQL query monitoring, review, and optimization strategies.
- Ability to create normalized, highly scalable logical and physical database designs and switch between database technologies such as Oracle, SQL Server, and Elastic databases.
- 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization.
- 5+ years of SQL Server development experience with strong programming experience writing stored procedures and functions.
- Excellent understanding of Snowflake and other data warehouses.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Understanding of AWS storage services and AWS Cloud Infrastructure offerings.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.

Benefits:
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you:
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
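A minimal sketch of the extract-transform-load flow this listing describes, assuming the pyodbc package, a reachable SQL Server instance with an ODBC driver installed, and hypothetical file, table, and procedure names.

```python
# Illustrative ETL sketch: CSV -> staging table -> merge procedure.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDW;Trusted_Connection=yes;"  # assumed target
)
cur = conn.cursor()

# Extract: read raw rows from a CSV export (hypothetical file).
with open("orders_raw.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: trim whitespace, normalize currency text to a float, drop keyless rows.
clean = [
    (r["order_id"].strip(), r["customer_id"].strip(),
     float(r["amount"].replace(",", "")))
    for r in rows
    if r["order_id"]
]

# Load: bulk insert into a staging table, then let a stored procedure
# (hypothetical name) merge staging into the target tables.
cur.fast_executemany = True
cur.executemany(
    "INSERT INTO stg.Orders (OrderId, CustomerId, Amount) VALUES (?, ?, ?)",
    clean,
)
cur.execute("EXEC dbo.usp_MergeOrders")
conn.commit()
```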

Posted 9 hours ago

Apply

2.0 years

0 Lacs

India

On-site


Job Title: DBMS Trainer
Location: Hyderabad, Telangana
Experience Required: Minimum 2 years in database development or training
Employment Type: Full-Time, Onsite

Job Summary:
We are seeking a dynamic and experienced DBMS Trainer to join our team in Hyderabad. The ideal candidate will have a strong background in database systems, both relational and NoSQL, and a passion for mentoring and training aspiring database professionals. You will be responsible for delivering engaging, interactive, and industry-relevant training sessions on core database concepts, administration, optimization, and real-world applications.

Key Responsibilities:
- Curriculum Development: Design, develop, and maintain comprehensive training modules covering SQL (MySQL, PostgreSQL, Oracle), NoSQL (MongoDB, Cassandra), database design, normalization, indexing, backup/recovery strategies, and data modeling (a normalization walk-through follows this listing).
- Training Delivery: Conduct engaging, in-person classroom and lab sessions on SQL querying, stored procedures, transactions, optimization, security best practices, and cloud DBMS concepts.
- Hands-On Workshops: Facilitate practical, real-world exercises including schema design, performance tuning, backup/recovery, and managing unstructured data scenarios.
- Mentorship & Assessment: Evaluate learners through quizzes, assignments, and capstone projects. Provide continuous feedback, interview preparation, and career counseling support.
- Content Updating: Regularly update course content to reflect industry advancements, including cloud databases, big data integrations, and emerging DBMS technologies.
- Lab & Tool Management: Set up, manage, and troubleshoot training environments (both on-premises and cloud-based), and work closely with technical teams to ensure seamless training delivery.

Required Qualifications:
- Bachelor's degree in Computer Science, IT, ECE, or a related field.
- Minimum 2 years of hands-on experience in database development, administration, or technical training roles.

Technical Skills:
- SQL Databases: MySQL, PostgreSQL, Oracle (queries, joins, transactions, stored procedures)
- NoSQL Databases: MongoDB, Cassandra (document modeling, indexing)
- Database Design & Administration: ER modeling, normalization, indexing, backup & recovery, security management
- Performance Tuning: Query optimization, indexing strategies, monitoring and logging tools
- Data Modeling: Relational and unstructured/NoSQL data structures
- Basic Cloud DBMS: Familiarity with AWS RDS, Azure SQL, Firebase/Firestore
- Version Control & Scripting: Git, basic shell/SQL scripts for automation
- Communication & Mentoring: Strong presentation, troubleshooting, and feedback skills

Preferred Extras:
- Certifications such as Oracle OCA, AWS/Azure database certifications, MongoDB Certified Developer
- Experience with big data tools (Hive, Spark SQL) or cloud-native data platforms
- Experience using Learning Management Systems (LMS) and e-learning platforms
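A classroom-sized illustration of the normalization topic in the curriculum above: splitting a denormalized orders table into third normal form. It uses only Python's standard-library sqlite3 module, so it runs anywhere; the schema is invented for the example.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- Denormalized: customer attributes repeat on every order row.
CREATE TABLE orders_flat (
    order_id INTEGER, customer_name TEXT, customer_city TEXT, amount REAL
);
INSERT INTO orders_flat VALUES
    (1, 'Asha', 'Hyderabad', 120.0),
    (2, 'Asha', 'Hyderabad',  80.0),
    (3, 'Ravi', 'Noida',      55.0);

-- 3NF: each customer fact is stored once; orders reference it by key.
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY, name TEXT UNIQUE, city TEXT
);
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers,
    amount REAL
);
INSERT INTO customers (name, city)
    SELECT DISTINCT customer_name, customer_city FROM orders_flat;
INSERT INTO orders
    SELECT f.order_id, c.customer_id, f.amount
    FROM orders_flat f JOIN customers c ON c.name = f.customer_name;
""")
# Joining the normalized tables reproduces the original flat view.
for row in db.execute(
    "SELECT o.order_id, c.name, c.city, o.amount "
    "FROM orders o JOIN customers c USING (customer_id)"
):
    print(row)
```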

Posted 9 hours ago

Apply

2.0 - 3.0 years

15 Lacs

India

Remote


We are seeking a skilled and detail-oriented PostgreSQL Database Developer & Designer to join our team. The ideal candidate will be responsible for designing, developing, optimizing, and maintaining scalable and secure PostgreSQL databases that support our application and business needs.

Key Responsibilities:
- Design and develop efficient and scalable database schemas, tables, views, indexes, and stored procedures
- Develop and optimize complex SQL queries, functions, and triggers in PostgreSQL
- Perform data modeling and create ER diagrams to support business logic and performance
- Work closely with application developers to design and implement data access patterns
- Monitor database performance and tune queries for high availability and efficiency (a tuning sketch follows this listing)
- Maintain data integrity, quality, and security across all environments
- Develop and manage ETL processes, migrations, and backup strategies
- Assist in database version control and deployment automation
- Troubleshoot and resolve database-related issues in development and production

Required Skills & Qualifications:
- Minimum 2-3 years of experience in PostgreSQL database development and design
- Strong understanding of relational database design principles, normalization, and indexing
- Proficiency in writing complex SQL queries, functions, and stored procedures, and in performance tuning
- Experience with data modeling tools (e.g., pgModeler, dbdiagram.io, ER/Studio)
- Familiarity with database version control (e.g., Liquibase, Flyway)
- Solid understanding of PostgreSQL internals, the query planner, and performance optimization techniques
- Knowledge of data security, encryption, and compliance standards
- Strong problem-solving skills and attention to detail

Nice to Have (Pluses):
- Experience with cloud databases (e.g., Amazon RDS for PostgreSQL, Google Cloud SQL, Azure Database for PostgreSQL)
- Familiarity with NoSQL or hybrid data architectures
- Exposure to Kafka, RabbitMQ, or other message brokers
- Experience working in Agile/Scrum teams
- Knowledge of CI/CD pipelines for database deployments
- Understanding of data warehousing and analytics/reporting workflows

What We Offer:
- Competitive compensation package
- Opportunity to work on high-impact systems and large-scale databases
- Collaborative team environment with growth and learning opportunities
- Remote-friendly and flexible work schedule

Job Type: Full-time
Pay: ₹1,500,000.00 per year
Benefits: Health insurance
Schedule: Day shift
Experience: PostgreSQL: 5 years (Required); SQL: 5 years (Required)
Work Location: In person
Application Deadline: 05/07/2025
Expected Start Date: 01/08/2025
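A small sketch of the query-tuning loop this listing emphasizes: inspect the planner's strategy with EXPLAIN before and after adding an index. It assumes a local PostgreSQL instance and the psycopg2 package; the connection string and table are hypothetical.

```python
import psycopg2

conn = psycopg2.connect("dbname=demo user=postgres host=localhost")  # assumed
conn.autocommit = True
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS events (
        id bigserial PRIMARY KEY,
        user_id int NOT NULL,
        created_at timestamptz NOT NULL DEFAULT now()
    )
""")
# Seed 200k rows so the planner has something meaningful to choose between.
cur.execute("""
    INSERT INTO events (user_id)
    SELECT (random() * 10000)::int FROM generate_series(1, 200000)
""")

def plan(sql: str) -> None:
    cur.execute("EXPLAIN " + sql)
    for (line,) in cur.fetchall():
        print(line)

query = "SELECT count(*) FROM events WHERE user_id = 42"
plan(query)  # expect a sequential scan without a supporting index
cur.execute("CREATE INDEX IF NOT EXISTS idx_events_user ON events (user_id)")
cur.execute("ANALYZE events")
plan(query)  # expect an index/bitmap scan once statistics are refreshed
```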

Posted 9 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description:
We are seeking a skilled Data Engineer with strong experience in Python, Snowflake, and AWS. The ideal candidate will be responsible for building and optimizing scalable data pipelines, integrating diverse data sources, and supporting analytics and business intelligence solutions in a cloud environment. A key focus will include designing and managing AWS Glue jobs and enabling efficient, serverless ETL workflows.

Key Responsibilities:
- Design and implement robust data pipelines using AWS Glue, Lambda, and Python (a job skeleton follows this listing).
- Work extensively with Snowflake for data warehousing, modelling, and analytics support.
- Manage ETL/ELT jobs using AWS Glue and ensure end-to-end data reliability.
- Migrate data between CRM systems, especially from Snowflake to Salesforce, following defined business rules and ensuring data accuracy.
- Optimize SQL/SOQL queries, handle large volumes of data, and maintain high levels of performance.
- Implement data normalization and data quality checks to ensure accurate, consistent, and deduplicated records.

Required Skills:
- Strong programming skills in Python.
- Hands-on experience with Snowflake Data Warehouse.
- Proficiency in AWS services: Glue, S3, Lambda, Redshift, CloudWatch.
- Experience with ETL/ELT pipelines and data integration using AWS Glue jobs.
- Proficiency in SQL and SOQL for data extraction and transformation.
- Understanding of data modelling, normalization, and performance optimization.

Nice to Have:
- Familiarity with Salesforce Data Loader, ETL mapping, and metadata-driven migration.
- Experience with CI/CD tools, DevOps, and version control (e.g., Git).
- Experience working in Agile/Scrum environments.
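A sketch of the AWS Glue (PySpark) job shape this listing describes: read from the Glue Data Catalog, deduplicate, and write partitioned Parquet to S3. It only runs inside a Glue job environment where the awsglue library is available, and the database, table, bucket, and column names are assumptions.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: a source table registered in the Glue Data Catalog (hypothetical).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="crm_raw", table_name="accounts"
)

# Transform: drop null keys and duplicate records with plain Spark.
df = dyf.toDF()
df = df.filter(df["account_id"].isNotNull()).dropDuplicates(["account_id"])

# Load: partitioned Parquet back to S3 (hypothetical bucket and partition column).
df.write.mode("overwrite").partitionBy("region").parquet(
    "s3://example-bucket/curated/accounts/"
)
job.commit()
```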

Posted 9 hours ago

Apply

2.0 years

3 - 5 Lacs

Noida

On-site


Position: Web Developer

We are looking for a highly skilled Web Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain high-quality, efficient, and well-documented code
- Troubleshoot and resolve technical issues
- Implement social network integrations, payment gateway integrations, and Web 2.0 features in web-based projects
- Work with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Develop and maintain database PL/SQL, stored procedures, and triggers

Requirements:
- 2+ years of experience in web-based project development using PHP
- Experience with various open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana
- Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies
- Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Well-versed in MySQL (can work with other SQL flavors too)

Job Type: Full-time
Pay: ₹25,000.00 - ₹45,000.00 per month
Work Location: In person

Posted 9 hours ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description

Position and Department Details:
Role: Asst. Manager (On-roll)
Department: Operations Hub, Data & Capability (GIX-Intelligence)

Job Role:

Lean Activities:
- Operational Excellence: Identify and implement improvement opportunities to enhance quality, reduce latency, optimize costs, and mitigate risks, driving overall efficiency and effectiveness of output.
- Process Governance and Compliance: Oversee and ensure that all processes are accurately documented, up to date, and aligned with standard operating procedures (SOPs), guaranteeing consistency and adherence to established protocols.
- Process Optimization and Analysis: Conduct thorough analyses of existing processes, leveraging tools such as Value Stream Mapping (VSM) and Failure Mode Effect Analysis (FMEA) to identify areas for improvement and inform data-driven decision-making.
- Capability Development and Training: Design and deliver training programs for employees on Lean methodologies and tools, including Root Cause Analysis (RCA), FMEA, and other relevant techniques, to enhance skills and knowledge and foster a culture of continuous improvement.

DQM Activities:
- Data Quality Monitoring: Develop and implement data quality monitoring processes to identify and track data quality issues, including data validation (a minimal check is sketched after this listing).
- Data Quality Reporting: Create and maintain data quality reports to track and analyze data quality metrics, including data accuracy, completeness, and consistency.
- Data Quality Issue Resolution: Collaborate with stakeholders to identify and resolve data quality issues, including root cause analysis and implementation of corrective actions.
- Data Quality Process Development: Develop and maintain data quality processes and procedures, including data validation rules, data cleansing procedures, and data normalization standards.
- Stakeholder Management: Communicate data quality issues and resolutions to stakeholders, including business users, data analysts, and IT teams.
- Process Improvement: Continuously monitor and improve data quality processes and procedures to ensure they are efficient, effective, and aligned with business needs.
- Compliance: Ensure data quality processes and procedures comply with regulatory requirements, including data privacy and data security regulations.
- Training and Development: Provide training and development opportunities to data quality team members to ensure they have the necessary skills and knowledge to perform their jobs effectively.
- Special Projects: Participate in special projects, including data quality assessments, data quality audits, and data quality improvement initiatives.

Basic Qualification:
- Graduate/Masters (preferably business/commerce background) with at least 4 to 6 years of experience in Lean practice.
- Excellent working knowledge of advanced MS Excel, MS Word, MS PowerPoint, and MS Outlook.
- Good communication skills and experience in handling senior stakeholders.

Certification:
- Lean Six Sigma Green Belt certification is a must; Lean Six Sigma Black Belt certification is preferable.

Expectations:
- The individual should be a quick learner, diligent, and efficient in the timely completion of assigned tasks.
- The individual should be able to think independently and logically, critically assess requirements, and ensure troubleshooting and solutions.
- The individual should be able to multi-task and handle multiple activities at a time.
- The individual should have attention to detail and be solution-oriented.
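A minimal sketch of the data quality monitoring described above, measuring completeness, validity, and duplication with pandas; the columns and the validation rule are hypothetical.

```python
import pandas as pd

# Toy extract with typical quality defects: a missing key, an invalid email,
# and a duplicated business key (column names are hypothetical).
df = pd.DataFrame({
    "customer_id": [101, 102, 102, None, 105],
    "email": ["a@x.com", "b@x", "b@x", "c@x.com", None],
})

report = {
    "rows": len(df),
    # Completeness: share of non-null values per column.
    "completeness": (1 - df.isna().mean()).round(2).to_dict(),
    # Validity: rule-based check (email must contain a dot after the '@').
    "valid_email_pct": round(df["email"].str.contains(r"@.+\..+", na=False).mean(), 2),
    # Uniqueness: extra rows sharing a business key signal a dedup task.
    "duplicate_key_rows": int(df["customer_id"].duplicated().sum()),
}
print(report)
```

A report like this would feed the accuracy/completeness/consistency metrics the listing names, with thresholds triggering root-cause analysis.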

Posted 10 hours ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Work from Office


We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data.

Key Responsibilities:
- Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks.
- Develop and manage data models and data warehouse architecture within Snowflake.
- Create and maintain DBT models for transformation, lineage tracking, and documentation.
- Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation.
- Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements.
- Ensure data integrity, consistency, and governance across all stages of the data lifecycle.
- Monitor pipeline performance and implement optimization strategies for queries and storage.
- Follow best practices for data engineering, including version control (Git), testing, and CI/CD integration.

Required Skills and Qualifications:
- 8+ years of experience in Data Engineering or related roles.
- Deep expertise in Snowflake: schema design, performance tuning, security, and access controls.
- Proficiency in Python, particularly for scripting, data transformation, and workflow automation.
- Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization); a small illustration follows this listing.
- Proven experience with DBT for building modular, tested, and documented data pipelines.
- Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect.
- Advanced SQL skills with experience handling large and complex data sets.
- Exposure to cloud platforms such as AWS, Azure, or GCP and their data services.

Preferred Qualifications:
- Experience implementing data quality checks and governance frameworks.
- Understanding of the modern data stack and CI/CD pipelines for data workflows.
- Contributions to data engineering best practices, open-source projects, or thought leadership.
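A small illustration of the star-schema modeling named in the requirements: deriving a product dimension and a sales fact from one flat extract. pandas is used here for brevity; in this role the equivalent logic would typically live in DBT models over Snowflake tables, and all column names are assumptions.

```python
import pandas as pd

# One flat extract mixing descriptive attributes with measures (hypothetical).
flat = pd.DataFrame({
    "order_id": [1, 2, 3],
    "product": ["chair", "desk", "chair"],
    "category": ["furniture", "furniture", "furniture"],
    "amount": [2500.0, 7200.0, 2500.0],
})

# Dimension: one row per product, with a surrogate key.
dim_product = (
    flat[["product", "category"]].drop_duplicates().reset_index(drop=True)
)
dim_product["product_key"] = dim_product.index + 1

# Fact: measures plus foreign keys only, normalized away from the attributes.
fact_sales = flat.merge(dim_product, on=["product", "category"])[
    ["order_id", "product_key", "amount"]
]
print(dim_product, fact_sales, sep="\n\n")
```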

Posted 10 hours ago

Apply

8.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


As Associate Manager, Data Engineering, you will:
- Lead the team of Data Engineers and develop innovative approaches to performance optimization and automation
- Analyze enterprise specifics to understand the current-state data schema and data model, and contribute to defining the future-state data schema, data normalization, and schema integration as required by the project
- Apply coding expertise, best practices, and guidance in Python, SQL, Informatica, and cloud data platform development to members of the team
- Collaborate with clients to harden, scale, and parameterize code so it is scalable across brands and regions
- Understand business objectives and develop business intelligence applications that help to monitor and improve critical business metrics
- Monitor project timelines, ensuring deliverables are being met by team members
- Communicate frequently with stakeholders on project requirements, statuses, and risks
- Manage the monitoring of productionized processes to ensure pipelines execute successfully every day, communicating delays as required to stakeholders
- Contribute to the design of scalable data integration frameworks to move and transform a variety of large data sets
- Develop robust work products by following best practices through all stages of development, testing, and deployment

Skills and Qualifications:
- BTech / master's degree in a quantitative field (statistics, business analytics, computer science)
- Team management experience is a must: 8-10 years of experience overall, with at least 2-4 years managing a team
- Broad background in all things data related
- Intermediate proficiency with Python and data-related libraries (PySpark, Pandas, etc.)
- High proficiency with SQL; Snowflake experience is required at a high level, and certification is a big plus
- AWS data platform development experience
- High proficiency with data warehousing and data modeling
- Experience with ETL tools (Informatica, Talend, DataStage) is required; Informatica is our tool, and IICS or PowerCenter experience is accepted
- Ability to coach team members, setting them up for success in their roles
- Capable of connecting with team members and inspiring them to be their best

The Yum! Brands story is simple. We have four distinctive, relevant, and easy global brands - KFC, Pizza Hut, Taco Bell, and The Habit Burger Grill - born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company, we have a clear and compelling mission: to build the world's most loved, trusted, and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary, and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken, and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, click-and-collect services, and consumer data analytics, creating unique customer dining experiences - and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role.
Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL; Louisville, KY; Irvine, CA; Plano, TX; and other markets around the world. We don't just say we are a great place to work - our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation PLUS holidays, sick leave, and 2 paid days to volunteer at the cause of their choice with a dollar-for-dollar matching gift program; generous parental leave; and competitive benefits including medical, dental, vision, and life insurance, as well as a 6% 401(k) match - all encompassed in Yum!'s world-famous recognition culture.

Posted 11 hours ago

Apply

7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Full Stack Engineer
Project Role Description: Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset.
Must Have Skills: Java Full Stack Development, Node.js
Good To Have Skills: NA
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: BE

Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and engineer end-to-end features of a system.
- Deliver innovative solutions to improve client services.
- Utilize development skills to solve challenging business problems.
- Stay updated with new technologies and apply them to projects.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Java Full Stack Development, Apache Kafka.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity (a brief sketch follows this listing).

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development.
- This position is based at our Bengaluru office.
- A BE degree is required.
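A brief sketch of the data munging step named above: impute missing values, then min-max normalize numeric columns with pandas. The values and column names are illustrative.

```python
import pandas as pd

# Toy table with gaps (column names and values are hypothetical).
raw = pd.DataFrame({
    "age": [25, None, 40, 31],
    "income": [30_000, 85_000, None, 52_000],
})

clean = raw.fillna(raw.median(numeric_only=True))   # cleaning: impute missing values
normalized = (clean - clean.min()) / (clean.max() - clean.min())  # min-max to [0, 1]
print(normalized)
```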

Posted 11 hours ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Manager

Job Description & Summary:
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Experience in designing and implementing ELT architecture to build data warehouses, including source-to-staging and staging-to-target mapping design
- Experience in configuring the Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Mappings, Scenarios, Load Plans, and Metadata
- Experience in creating database connections and physical and logical schemas using the Topology Manager
- Experience in creating packages, constructing data warehouses and data marts, and synchronization using ODI
- Experience in architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, performance tuning, and identifying bottlenecks in the process flow
- Experience using Dimensional Data modeling, Star Schema modeling, and Snowflake modeling
- Experience using Normalization, Fact and Dimension Tables, and Physical and Logical Data Modeling
- Good knowledge of Oracle Cloud services and database options
- Strong Oracle SQL expertise using tools such as SQL Developer
- Understanding of ERP modules is good to have

Mandatory Skill Sets: ODI, OAC
Preferred Skill Sets: ODI, OAC
Years of experience required: 7-12
Education Qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred: (not specified)
Certifications: (not specified)
Required Skills: Oracle Data Integrator (ODI)
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date

Posted 16 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Role & Responsibilities:
- Database Development and Optimization: Design, develop, and optimize SQL databases, tables, views, and stored procedures to meet business requirements and performance goals.
- Data Retrieval and Analysis: Write efficient and high-performing SQL queries to retrieve, manipulate, and analyze data.
- Data Integrity and Security: Ensure data integrity, accuracy, and security through regular monitoring, backups, and data cleansing activities.
- Performance Tuning: Identify and resolve database performance bottlenecks, optimizing queries and database configurations.
- Error Resolution: Investigate and resolve database-related issues, including errors, connectivity problems, and data inconsistencies.
- Cross-Functional Collaboration: Collaborate with cross-functional teams, including Data Analysts, Software Developers, and Business Analysts, to support data-driven decision-making.
- Maintain comprehensive documentation of database schemas, processes, and procedures.
- Implement and maintain security measures to protect sensitive data and ensure compliance with data protection regulations.
- Assist in planning and executing database upgrades and migrations.

To be considered for this role, you should have:
- Relevant work experience as a SQL Developer or in a similar role.
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).

Technical Skills:
- Proficiency in SQL, including T-SQL for Microsoft SQL Server or PL/SQL for Oracle.
- Strong knowledge of database design principles, normalization, and indexing.
- Experience with database performance tuning and optimization techniques.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Ability to work independently and manage multiple tasks simultaneously.

Desirable Skills:
- Database Management Certifications: Certifications in database management (e.g., Microsoft Certified: Azure Database Administrator Associate) are a plus.
- Data Warehousing Knowledge: Understanding of data warehousing concepts is a plus.

(ref:hirist.tech)

Posted 20 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Description:
You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software delivery, including coding, troubleshooting, deployment, and ongoing maintenance.

Role Responsibilities:
- Design, develop, and implement SQL Server databases based on business requirements and best practices.
- Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access.
- Ensure data integrity, security, and performance through proper database design and normalization techniques.
- Analyze query execution plans and performance metrics to identify and address performance bottlenecks.
- Implement indexing strategies and database optimizations to improve query performance.
- Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases.
- Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
- Provide training and support to end users on SQL Server best practices, database performance optimization techniques, and Power BI usage.

Minimum Requirements:
- BTech degree in Computer Science or a related discipline; MTech degree preferred.
- Assertive communication and strong analytical, problem-solving, debugging, and leadership skills.
- Experience with source control tools such as Bitbucket and/or Git.
- Hands-on experience diagnosing performance bottlenecks and wait stats, plus SQL query monitoring, review, and optimization strategies.
- Ability to create normalized, highly scalable logical and physical database designs and switch between database technologies such as Oracle, SQL Server, and Elastic databases.
- 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization.
- 5+ years of SQL Server development experience with strong programming experience writing stored procedures and functions.
- Excellent understanding of Snowflake and other data warehouses.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Understanding of AWS storage services and AWS Cloud Infrastructure offerings.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.

Benefits:
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you:
Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine of growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply now!

Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.

Job Applicant's Privacy Notice: click on this link to read the Job Applicant's Privacy Notice.

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Responsibilities:
- Develop reusable, typed frontend components using hooks and modern state management patterns.
- Ensure responsive UI/UX and cross-browser compatibility.
- Design RESTful or GraphQL APIs using Express and TypeScript.
- Model relational schemas and write optimized SQL queries and stored procedures.
- Optimize database performance using indexes, partitions, and EXPLAIN plans.
- Write unit and integration tests using Jest and React Testing Library.
- Participate actively in code reviews and maintain coding standards.

Qualifications - Required Skills:
- React.js with TypeScript (React 16+ with functional components and hooks)
- Node.js with TypeScript and Express
- MySQL (schema design, normalization, indexing, query optimization, stored procedures)
- HTML5, CSS3/Sass, ECMAScript 6+
- Git, npm/yarn, Webpack/Vite, ESLint/Prettier, Swagger/OpenAPI
- Jest, React Testing Library

Posted 1 day ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview

Job Title: Senior Engineer - Data SQL Engineer, AVP
Location: Pune, India

Role Description:
The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank

What We'll Offer You:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 and above

Your Key Responsibilities:
As a SQL Engineer, you will be responsible for the design, development, and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and you should possess expertise in data modeling, performance optimization, and working with large-scale relational databases.
- Design, develop, and optimize complex SQL queries, stored procedures, views, and functions
- Work with large datasets to perform data extraction, transformation, and loading (ETL)
- Develop and maintain scalable database schemas and models
- Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns
- Maintain data security and compliance with data governance policy

Your Skills and Experience:

Must have:
- 8+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL
- Strong working experience with PL/SQL and T-SQL
- Strong understanding of data modelling, normalization, and relational DB design

Desirable skills that will help you excel:
- Ability to write high-performance, heavily resilient queries in Oracle / PostgreSQL / MSSQL
- Working knowledge of database modelling techniques like Star Schema, Fact-Dimension models, and Data Vault
- Awareness of database tuning methods like AWR reports, indexing, partitioning of data sets, defining tablespace sizes, user roles, etc.
- Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets
- Good experience in performance tuning, query optimization, and indexing
- Hands-on experience with object storage and scheduling tools
- Experience with cloud-based data services like data lakes, data pipelines, and machine learning platforms

Educational Qualifications:
- Bachelor's degree in Computer Science/Engineering or a relevant technology and science field
- Technology certifications from any industry-leading cloud providers

How We'll Support You:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams:
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

Remote


Job Title: Voice Processing Specialist
Location: Remote / Jaipur
Job Type: Full-time / Contract
Experience: 3+ years of expertise in voice cloning, transformation, and synthesis technologies

Job Summary:
We are seeking a talented and motivated Voice Processing Specialist to join our team and lead the development of innovative voice technologies. The ideal candidate will have a deep understanding of speech synthesis, voice cloning, and transformation techniques. You will play a critical role in designing, implementing, and deploying state-of-the-art voice models that enhance the naturalness, personalization, and flexibility of speech in AI-powered applications. This role is perfect for someone passionate about advancing human-computer voice interaction and creating lifelike, adaptive voice systems.

Key Responsibilities:
- Design, develop, and optimize advanced deep learning models for voice cloning, text-to-speech (TTS), voice conversion, and real-time voice transformation.
- Implement speaker embedding and voice identity preservation techniques to support accurate and high-fidelity voice replication.
- Work with large-scale and diverse audio datasets, including preprocessing, segmentation, normalization, and data augmentation to improve model generalization and robustness.
- Collaborate closely with data scientists, ML engineers, and product teams to integrate developed voice models into production pipelines.
- Fine-tune neural vocoders and synthesis architectures for better voice naturalness and emotional range.
- Stay current with the latest advancements in speech processing, AI voice synthesis, and deep generative models through academic literature and open-source projects.
- Contribute to the development of tools and APIs for deploying models on cloud and edge environments with high efficiency and low latency.

Required Skills:
- Strong understanding of speech signal processing, speech synthesis, and automatic speech recognition (ASR) systems.
- Hands-on experience with voice cloning frameworks such as Descript Overdub, Coqui TTS, SV2TTS, Tacotron, FastSpeech, or similar.
- Proficiency in Python and deep learning frameworks like PyTorch or TensorFlow.
- Experience working with speech libraries and toolkits such as ESPnet, Kaldi, Librosa, or SpeechBrain.
- In-depth knowledge of mel spectrograms, vocoder architectures (e.g., WaveNet, HiFi-GAN, WaveGlow), and their role in speech synthesis (an extraction sketch follows this listing).
- Familiarity with REST APIs, model deployment, and cloud-based inference systems using platforms like AWS, Azure, or GCP.
- Ability to optimize models for performance in real-time or low-latency environments.

Preferred Qualifications:
- Experience in real-time voice transformation, including pitch shifting, timing modification, or emotion modulation.
- Exposure to emotion-aware speech synthesis, multilingual voice models, or prosody modeling.
- Background in audio DSP (Digital Signal Processing) and speech analysis techniques.
- Previous contributions to open-source speech AI projects or publications in relevant domains.

Why Join Us:
You will be part of a fast-moving, collaborative team working at the forefront of voice AI innovation. This role offers the opportunity to make a significant impact on products that reach millions of users, helping to shape the future of interactive voice experiences.

Skills: automatic speech recognition (ASR), vocoder architectures, voice cloning, voice processing, real-time voice transformation, speech synthesis, PyTorch, TensorFlow, voice conversion, speech signal processing, audio DSP, REST APIs, Python, cloud deployment, mel spectrograms, deep learning
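A minimal sketch of the mel-spectrogram extraction step that underpins the TTS and voice-cloning work described above, using librosa with common (but here assumed) parameter choices and a hypothetical audio path.

```python
import librosa
import numpy as np

wav, sr = librosa.load("sample.wav", sr=22050)   # hypothetical file, resampled
wav = wav / max(1e-9, np.abs(wav).max())         # peak-normalize the waveform

# 80-band log-mel features: typical settings for TTS front ends.
mel = librosa.feature.melspectrogram(
    y=wav, sr=sr, n_fft=1024, hop_length=256, n_mels=80
)
log_mel = librosa.power_to_db(mel, ref=np.max)   # log-compress the energies
print(log_mel.shape)  # (80 mel bands, n_frames): the usual vocoder/TTS input
```

Features like these are what acoustic models such as Tacotron predict and what vocoders such as HiFi-GAN invert back to audio.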

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

Remote


Job Title: AI Image Processing Specialist
Location: Remote / Jaipur
Job Type: Full-time / Contract
Experience: 3+ years in computer vision; medical imaging a plus

Job Summary:
We are seeking a highly skilled and detail-oriented AI Image Processing Specialist to join our team, with a strong focus on medical imaging, computer vision, and deep learning. In this role, you will be responsible for developing and optimizing scalable image processing pipelines tailored for diagnostic, radiological, and clinical applications. Your work will directly contribute to advancing AI capabilities in healthcare by enabling accurate, efficient, and compliant medical data analysis. You will collaborate with data scientists, software engineers, and healthcare professionals to build cutting-edge AI solutions with real-world impact.

Key Responsibilities:
- Design, develop, and maintain robust image preprocessing pipelines to handle various medical imaging formats such as DICOM, NIfTI, and JPEG2000 (a preprocessing sketch follows this listing).
- Build automated, containerized, and scalable computer vision workflows suitable for high-throughput medical imaging analysis.
- Implement and fine-tune models for core vision tasks, including image segmentation, classification, object detection, and landmark detection using deep learning techniques.
- Ensure that all data handling, processing, and model training pipelines adhere to regulatory guidelines such as HIPAA, GDPR, and FDA/CE requirements.
- Optimize performance across pipeline stages, including data augmentation, normalization, contrast adjustment, and image registration, to ensure consistent model accuracy.
- Integrate annotation workflows using tools such as CVAT, Labelbox, or SuperAnnotate, and implement strategies for active learning and semi-supervised annotation.
- Manage reproducibility and version control across datasets and model artifacts using tools like DVC, MLFlow, and Airflow.

Required Skills:
- Strong experience with Python and image processing libraries such as OpenCV, scikit-image, and SimpleITK.
- Proficiency in deep learning frameworks like TensorFlow or PyTorch, including experience with model architectures like U-Net, ResNet, or YOLO adapted for medical applications.
- Deep understanding of medical imaging formats, preprocessing techniques (e.g., windowing, denoising, bias field correction), and challenges specific to healthcare datasets.
- Experience working with computer vision tasks such as semantic segmentation, instance segmentation, object localization, and detection.
- Familiarity with annotation platforms, data curation workflows, and techniques for managing large annotated datasets.
- Experience with pipeline orchestration, containerization (Docker), and reproducibility tools such as Airflow, DVC, or MLFlow.

Preferred Qualifications:
- Experience with domain-specific imaging datasets in radiology, pathology, dermatology, or ophthalmology.
- Understanding of clinical compliance frameworks such as FDA clearance for software as a medical device (SaMD) or CE marking in the EU.
- Exposure to multi-modal data fusion, combining imaging with EHR, genomics, or lab data for holistic model development.

Why Join Us:
Be part of a forward-thinking team shaping the future of AI in healthcare. You'll work on impactful projects that improve patient outcomes, streamline diagnostics, and enhance clinical decision-making. We offer a collaborative environment, opportunities for innovation, and a chance to work at the cutting edge of AI-driven healthcare.

Skills: Docker, U-Net, MLFlow, containerization, image segmentation, SimpleITK, YOLO, image processing, computer vision, medical imaging, object detection, TensorFlow, OpenCV, PyTorch, image preprocessing, ResNet, Python, DVC, Airflow, scikit-image, annotation workflows
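A minimal sketch of the preprocessing steps this listing names (windowing and normalization) applied to a CT series, using SimpleITK. The directory path and window settings are assumptions chosen for illustration (a common soft-tissue window).

```python
import SimpleITK as sitk
import numpy as np

# Read a DICOM series from a directory (hypothetical path).
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("ct_series/"))
image = reader.Execute()

hu = sitk.GetArrayFromImage(image).astype(np.float32)  # (slices, H, W) in HU

# Windowing: clip to an assumed soft-tissue window (center 40 HU, width 400),
# then min-max normalize so downstream models see a consistent [0, 1] range.
low, high = 40 - 200, 40 + 200
windowed = np.clip(hu, low, high)
normalized = (windowed - low) / (high - low)
print(normalized.shape, normalized.min(), normalized.max())
```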

Posted 1 day ago


2.0 years

3 - 4 Lacs

Noida

On-site


We are looking for a highly skilled Sr. Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain high-quality, efficient, and well-documented code
- Troubleshoot and resolve technical issues
- Implement social network integration, payment gateway integration, and Web 2.0 features in web-based projects
- Work with RDBMS design, normalization, data modelling, transactions, and distributed databases (see the sketch after this listing)
- Develop and maintain database PL/SQL, stored procedures, and triggers

Requirements:
- 2+ years of experience in web-based project development using PHP
- Experience with open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana
- Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies
- Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Well-versed in MySQL (able to work with other SQL flavors too)
- Experience with social network integration, payment gateway integration, and Web 2.0 features in web-based projects

Job Type: Full-time
Pay: ₹25,000.00 - ₹40,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift / Morning shift
Education: Bachelor's (Required)
Experience: Total: 2 years (Required); WordPress: 2 years (Required); PHP: 2 years (Required); Laravel: 2 years (Required)
Location: Noida, Uttar Pradesh (Required)
Work Location: In person
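As a quick, hedged illustration of the RDBMS design and normalization skills requested above, this sketch splits repeating customer details out of an orders table into a referenced table (the usual 3NF move); sqlite3 stands in for MySQL, and all table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Denormalized data would repeat customer details on every order row.
# Normalized schema: customers are stored once and referenced by key.
conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      REAL NOT NULL
);
""")
conn.execute("INSERT INTO customers VALUES (1, 'Asha', 'Noida')")
conn.execute("INSERT INTO orders VALUES (101, 1, 2500.0), (102, 1, 999.0)")

# A join reassembles the denormalized view without storing anything twice.
for row in conn.execute("""
    SELECT o.order_id, c.name, c.city, o.amount
    FROM orders o JOIN customers c USING (customer_id)
"""):
    print(row)
```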

Posted 1 day ago


0 years

5 - 8 Lacs

Calcutta

On-site


Job requisition ID: 82238
Date: Jun 23, 2025
Location: Kolkata
Designation: Associate Director
Entity: Associate Director | SAP QM | Kolkata | SAP

Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose, and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team
SAP is about much more than just the numbers. It’s about attesting to accomplishments and challenges and helping to assure strong foundations for future aspirations. Deloitte exemplifies the what, how, and why of change, so you’re always ready to act ahead.

Your work profile
As a Manager in our SAP team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. An SAP QM professional should have:
- End-to-end project implementation experience in SAP SD in at least 12-15 projects (excluding support projects).
- Must-Have Skills: Proficiency in SAP Quality Management (QM)
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Qualifications
- Graduate degree (Science or Engineering) from premier institutes.
- Strong communication skills (written and verbal).
- Willingness to travel for short and long durations.

Your role as a leader
At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and to make an impact that matters.
- Actively focuses on developing effective communication and relationship-building skills
- Builds own understanding of our purpose and values; explores opportunities for impact
- Understands expectations and demonstrates personal accountability for keeping performance on track
- Understands how their daily work contributes to the priorities of the team and the business
- Demonstrates strong commitment to personal learning and development; acts as a brand ambassador to help attract top talent

How you’ll grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Posted 1 day ago


0 years

15 - 21 Lacs

Noida, Uttar Pradesh, India

On-site


We are inviting applications for the role of Lead Consultant – Java Kafka. The ideal candidate will have strong hands-on experience in Java-based microservices development using Kafka and Postgres. You will also work on Microsoft Access-based database solutions, ensuring data normalization and process integrity.

Primary Skills (Must-Have)
- Core Java (v1.8 or higher)
- Spring Boot & Spring Framework (Core, AOP, Batch)
- Apache Kafka
- PostgreSQL

Secondary Skills (Good to Have)
- Google Cloud Platform (GCP)
- CI/CD tools – CircleCI preferred
- GitHub – for version control and collaboration
- Monitoring tools – Splunk, Grafana

Key Responsibilities
- Develop and maintain enterprise-level applications using Java, Spring Boot, Kafka, and Postgres (a minimal produce/consume sketch follows this listing).
- Design, build, and maintain Microsoft Access databases with proper normalization and referential integrity.
- Implement and maintain microservices architectures for high-volume applications.
- Participate in code reviews, unit testing, and integration testing.
- Manage version control with GitHub and contribute to DevOps pipelines with CI/CD tools like CircleCI.
- Collaborate with cross-functional teams for application development and deployment on cloud-based infrastructure (preferably GCP).
- Monitor system performance using Splunk and Grafana and recommend improvements.

Qualifications
Minimum educational qualifications: BE / B.Tech / M.Tech / MCA in Computer Science, Information Technology, or a related field

Preferred Qualifications
- Experience with Oracle PL/SQL and SOAP/REST web services
- Familiarity with MVC frameworks such as Struts, JSF
- Hands-on experience with cloud-based infrastructure, preferably GCP

Skills: kafka,ci/cd tools – circleci,spring framework (core, aop, batch),monitoring tools – splunk,apache kafka,monitoring tools – grafana,spring boot,java,google cloud platform (gcp),postgresql,springboot,github,core java (v1.8 or higher)
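The role is Java/Spring-centric, but the Kafka produce/consume pattern behind it is language-neutral. Below is a minimal sketch using the confluent-kafka Python client; the broker address, topic name, and consumer group id are placeholder assumptions.

```python
from confluent_kafka import Producer, Consumer  # assumes confluent-kafka is installed

BROKER, TOPIC = "localhost:9092", "orders"  # illustrative values

# Produce one message; delivery is confirmed when flush() drains the queue.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, key="order-101", value=b'{"amount": 2500}')
producer.flush()

# Consume from the same topic in a named consumer group.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```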

Posted 1 day ago


5.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Job Title: ServiceNow CMDB Functional Consultant
Skills: CMDB, CSDM, ITOM, Discovery & Event Management
Experience: 5-12 years
Locations: Greater Noida, Pune & Bengaluru

You will be responsible for designing, implementing, and maintaining a CMDB system for an organization: collecting and organizing information about hardware, software, and other IT assets, as well as their relationships and dependencies. You must have a strong understanding of IT infrastructure and configuration management principles, as well as excellent communication and problem-solving skills.

Responsibilities:
- Analyzing the organization's current IT infrastructure and identifying areas for improvement in terms of configuration management.
- Developing a strategy and roadmap for implementing a CMDB system, including identifying the necessary data sources and integration points.
- Collaborating with various IT teams, such as network, server, and application teams, to gather and validate configuration data.
- Defining and documenting the data model and taxonomy for the CMDB, ensuring it aligns with industry best practices.
- Configuring and customizing the CMDB tool to meet the specific needs of the organization.
- Conducting data quality checks and implementing processes for ongoing data maintenance and governance.
- Providing training and support to end-users on how to use the CMDB system effectively.
- Collaborating with IT teams to ensure accurate and timely updates to the CMDB as changes are made to the IT infrastructure.
- Conducting regular audits of the CMDB to ensure data accuracy and completeness.
- Monitoring and reporting on key performance indicators (KPIs) related to the CMDB, such as data quality, compliance, and usage.
- Staying updated on industry trends and best practices in CMDB management and making recommendations for improvement.
- Working with external vendors and consultants as needed to support the CMDB system.

Preferred Qualifications:
- Strong knowledge of ITOM modules and CMDB.
- Experience with CMDB Class Manager, class hierarchy, and CMDB Manager policies.
- Strong knowledge of identification, normalization, and reconciliation rules (see the conceptual sketch after this listing).
- Configuration of CMDB classes and attributes; able to provide guidance to clients and other team members on ITOM best practices.
- Good knowledge of the TBM Taxonomy and its relationship with the CMDB.
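Identification, normalization, and reconciliation rules are configured within ServiceNow rather than hand-written, but the underlying logic can be sketched in plain Python to show what the rules do: match incoming configuration items (CIs) against ordered identifier keys, then let the more authoritative data source win on conflicting attributes. Everything here (rule order, field names, source priorities) is a conceptual assumption, not ServiceNow's API.

```python
# Conceptual sketch of CMDB identification + reconciliation (not ServiceNow code).
IDENTIFY_KEYS = [("serial_number",), ("name", "sys_class_name")]  # ordered rules
SOURCE_PRIORITY = {"discovery": 1, "manual": 2}  # lower number wins on conflict

def identify(existing, incoming):
    """Return the matching existing CI, or None if the record is new."""
    for keys in IDENTIFY_KEYS:
        if not all(incoming.get(k) for k in keys):
            continue  # rule not applicable without all of its keys
        for ci in existing:
            if all(ci.get(k) == incoming[k] for k in keys):
                return ci
    return None

def reconcile(ci, incoming):
    """Update a field only when the incoming source is at least as authoritative."""
    for field, value in incoming["attrs"].items():
        current_source = ci.setdefault("_src", {}).get(field, "manual")
        if SOURCE_PRIORITY[incoming["source"]] <= SOURCE_PRIORITY[current_source]:
            ci[field] = value
            ci["_src"][field] = incoming["source"]

cmdb = [{"serial_number": "SN-1", "name": "web01", "sys_class_name": "cmdb_ci_server"}]
update = {"serial_number": "SN-1", "source": "discovery",
          "attrs": {"ip_address": "10.0.0.5"}}
match = identify(cmdb, update)
if match:
    reconcile(match, update)
print(cmdb[0])  # the discovered ip_address is merged into the matched CI
```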

Posted 1 day ago


3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Apache Spark
Good-to-Have Skills: NA
Minimum Experience Required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the functionality and efficiency of the applications, as well as collaborating with the team to provide solutions to work-related problems. A typical day in this role involves designing and implementing application features, troubleshooting and debugging issues, and actively participating in team discussions to contribute to the development process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Design and implement application features based on business requirements.
- Troubleshoot and debug issues to ensure the functionality and efficiency of applications.
- Collaborate with the team to provide solutions to work-related problems.
- Stay updated with the latest technologies and industry trends.
- Conduct code reviews and provide constructive feedback to improve code quality.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity (a PySpark sketch follows this listing).

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Indore office.
- 15 years of full-time education is required.
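As one concrete instance of the data-munging bullet above, this hedged PySpark sketch min-max normalizes a numeric column; the column name and values are made up for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("normalize-demo").getOrCreate()
df = spark.createDataFrame([(1, 20.0), (2, 35.0), (3, 80.0)], ["id", "amount"])

# Compute the column bounds once, then rescale every row to [0, 1].
bounds = df.agg(F.min("amount").alias("lo"), F.max("amount").alias("hi")).first()
normalized = df.withColumn(
    "amount_norm",
    (F.col("amount") - F.lit(bounds["lo"])) / F.lit(bounds["hi"] - bounds["lo"]),
)
normalized.show()
```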

Posted 1 day ago


4.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Exp: 4 to 6 years
Location: Chennai (T. Nagar)
Work mode: 5 days work from office
Notice period: Immediate or max 30 days only

Job Summary:
We are seeking a highly skilled and motivated MSSQL Developer with strong Database Administrator (DBA) knowledge to join our dynamic team in Chennai. This role offers a unique opportunity to contribute to the development and maintenance of our critical database systems and applications. The ideal candidate will be proficient in designing, developing, and optimizing SQL Server databases. You will work closely with development teams to ensure efficient and reliable data solutions.

Responsibilities:

Database Development:
- Design, develop, and implement database schemas, tables, stored procedures, views, functions, and triggers using T-SQL.
- Write efficient and optimized SQL queries for data retrieval, manipulation, and reporting.
- Participate in the full software development lifecycle, from requirements gathering to deployment and maintenance.
- Develop and maintain ETL (Extract, Transform, Load) processes using tools like SSIS (SQL Server Integration Services).
- Ensure data integrity, accuracy, and consistency across the database systems.

Database Administration:
- Monitor database performance, identify bottlenecks, and implement performance tuning measures (e.g., indexing, query optimization).
- Troubleshoot database issues and provide timely resolutions.
- Develop and maintain database documentation, including data models, schemas, and operational procedures.

Collaboration and Communication:
- Collaborate effectively with application development teams to understand their data requirements and provide optimal database solutions.
- Communicate technical information clearly and concisely to both technical and non-technical stakeholders.
- Participate in code reviews and provide constructive feedback.
- Stay up to date with the latest SQL Server features, tools, and best practices.

Required Skills and Experience:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 4 to 6 years of hands-on experience as an MSSQL Developer.
- Proven experience in database design, development, and optimization using T-SQL.
- Solid understanding of relational database concepts, normalization, and data modeling.
- Demonstrable experience in SQL Server database administration tasks, performance tuning, and security.
- Experience with ETL processes and tools, preferably SSIS.
- Familiarity with database monitoring tools and techniques.
- Understanding of high availability and disaster recovery concepts and implementations.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Skills:
- Knowledge of other database technologies (e.g., NoSQL).
- Experience with scripting languages like PowerShell.
- Familiarity with agile development methodologies.

Posted 1 day ago


0.0 years

0 Lacs

Bengaluru, Karnataka

On-site


Role Overview
As a Senior Data Solutions Architect in the Business Analytics, Automation & AI team, you will be responsible for architecting and delivering comprehensive, end-to-end data solutions across cloud and on-premises platforms in the Business Intelligence and Artificial Intelligence domains. Your focus will include leading strategic data migration automation initiatives that optimize and automate the transfer of ERP, CRM, and other enterprise data to modern data platforms, ensuring data cleansing and high-quality, reliable datasets. This hands-on role also involves establishing and managing a small, high-performing team of data engineers and analysts that thrives on streamlined processes and rapid innovation. Leveraging an IT consulting mindset and experience with global enterprises and complex data ecosystems, you will inspire and nurture technical talent, driving a culture of continuous learning and development. As a leader, you will foster ambition and accountability through goal-oriented frameworks and actively contribute to transformative organizational initiatives that push beyond business as usual, pioneering digitization and data-driven transformation within the company.

Key Responsibilities
- Architect and deliver end-to-end data solutions across cloud and on-premises platforms, including AWS, Azure, Informatica, etc.
- Lead strategic data migration automation initiatives, optimizing and automating the movement of ERP, CRM, and other enterprise data to modern data platforms.
- Drive business intelligence transformation, ensuring robust data models, efficient ETL pipelines, and scalable analytics architectures for enterprise BI needs.
- Build and manage AI data architectures that support AI workflows, including handling unstructured and semi-structured data, real-time data streams, and large-scale datasets for model training and inference.
- Implement advanced data preprocessing steps such as data cleaning, normalization, encoding categorical variables, feature engineering, and data enrichment to prepare data optimally for AI models (a scikit-learn sketch follows this listing).
- Manage and mentor a team of 10 data engineers and analysts, fostering skill development in BI and AI data technologies.
- Collaborate with business/function stakeholders to align data architecture with business goals, ensuring solutions meet both technical and operational requirements.
- Establish and enforce data governance, data quality, and data security frameworks, using tools like Collibra or similar.
- Participate in strategic project engagements, leveraging consulting expertise to define and propose best-fit solutions.
- Ensure compliance with regulatory and security standards, implementing access controls, encryption, and audit mechanisms.

Required Skills & Qualifications

Technical Expertise:
- Deep hands-on experience with Informatica, AWS (including S3, Redshift)/Azure, Databricks, and Big Data platforms.
- Strong proficiency in Python, SQL, and NoSQL for building scalable ETL/data pipelines and managing structured/unstructured data.
- Experience with data governance tools (e.g., Collibra), data modeling, and data warehouse design.
- Knowledge of Tableau/Power BI/Alteryx is a must.
- Knowledge of ERP and CRM data structures and integration patterns.
- Familiarity with AI/ML frameworks like TensorFlow and PyTorch, and LLM orchestration tools (e.g., LangChain, LlamaIndex) to support AI model workflows.
- Proven skills in building modular, scalable, and automated ETL/AI pipelines with robust data quality and security controls.

Certifications:
- Certified Solutions Architect from AWS/Microsoft (Azure)/Google Cloud. Additional certifications in Databricks or Informatica are a plus.

Consulting Experience:
- Proven track record in an IT consulting environment, engaging with large enterprises and MNCs in strategic data solutioning projects.
- Strong stakeholder management, business needs assessment, and change management skills.

Leadership & Soft Skills:
- Experience managing and mentoring small teams, developing technical skills in the BI and AI data domains.
- Ability to influence and align cross-functional teams and stakeholders.
- Excellent communication, documentation, and presentation skills.
- Strong problem-solving, analytical thinking, and strategic vision.

Preferred Experience
- Leading large-scale data migration and transformation programs for ERP/CRM systems.
- Implementing data governance and security policies across multi-cloud environments.
- Working with global clients in regulated industries.
- Driving adoption of modern data platforms and BI/AI/automation solutions in enterprise settings.

Certifications
- AWS Certified Solutions Architect – Professional / Microsoft Certified: Azure Solutions Architect Expert
- AWS Certified Data Engineer – Professional / Databricks Certified Data Engineer Professional

Educational Qualifications:
A Master’s or Bachelor’s degree in Engineering or a Master of Computer Applications is required. A Master of Business Administration (MBA) is a plus.

Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing
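A minimal scikit-learn sketch of the preprocessing steps named in the responsibilities (imputation, normalization of numeric features, one-hot encoding of categoricals); the column names and data are assumptions for illustration.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder

df = pd.DataFrame({
    "revenue": [120.0, None, 340.0],      # numeric, with a missing value
    "region":  ["APAC", "EMEA", "APAC"],  # categorical
})

# Numeric columns: fill missing values, then min-max normalize to [0, 1].
numeric = Pipeline([("impute", SimpleImputer(strategy="median")),
                    ("scale", MinMaxScaler())])
preprocess = ColumnTransformer([
    ("num", numeric, ["revenue"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["region"]),
])
features = preprocess.fit_transform(df)  # model-ready feature matrix
print(features)
```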

Posted 1 day ago


7.5 years

0 Lacs

Gurugram, Haryana, India

On-site


Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Work may also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-Have Skills: Large Language Models
Good-to-Have Skills: NA
Minimum Experience Required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready solutions.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the implementation of large language models in AI applications.
- Research and apply cutting-edge AI techniques to enhance system performance.
- Contribute to the development of innovative AI solutions for complex business challenges.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 day ago


Exploring Normalization Jobs in India

The job market for normalization roles in India is growing rapidly as more companies recognize the importance of data quality and consistency. Normalization jobs involve organizing and structuring data to eliminate redundancy and improve efficiency in database management. If you are considering a career in normalization, this article will provide you with valuable insights into the job market in India.
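To make "eliminating redundancy" concrete, the short pandas sketch below factors a flat, repetitive record set into two normalized tables; the dataset is invented for illustration.

```python
import pandas as pd

# A flat, denormalized extract: the customer's city repeats on every order row.
flat = pd.DataFrame({
    "order_id": [101, 102, 103],
    "customer": ["Asha", "Asha", "Ravi"],
    "city":     ["Noida", "Noida", "Pune"],
    "amount":   [2500, 999, 1800],
})

# Normalization: store each customer once, keep only a reference in orders.
customers = flat[["customer", "city"]].drop_duplicates().reset_index(drop=True)
customers["customer_id"] = customers.index + 1
orders = flat.merge(customers, on=["customer", "city"])[["order_id", "customer_id", "amount"]]

print(customers)
print(orders)
```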

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Delhi

These cities are known for their thriving IT sectors and have a high demand for normalization professionals.

Average Salary Range

The average salary range for normalization professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10 lakhs per annum.

Career Path

A typical career path in normalization may involve starting as a Data Analyst, progressing to a Database Administrator, and eventually becoming a Data Architect or Database Manager. With experience and additional certifications, professionals can move into roles such as Data Scientist or Business Intelligence Analyst.

Related Skills

In addition to normalization skills, professionals in this field are often expected to have knowledge of database management systems, SQL, data modeling, data warehousing, and data analysis.

Interview Questions

  • What is normalization and why is it important? (basic)
  • Explain the difference between 1NF, 2NF, and 3NF. (medium)
  • How do you identify and resolve data anomalies in a database? (medium)
  • What is denormalization and when would you use it? (advanced)
  • Can you explain the benefits of using normalization in database design? (basic)
  • Describe the process of database normalization. (medium)
  • How do you handle redundant data in a database? (medium)
  • What are the limitations of normalization? (advanced)
  • How do you ensure data integrity in a normalized database? (medium)
  • What is the role of foreign keys in normalization? (medium)
  • Explain the concept of functional dependency in normalization. (medium) (a runnable example follows this list)
  • How do you optimize database performance while maintaining normalization? (advanced)
  • Can you give an example of a database that is not normalized and explain how you would normalize it? (medium)
  • What is the difference between horizontal and vertical partitioning in database normalization? (advanced)
  • How do you handle updates and inserts in a normalized database? (medium)
  • Explain the concept of transitive dependency in normalization. (advanced)
  • What are the steps involved in normalization? (basic)
  • How do you determine the appropriate normalization level for a database? (medium)
  • How do you handle null values in a normalized database? (medium)
  • What are the common pitfalls to avoid in normalization? (advanced)
  • How do you ensure data consistency across normalized tables? (medium)
  • Can you explain the concept of referential integrity in normalization? (medium)
  • How do you normalize a database with composite keys? (advanced)
  • Describe the benefits of using normalization in a data warehouse environment. (medium)
  • How do you handle data migration in a normalized database? (medium)
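
For the functional-dependency and 1NF/2NF/3NF questions above, recall that a dependency X → Y holds when every value of X determines exactly one value of Y. The small pandas check below makes that testable on sample data and shows the transitive dependency that 3NF removes; the table and column names are invented.

```python
import pandas as pd

def fd_holds(df, lhs, rhs):
    """True iff the functional dependency lhs -> rhs holds in df."""
    return bool((df.groupby(list(lhs))[list(rhs)].nunique() <= 1).all().all())

t = pd.DataFrame({
    "order_id":    [101, 102, 103],
    "customer_id": [1, 1, 2],
    "city":        ["Noida", "Noida", "Pune"],
})
print(fd_holds(t, ["order_id"], ["customer_id"]))  # True: the key determines all columns
print(fd_holds(t, ["customer_id"], ["city"]))      # True: a non-key dependency
# order_id -> customer_id and customer_id -> city together give the transitive
# dependency order_id -> city; 3NF removes it by splitting out a customers table.
```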

Closing Remark

As you prepare for interviews and explore job opportunities in the field of normalization, remember to showcase your expertise in database management and data structuring. With the right skills and knowledge, you can excel in this dynamic and growing field in India. Good luck with your job search!
