1.0 - 6.0 years
5 - 6 Lacs
Nagercoil
Work from Office
Job Summary: We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role will be responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications:
- 1+ years of experience in data migration, ETL, or database development roles.
- Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling).
- Proven experience using Salesforce Data Loader for bulk data operations.
- Solid understanding of Salesforce CRM architecture, including object relationships and schema design.
- Strong background in data transformation and cleansing techniques.

Nice to Have:
- Experience with large-scale data migration projects involving CRM or ERP systems.
- Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts.
- Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus.
- Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities:
- Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL).
- Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation.
- Utilize Salesforce Data Loader and/or the Apex Data Loader CLI to manage high-volume data imports and exports.
- Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail).
- Perform data cleansing, de-duplication, and transformation to ensure quality and consistency.
- Troubleshoot and resolve data-related issues, load failures, and anomalies.
- Collaborate with cross-functional teams to gather data-mapping requirements and ensure accurate system integration.
- Ensure data integrity and adherence to compliance standards, and document migration processes and mappings.
- Independently analyze, troubleshoot, and resolve data-related issues effectively.
- Follow best practices for data security, performance tuning, and migration efficiency.
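The cleansing and de-duplication step described above can be sketched in a few lines of plain Python. This is an illustrative example only, not the tooling named in the posting; the record layout and field names (Name, Email) are assumptions:

```python
def clean_records(records):
    """De-duplicate contact records on a normalised email key,
    trimming whitespace and lower-casing emails along the way.
    Field names here are hypothetical, for illustration only."""
    seen = set()
    cleaned = []
    for rec in records:
        email = rec.get("Email", "").strip().lower()
        if not email or email in seen:
            continue  # drop blanks and duplicates, keeping the first occurrence
        seen.add(email)
        cleaned.append({**rec, "Email": email, "Name": rec.get("Name", "").strip()})
    return cleaned

rows = [
    {"Name": " Asha ", "Email": "ASHA@example.com"},
    {"Name": "Asha",   "Email": "asha@example.com "},  # duplicate after normalisation
    {"Name": "Ravi",   "Email": ""},                   # blank email: rejected
]
print(clean_records(rows))  # one surviving, normalised record
```

In a real migration this logic would typically run during the transform stage, before Data Loader pushes the file into Salesforce.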
Posted 1 week ago
4.0 - 9.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Not Applicable Specialism: Data, Analytics & AI Management Level: Senior Associate

Summary: At PwC, our people in data management focus on organising and maintaining data to enable accuracy and accessibility for effective decision-making. These individuals handle data governance, quality control, and data integration to support business operations. In data governance at PwC, you will focus on establishing and maintaining policies and procedures to optimise the quality, integrity, and security of data. You will be responsible for optimising data management processes and mitigating risks associated with data usage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
- Develop and implement data governance solutions using Informatica CDGC.
- Configure and manage metadata ingestion, lineage, and data cataloging functionalities.
- Collaborate with data stewards to define and enforce data governance policies and standards.
- Design and implement data quality rules and metrics to monitor and improve data accuracy.
- Integrate CDGC with other enterprise systems and data sources for seamless metadata management.
- Work with business users to capture and maintain business glossaries and data dictionaries.
- Conduct data profiling and analysis to support data governance initiatives.
- Provide training and support to users on leveraging CDGC for data governance and cataloging.
- Participate in solution design reviews, troubleshooting, and performance tuning.
- Stay updated with the latest trends and best practices in data governance and cataloging.

Mandatory skill sets:
- 4+ years of experience in data governance and cataloging, with at least 1 year on the Informatica CDGC platform.
- Proficiency in configuring and managing Informatica CDGC components.
- Ability to integrate CDGC with various data sources and enterprise systems.
- Experience in debugging issues and applying fixes in Informatica CDGC.
- In-depth understanding of the data management landscape, including the technology landscape, standards, and best practices prevalent in data governance, metadata management, cataloging, data lineage, data quality, and data privacy.
- Familiarity with data management principles and practices in DMBOK.
- Experience in creating frameworks, policies and processes.
- Strong experimental mindset to drive innovation and solve problems amidst uncertainty.
- Strong experience in process improvement, hands-on operational management, and change management.

Preferred skill sets:
- Certifications in data governance or related fields (e.g., DAMA CDMP, DCAM, CDMC), or any data governance tool certification.
- Experience with other data governance tools such as Collibra, Talend, Microsoft Purview, Atlan, Solidatus, etc.
- Experience working on RFPs, internal/external POVs, Accelerators, and others.

Years of experience required: 4-7
Education qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Required Skills: Informatica Cloud Data Governance & Catalog (CDGC)
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Process Management (BPM), Communication, Corporate Governance, Creativity, Data Access Control, Database Administration, Data Governance Training, Data Processing, Data Processor, Data Quality, Data Quality Assessment, Data Quality Improvement Plans (DQIP), Data Stewardship, Data Stewardship Best Practices, Data Stewardship Frameworks, Data Warehouse Governance, Data Warehousing Optimization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 17 more}
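A data quality rule of the kind described above (monitoring accuracy and completeness) boils down to a simple metric. As an illustrative sketch only, not the CDGC rule syntax, here is a completeness check in plain Python; the sample records and field names are assumptions:

```python
def completeness(rows, field):
    """Percentage of rows where `field` is present and non-empty:
    the kind of metric a data-quality rule might track over time."""
    if not rows:
        return 100.0  # vacuously complete
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return round(100.0 * filled / len(rows), 1)

# Hypothetical customer records for illustration
customers = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "c@x.com"},
    {"id": 4},
]
print(completeness(customers, "email"))  # 2 of 4 filled -> 50.0
```

In a governance platform the same measurement would be configured declaratively and tracked against a threshold, but the underlying arithmetic is this.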
Posted 1 week ago
6.0 - 11.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Not Applicable Specialism: Data, Analytics & AI

Summary: At PwC, our people in data management focus on organising and maintaining data to enable accuracy and accessibility for effective decision-making. These individuals handle data governance, quality control, and data integration to support business operations. In data governance at PwC, you will focus on establishing and maintaining policies and procedures to optimise the quality, integrity, and security of data. You will be responsible for optimising data management processes and mitigating risks associated with data usage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities:
- Develop and implement data governance solutions using Informatica CDGC.
- Configure and manage metadata ingestion, lineage, and data cataloging functionalities.
- Collaborate with data stewards to define and enforce data governance policies and standards.
- Design and implement data quality rules and metrics to monitor and improve data accuracy.
- Integrate CDGC with other enterprise systems and data sources for seamless metadata management.
- Work with business users to capture and maintain business glossaries and data dictionaries.
- Conduct data profiling and analysis to support data governance initiatives.
- Provide training and support to users on leveraging CDGC for data governance and cataloging.
- Participate in solution design reviews, troubleshooting, and performance tuning.
- Stay updated with the latest trends and best practices in data governance and cataloging.

Mandatory skill sets:
- 6+ years of experience in data governance and cataloging, with at least 1 year on the Informatica CDGC platform.
- Proficiency in configuring and managing Informatica CDGC components.
- Ability to integrate CDGC with various data sources and enterprise systems.
- Experience in debugging issues and applying fixes in Informatica CDGC.
- In-depth understanding of the data management landscape, including the technology landscape, standards, and best practices prevalent in data governance, metadata management, cataloging, data lineage, data quality, and data privacy.
- Familiarity with data management principles and practices in DMBOK.
- Experience in creating frameworks, policies and processes.
- Strong experimental mindset to drive innovation and solve problems amidst uncertainty.
- Strong experience in process improvement, hands-on operational management, and change management.

Preferred skill sets:
- Certifications in data governance or related fields (e.g., DAMA CDMP, DCAM, CDMC), or any data governance tool certification.
- Experience with other data governance tools such as Collibra, Talend, Microsoft Purview, Atlan, Solidatus, etc.
- Experience working on RFPs, internal/external POVs, Accelerators, and others.

Years of experience required: 7-10
Education qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration
Required Skills: Informatica Cloud Data Governance & Catalog (CDGC)
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Process Management (BPM), Coaching and Feedback, Communication, Corporate Governance, Creativity, Data Access Control, Database Administration, Data Governance Training, Data Processing, Data Processor, Data Quality, Data Quality Assessment, Data Quality Improvement Plans (DQIP), Data Stewardship, Data Stewardship Best Practices, Data Stewardship Frameworks, Data Warehouse Governance, Data Warehousing Optimization, Embracing Change, Emotional Regulation, Empathy {+ 22 more}
Posted 1 week ago
5.0 - 10.0 years
10 - 11 Lacs
Hyderabad
Work from Office
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical inquiries regarding the use of, and fixes for, our Electronic Support Services. As a main point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

Requirements:
- Around 5+ years of experience as an Oracle Database engineer / core DBA.
- Clear understanding of Oracle Database architecture, including the latest versions (Oracle 12c and 19c).
- Knowledge and good experience in the following areas: RAC, ASM, Exadata, performance tuning, Data Guard (physical and logical), DG Broker.
- Experience in handling database recovery scenarios.
- Experience in solving various installation and patching issues.
- Exposure to and good knowledge of cloud technologies (Oracle Cloud Infrastructure (OCI/ExaCC), AWS and MS Azure) will be an added advantage.
- Ability to work under pressure: quick thinking and remaining calm during stressful situations.
- Ability to quickly grasp complex technical issues.
- Excellent written and verbal communication skills.
- Willingness to work in shifts (including night shifts as part of a 24x7 rota) and on weekends.
- Appetite to learn new technologies and constantly improve technical skills.
- Good academic background; Oracle certifications are a must.

An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry.
In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation. At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and in potential roles, to perform crucial job functions.

Career Level - IC2

As a Systems Engineer, you will interface with the customer's IT staff on a regular basis. Either at the client's site or from a remote location, you will be responsible for the resolution of moderately complex technical problems related to the installation, recommended maintenance, use, and repair/workarounds for Oracle products. You should have knowledge of some Oracle products and one platform that is being supported. You will be expected to work with only general guidance from senior engineers and management and, in some areas, may work independently. The requirements for this role are the same as those listed above, with experience in troubleshooting various installation and patching issues.
Posted 1 week ago
15.0 - 20.0 years
30 - 35 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organizations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Associate Director, Software Engineering Specialist. In this role you will design, build and implement virtualization solutions on VMware technologies, and develop and maintain VMware vSphere and vCenter Server environments.

Key Responsibilities:
- Troubleshoot and resolve complex environmental issues to ensure the availability, reliability and scalability of the virtualization infrastructure.
- Perform capacity planning and performance tuning of the virtualization infrastructure.
- Collaborate with the wider design and architecture teams to contribute to solution designs and develop engineering standards.
- Provide technical guidance and mentorship to junior engineers and administrators.
- Develop and implement disaster recovery and business continuity plans for the virtualization infrastructure.
- Monitor the virtualization infrastructure to ensure optimal performance, identify opportunities for improvement, and deliver solutions that maintain the reliability of the shared platform.
- Define, deploy and manage processes and tools for continuous integration (CI/CD), test-driven development, and release management.
- Maintain a customer-focused approach, ensuring that all solutions and services meet or exceed customer expectations.

Requirements: To be successful in this role you should meet the following requirements:
- Minimum bachelor's degree; B.Tech/M.Tech/M.Sc in Computer Science/IT preferred (any engineering field considered), or equivalent.
- 15+ years of total IT experience, with 12+ years of relevant experience with the VMware platform.
- Expertise in managing and maintaining VMware products (vSphere, vCenter, vROps, vRO, vRLI, vRLCM).
- Proven ability to design, implement and manage automation using Ansible, Puppet, or vRO workflows in large-scale virtualization infrastructure.
- Excellent troubleshooting and problem-solving skills for VMware-related issues.
- Experience in scripting and automation using PowerShell, Java, Python, or other scripting languages.
- Strong understanding of network and storage technologies in relation to virtualization platforms.
- Certification in VMware technology such as VCP, VCAP or VCI is preferred.
- Ability to work collaboratively in a team environment and communicate effectively with technical and non-technical stakeholders.
- Expertise in cluster management, including configuration of High Availability (HA), Distributed Resource Scheduler (DRS), and affinity/anti-affinity rules to ensure maximum service availability and system stability.

You'll achieve more when you join HSBC. HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
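The capacity-planning side of a role like this often reduces to an N+1 question: can the cluster still carry its load after losing one host? As an illustrative sketch only (not HSBC's tooling, and simplified to CPU capacity, with hypothetical figures), the check looks like this:

```python
def can_tolerate_host_failure(host_capacities_ghz, used_ghz):
    """N+1 admission check: after losing the single largest host,
    can the remaining capacity still carry current CPU demand?"""
    if len(host_capacities_ghz) < 2:
        return False  # a one-host cluster has no failover headroom
    surviving = sum(host_capacities_ghz) - max(host_capacities_ghz)
    return used_ghz <= surviving

cluster = [64.0, 64.0, 64.0, 64.0]  # four hosts, GHz of CPU each (hypothetical)
print(can_tolerate_host_failure(cluster, 180.0))  # 180 <= 192 -> True
print(can_tolerate_host_failure(cluster, 200.0))  # 200 >  192 -> False
```

vSphere HA admission control performs a richer version of this calculation (covering memory, slot sizes and reservations), but the headroom arithmetic is the same idea.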
Posted 1 week ago
3.0 - 6.0 years
16 - 18 Lacs
Gurugram
Work from Office
Most companies try to meet expectations; dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st-century human experience, then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo.

We're looking for a Big Data Engineer to join our strategic dh Loyalty and Personalisation team, which builds products retailers can use to find the optimal customer segments and send personalised offers and digital recommendations to consumers. These products are strategic assets that improve consumer loyalty, which makes them very important to retailers, and therefore to dunnhumby.

What we expect from you:
- 3 to 6 years of experience in software development using Python.
- Hands-on experience with Python OOP, design patterns, dependency injection, data libraries (pandas), and data structures.
- Exposure to Spark: PySpark, the architecture of Spark, and best practices for optimizing jobs.
- Experience with the Hadoop ecosystem: HDFS, Hive, or YARN.
- Experience with orchestration tools: Airflow, Argo Workflows, Kubernetes.
- Experience with cloud-native services (GCP/Azure/AWS), preferably GCP.
- Database knowledge: SQL, NoSQL.
- Hands-on exposure to CI/CD pipelines for data engineering workflows.
- Testing: pytest for unit testing, pytest-spark to create a test SparkSession, and Spark UI for performance tuning and monitoring.

Good to have:
- Scala

What you can expect from us: We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition.
But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day, with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. Everyone's invited. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.

Our approach to Flexible Working: At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice, which can be found (here).
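The segmentation work the role describes (finding optimal customer segments from spend data) can be illustrated with a small pure-Python stand-in for the kind of transformation a PySpark job would run at scale. The segment names and thresholds below are hypothetical, chosen only for the sketch:

```python
def segment_by_spend(customers, thresholds=(100.0, 500.0)):
    """Bucket customers into loyalty segments by total spend.
    `customers` maps customer id -> total spend; thresholds split
    occasional / regular / loyal (hypothetical cut-offs)."""
    low, high = thresholds
    segments = {}
    for cust_id, spend in customers.items():
        if spend >= high:
            segments[cust_id] = "loyal"
        elif spend >= low:
            segments[cust_id] = "regular"
        else:
            segments[cust_id] = "occasional"
    return segments

spend = {"c1": 742.5, "c2": 180.0, "c3": 42.0}
print(segment_by_spend(spend))
```

In production this would be a Spark DataFrame operation over millions of rows; keeping the core logic in a plain function like this is also what makes it easy to cover with pytest, as the posting's testing requirements suggest.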
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We are seeking a skilled and detail-oriented .NET Developer to join our development team. You will be responsible for designing, coding, testing, and maintaining scalable web and desktop applications using the .NET framework and related technologies.

Key Responsibilities:
- Design, develop, test, and deploy .NET applications.
- Write clean, scalable, and efficient code using C# and .NET Core / .NET Framework.
- Build and maintain APIs and web services (RESTful / SOAP).
- Collaborate with cross-functional teams including UI/UX designers, business analysts, and QA engineers.
- Maintain code quality and perform regular code reviews.
- Troubleshoot and debug applications.
- Work with databases such as SQL Server or MySQL.
- Ensure the best possible performance, quality, and responsiveness of applications.

Required Skills:
- .NET Core / .NET Framework
- ASP.NET MVC
- Web API
- Entity Framework
- Experience with SQL Server, including stored procedures and performance tuning
- Familiarity with version control systems like Git or TFS
- Strong understanding of object-oriented programming and design patterns

Preferred Skills (Nice to Have):
- Microservices architecture
- CI/CD pipelines
- Agile/Scrum methodologies
- Familiarity with front-end frameworks (e.g., Angular, React)

Qualifications:
- Bachelor's degree in Computer Science, IT, or a related field.
- [2-5+] years of relevant experience in .NET development.
Posted 1 week ago
1.0 - 4.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES
- Provides full design, planning, configuration, documentation, deployment and top-level support ownership of storage infrastructure technologies.
- Identifies design requirements and makes recommendations for capacity planning, performance optimization and future direction.
- Designs storage solutions per business requirements, including performing storage workload modeling for sizing, optimization and troubleshooting.
- Researches and compares system/OS features and works with vendors on system sizing for specific applications.
- Understands storage virtualization, data rationalization, workload automation, storage provisioning, disaster recovery and SAN fabric management.
- Troubleshoots storage-related reliability, availability, and performance issues.
- Collaborates on and implements architecture recommendations for application integration, system administration, problem management, preventive maintenance, and performance tuning.
- Identifies and eliminates performance bottlenecks and makes performance-related recommendations (hardware, software, configuration).
- Leads or participates in the software development lifecycle, which includes research, new development, modification, security, correction of errors, reuse, re-engineering and maintenance of software products.
- Manages or utilizes software that is built and implemented as a product, using best-in-class development process/lifecycle management (e.g., Agile, Waterfall).
- Gathers business requirements and participates in product definition and feature prioritization, including customer usability studies.
- Performs competitive analysis for features at a product-level scope.
- Leads the testing and fixing of new or enhanced products.
- Creates technical documentation of software products/solutions.
- Assists with the development and review of end-user and technical end-user documentation.
- Drives idea generation for new software products, or for the next version of an existing product.
- Protects intellectual property by working with the appropriate legal elements (e.g., procurement, patents, open source).
- Responsible for the delivery of products within budget, schedule and quality guidelines.
- Works with the team to develop, maintain, and communicate current development schedules, timelines and development status.
- Makes changes to system software to correct errors in the original implementation and creates extensions to existing programs to add new features or performance improvements.
- Designs and develops major functional or performance enhancements for existing products, or produces new software products or tools.
- Reviews requirements, specifications and designs to assure product quality; develops and implements plans and tests for product quality or performance assurance.

RESPONSIBILITIES
- Participates in the preparation, review and analysis of software/storage requirements and specifications.
- Prepares written specifications from verbal requirements for tasks of mid-level complexity.
- Prepares design, functional, technical and/or user documentation, as needed.
- Uses defined software lifecycle methodologies.
- Reviews and implements test strategies for software products.
- Follows source code and file revision control for projects.
- Clearly communicates project issues and status.
- Accurately logs project schedule, defect, and other data.
- Analyzes and prepares trend reports on quality metrics.
- Participates in improving product quality through process and procedure improvements.

Additional details: 1-4 years of strong hands-on experience in manual and functional testing along with automation (in Python or C#). Must have basic hardware knowledge (laptops, desktops, servers, and peripherals such as mouse, keyboard, audio devices, monitors). AI knowledge is good to have. Security and performance testing experience is good to have.
Posted 1 week ago
1.0 - 4.0 years
8 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES
- Provides full design, planning, configuration, documentation, deployment and top-level support ownership of storage infrastructure technologies.
- Identifies design requirements and makes recommendations for capacity planning, performance optimization and future direction.
- Designs storage solutions per business requirements, including performing storage workload modeling for sizing, optimization and troubleshooting.
- Researches and compares system/OS features and works with vendors on system sizing for specific applications.
- Understands storage virtualization, data rationalization, workload automation, storage provisioning, disaster recovery and SAN fabric management.
- Troubleshoots storage-related reliability, availability, and performance issues.
- Collaborates on and implements architecture recommendations for application integration, system administration, problem management, preventive maintenance, and performance tuning.
- Identifies and eliminates performance bottlenecks and makes performance-related recommendations (hardware, software, configuration).
- Leads or participates in the software development lifecycle, which includes research, new development, modification, security, correction of errors, reuse, re-engineering and maintenance of software products.
- Manages or utilizes software that is built and implemented as a product, using best-in-class development process/lifecycle management (e.g., Agile, Waterfall).
- Gathers business requirements and participates in product definition and feature prioritization, including customer usability studies.
- Performs competitive analysis for features at a product-level scope.
- Leads the testing and fixing of new or enhanced products.
- Creates technical documentation of software products/solutions.
- Assists with the development and review of end-user and technical end-user documentation.
- Drives idea generation for new software products, or for the next version of an existing product.
- Protects intellectual property by working with the appropriate legal elements (e.g., procurement, patents, open source).
- Responsible for the delivery of products within budget, schedule and quality guidelines.
- Works with the team to develop, maintain, and communicate current development schedules, timelines and development status.
- Makes changes to system software to correct errors in the original implementation and creates extensions to existing programs to add new features or performance improvements.
- Designs and develops major functional or performance enhancements for existing products, or produces new software products or tools.
- Reviews requirements, specifications and designs to assure product quality; develops and implements plans and tests for product quality or performance assurance.

RESPONSIBILITIES
- Participates in analysis of software/storage requirements and specifications.
- Participates in the creation of technical and/or user documentation, as needed.
- Develops, tests and integrates code for new or existing software.
- Provides sustaining or maintenance support to the existing software/storage environment.
- Follows source code revision control.
- Clearly communicates project issues and status.
- Provides constructive and responsive customer service to business partners.
- Accurately logs project schedule, defects, and other data into the appropriate databases.

Additional details: 1-4 years of strong hands-on experience in manual and functional testing along with automation (in Python or C#). Must have basic hardware knowledge (laptops, desktops, servers, and peripherals such as mouse, keyboard, audio devices, monitors). AI knowledge is good to have. Security and performance testing experience is good to have.
Posted 1 week ago
5.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI. Management Level: Senior Associate.
Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you.
What you will do: Build and manage high-performance data pipelines for batch and near real-time use cases. Write optimized, complex SQL queries and stored procedures for analytics and reporting. Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy. Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency. Build versioned, testable data models using DBT. Orchestrate multi-step workflows with Apache Airflow. Collaborate across teams to convert data needs into robust technical solutions.
Mandatory skill sets (must-have knowledge, skills and experience): 5+ years of hands-on experience in Data Engineering. Strong command of SQL and Python, especially for transformation and automation. Deep experience with DBT and Airflow in production environments. Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning. Strong analytical thinking and debugging skills.
Preferred skill sets (good to have): Experience with Teradata and Starburst (Presto/Trino). Familiarity with cloud platforms (Azure/GCP/Snowflake). Exposure to on-prem to cloud data migrations. Knowledge of Git-based workflows and CI/CD pipelines.
Years of experience required: 5-8 years. Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above). Degrees/Field of Study required: Master of Business Administration, Master of Engineering, Bachelor of Engineering.
Required Skills: Structured Query Language (SQL). Additional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, and 27 more.
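The full Pandas/DBT/Airflow stack above cannot be shown in a few lines, but the analytics-SQL pattern the posting describes (complex queries over large tables) can be sketched. A minimal, illustrative example of a common transformation step, deduplicating records with a window function: SQLite stands in for the warehouse, and the table and column names are invented for illustration.

```python
import sqlite3

# Toy "warehouse" table: several loads per customer, keep only the latest.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INT, amount REAL, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (1, 12.5, "2024-02-01"), (2, 7.0, "2024-01-15")],
)

# ROW_NUMBER() partitioned by customer, newest first: a typical
# dedupe pattern in analytics SQL (identical syntax works in most warehouses).
latest = conn.execute("""
    SELECT customer_id, amount FROM (
        SELECT customer_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY loaded_at DESC
               ) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer_id
""").fetchall()
print(latest)  # [(1, 12.5), (2, 7.0)]
```

In a DBT project, a query like this would typically live in a versioned model file rather than application code, so it can be tested and rebuilt on schedule.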
Posted 1 week ago
5.0 - 8.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Specialism: Data, Analytics & AI.
Summary: In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you.
What you will do: Build and manage high-performance data pipelines for batch and near real-time use cases. Write optimized, complex SQL queries and stored procedures for analytics and reporting. Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy. Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency. Build versioned, testable data models using DBT. Orchestrate multi-step workflows with Apache Airflow. Collaborate across teams to convert data needs into robust technical solutions.
Mandatory skill sets (must-have knowledge, skills and experience): 5+ years of hands-on experience in Data Engineering. Strong command of SQL and Python, especially for transformation and automation. Deep experience with DBT and Airflow in production environments. Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning. Strong analytical thinking and debugging skills.
Preferred skill sets (good to have): Experience with Teradata and Starburst (Presto/Trino). Familiarity with cloud platforms (Azure/GCP/Snowflake). Exposure to on-prem to cloud data migrations. Knowledge of Git-based workflows and CI/CD pipelines.
Years of experience required: 5-8 years. Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% or above). Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology, Master of Engineering.
Required Skills: Data Engineering, Python (Programming Language), Structured Query Language (SQL). Additional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, and 32 more.
Posted 1 week ago
5.0 - 8.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Position: Azure Cloud Migration Engineer / Senior Associate. Experience: 5-8 years. Location: Hyderabad. Type: Full Time. Certifications: Azure Solutions Architect Expert.
About the Role: We are seeking a hands-on Data Engineer with 5-8 years of experience specializing in on-premises database migration to the cloud. The ideal candidate will have a strong background in cloud migration, whether on-premises to Azure, or cloud to Azure.
Key Responsibilities: Experience with database migration strategy and related technologies. Experience migrating Microsoft SQL databases from on-prem to a cloud platform. Experience migrating from another cloud (e.g. AWS or GCP) to the Azure cloud platform. Review database migration plans, provide recommendations, and liaise with the customer and the Migration Factory for successful migrations. Ability to identify and resolve performance issues post-migration.
Required Skills: Knowledge of tools such as Azure Data Studio, Azure DMA, Azure DMS, and Azure SQL/MI. Familiarity with migrating SSIS packages using ADF; SSMA migration tool; database migration using SSIS packages. In-depth knowledge of SQL and T-SQL, and programming experience with stored procedures, triggers, functions, etc. Strong performance tuning (partitioning, execution plans, indexing, etc.), data modeling experience, dacpac deployment experience, and stored procedure optimization. Extensive experience with Azure Data Services, including Azure Data Factory, Azure SQL Data Warehouse, and Synapse Analytics. Strong understanding of database architecture principles and best practices.
Preferred Qualifications: Azure Solutions Architect Expert certification. Knowledge of SQL Clustering / Always On Availability Groups, Azure SQL, SQL MI. Experience with CTS/migration technologies (rehost/clean deployments). Should have experience with Azure projects and GitHub as a code repo. Knowledge of setting up High Availability & Disaster Recovery configurations. Experience setting up Always On Availability Groups and Zone Redundant Storage. Familiarity with Azure DevOps (ADO) for CI/CD pipelines, and Infrastructure as Code (IaC) tools like Terraform for deploying container apps and associated services. Knowledge of security best practices and compliance requirements for cloud-based applications. Backup/recovery, security-related activities, audit tracking and reporting. Maintaining, supporting, and monitoring production and test environments. Implementing High Availability such as Clustering, Database Mirroring, Log Shipping, and Replication.
Mandatory skill sets: Knowledge of tools such as Azure Data Studio, Azure DMA, Azure DMS, and Azure SQL/MI. Familiarity with migrating SSIS packages using ADF; SSMA migration tool; database migration using SSIS packages. In-depth knowledge of SQL and T-SQL, and programming experience with stored procedures, triggers, functions, etc.
Preferred skill sets: Azure Solutions Architect Expert certification. Knowledge of SQL Clustering / Always On Availability Groups, Azure SQL, SQL MI. Experience with CTS/migration technologies (rehost/clean deployments). Should have experience with Azure projects and GitHub as a code repo.
Years of experience required: 5-8 years. Education qualification: B.Tech / M.Tech / MCA / MBA. Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering.
Required Skills: Microsoft Azure. Additional skills: Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytical Thinking, Analytic Research, Big Data, Business Data Analytics, Coaching and Feedback, Communication, Complex Data Analysis, Conducting Research, Creativity, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, and 43 more.
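As a rough illustration of the post-migration validation this role calls for, the sketch below compares per-table row counts between a source and a target database and flags mismatches. SQLite stands in for on-prem SQL Server and Azure SQL here (a real project would connect to both via a driver such as pyodbc, or use the validation built into Azure DMS); the table name is invented.

```python
import sqlite3

def row_counts(conn, tables):
    """Return {table: row_count} for the given connection."""
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0] for t in tables}

# Two in-memory databases play the roles of source and migration target.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE patients (id INT)")
source.executemany("INSERT INTO patients VALUES (?)", [(1,), (2,), (3,)])
target.executemany("INSERT INTO patients VALUES (?)", [(1,), (2,)])  # one row lost

tables = ["patients"]
src, tgt = row_counts(source, tables), row_counts(target, tables)
mismatches = {t: (src[t], tgt[t]) for t in tables if src[t] != tgt[t]}
print(mismatches)  # {'patients': (3, 2)} -> flag for investigation
```

Row counts are only a first-pass check; production validation would typically add checksums or column-level comparisons on top.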
Posted 1 week ago
3.0 - 4.0 years
37 - 45 Lacs
Bengaluru
Work from Office
We are seeking a skilled, experienced database administrator to join our organization. In this position, you will manage all our Dev and QA databases. You will be responsible for maintaining the environments and keeping them healthy, working with Oracle Support to resolve any issues, and collaborating with application teams to ensure application activity does not disrupt DB health. An organized, detail-oriented work ethic is a must. Duties and Responsibilities: Monitor databases for proper performance. Install upgrades and maintain systems. Manage storage for all applications. Manage security and report incidents. Maintain server configuration, migration, and other implementations. Perform root cause analysis and track resolutions. Install and maintain servers and environments. Manage backups and recovery procedures. Create custom monitoring and maintenance plans. Administer and test new upgrades and databases. Troubleshoot and resolve database problems. Ensure audit trails are maintained and documented. Requirements and Qualifications: Bachelor's degree in computer science, information technology, engineering, or a related field. Three to four years of SQL Server database experience or database administration experience in a commercial environment, or equivalent work experience. Experience with any of the following: Oracle up to version 12c (23ai will be beneficial), GoldenGate, SQL, PL/SQL, Oracle OEM, UNIX. Experience with database migration, performance tuning and optimization, and setting up and managing database connections. Able to multitask, prioritize, and manage time efficiently. Accurate and precise attention to detail. Strong written and verbal communication skills. Excellent analytical, quantitative, and organizational skills. Up to date on the latest industry trends; able to articulate trends and potential clearly and confidently. Good interpersonal skills and communication with all levels of management.
Posted 1 week ago
8.0 - 11.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Job Title: PostgreSQL Database Developer. Responsibilities: Design, develop, and maintain PostgreSQL databases. Optimize database performance and ensure data integrity. Write complex SQL queries, stored procedures, and triggers. Collaborate with software developers to integrate database solutions. Perform database tuning and troubleshooting. Implement and manage database backup, recovery, and security measures. Monitor database performance and provide technical support. Stay updated with the latest PostgreSQL features and best practices. Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a PostgreSQL Database Developer or in a similar role. Strong proficiency in SQL and PostgreSQL. Experience with database design, optimization, and performance tuning. Familiarity with database security and backup/recovery processes. Excellent problem-solving skills and attention to detail. Ability to work independently and as part of a team. Good communication skills. Preferred Qualifications: Experience with other database systems (e.g., MySQL, Oracle). Knowledge of programming languages such as Python, Java, or C#. Experience with cloud-based database solutions (e.g., AWS RDS, Azure SQL Database). EXPERIENCE: 8-11 Years. SKILLS: Primary Skill: Database Administrator. Sub Skill(s): Database Administrator. Additional Skill(s): C#, Python, PostgreSQL DBA, Oracle DBA, SQL Server DBA.
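One of the listed responsibilities, writing triggers, can be sketched as follows: a trigger that keeps an audit timestamp current on every update. SQLite's trigger syntax is used here purely so the sketch runs self-contained; in PostgreSQL the same idea would be a PL/pgSQL trigger function attached with CREATE TRIGGER. Table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL, updated_at TEXT);
    -- After any update to a row, stamp that row with the current time.
    CREATE TRIGGER accounts_touch AFTER UPDATE OF balance ON accounts
    BEGIN
        UPDATE accounts SET updated_at = datetime('now') WHERE id = NEW.id;
    END;
""")
conn.execute("INSERT INTO accounts (id, balance, updated_at) VALUES (1, 100.0, NULL)")
conn.execute("UPDATE accounts SET balance = 90.0 WHERE id = 1")
balance, updated_at = conn.execute(
    "SELECT balance, updated_at FROM accounts WHERE id = 1"
).fetchone()
print(balance, updated_at)  # 90.0 plus a freshly written timestamp
```

The trigger fires only on `balance` updates, so its own write to `updated_at` does not re-trigger it; a PostgreSQL version would usually do the stamping in a BEFORE UPDATE trigger on NEW instead.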
Posted 1 week ago
8.0 - 12.0 years
27 - 30 Lacs
Bengaluru
Work from Office
1. Strong development knowledge in database design and development, with 6 to 10 years of experience (PostgreSQL) (Mandatory)
2. Strong hands-on skills writing complex PL/pgSQL, procedures, and functions, and preventing blocking and deadlocks
3. Conduct SQL object code reviews and performance tuning (Mandatory)
4. Hands-on experience with Microsoft SQL Server and MySQL is an advantage
5. Strong knowledge of RDBMS and NoSQL concepts, with strong logical thinking and solutions (Highly required)
6. Expertise in transactional databases (OLTP) and ACID properties, with experience handling large-scale application databases (Mandatory)
7. Consult with application developers to suggest SQL/PL/pgSQL best-practice solutions
8. Good communication and written skills
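The deadlock prevention mentioned in point 2 usually comes down to one rule: take locks in a single global order. A minimal sketch of that idea, using Python threading locks in place of database row locks (in PostgreSQL the analogue is ordering `SELECT ... FOR UPDATE` by primary key; account ids and amounts here are illustrative):

```python
import threading

# Deadlocks arise when two transactions take the same locks in
# opposite orders. Sorting the lock keys first imposes one global
# acquisition order, so the wait-for cycle can never form.
locks = {1: threading.Lock(), 2: threading.Lock()}
balances = {1: 100, 2: 100}

def transfer(src, dst, amount):
    first, second = sorted((src, dst))   # consistent global lock order
    with locks[first], locks[second]:
        balances[src] -= amount
        balances[dst] += amount

t1 = threading.Thread(target=transfer, args=(1, 2, 30))
t2 = threading.Thread(target=transfer, args=(2, 1, 10))
t1.start(); t2.start(); t1.join(); t2.join()
print(balances)  # {1: 80, 2: 120}
```

Without the `sorted()` call, the two opposing transfers could each hold one lock while waiting for the other, which is exactly the deadlock pattern the posting asks candidates to prevent.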
Posted 1 week ago
6.0 - 8.0 years
37 - 40 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
6+ years of experience in data engineering/warehousing, with at least 2+ years in BigQuery and GCP.
Strong expertise in SQL query optimization, BigQuery scripting, and performance tuning.
Hands-on experience with ETL/ELT tools like Cloud Dataflow (Apache Beam), Cloud Composer (Airflow), dbt, Talend, Matillion, or Informatica IICS.
Experience with Cloud Storage, Pub/Sub, and Dataflow for real-time and batch data ingestion.
Proficiency in Python or Java for scripting and data processing tasks.
Experience with semi-structured data (JSON, Avro, Parquet) and BigQuery ingestion methods.
Familiarity with CI/CD pipelines, Terraform, Git, and Infrastructure as Code (IaC).
Strong understanding of data governance, security policies, and compliance standards in GCP.
Experience working in Agile/Scrum environments and following DevOps practices.
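The core of the ETL/ELT work described above is a parse-cleanse-aggregate step over semi-structured records. A stdlib-only sketch of that shape (the field names and events are hypothetical; in production this logic would live in a Dataflow/Beam transform or a dbt model feeding BigQuery):

```python
import json
from collections import defaultdict

# Batch-transform sketch: parse JSON-lines events, drop bad records
# (cleanse), and aggregate per key before loading downstream.
raw_events = [
    '{"user": "a", "bytes": 120, "ok": true}',
    '{"user": "b", "bytes": 300, "ok": false}',
    '{"user": "a", "bytes": 80,  "ok": true}',
]

totals = defaultdict(int)
for line in raw_events:
    event = json.loads(line)
    if event["ok"]:                      # cleanse: keep successful events only
        totals[event["user"]] += event["bytes"]

print(dict(totals))  # {'a': 200}
```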
Posted 1 week ago
1.0 - 3.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Key Responsibilities:
Provide L2/L3 technical support for all hospital applications, ensuring minimal downtime and fast issue resolution.
Troubleshoot and resolve application/database-related incidents reported by clinical and non-clinical users.
Work closely with the application vendor and internal IT teams for issue escalation and application patch management.
Perform regular MS SQL database health checks, backups, optimization, and performance tuning.
Write and maintain SQL scripts, queries, and stored procedures for data extraction and reporting.
Participate in UAT (User Acceptance Testing) for application upgrades and deployments.
Maintain documentation for system configurations, workflows, and troubleshooting steps.
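A scripted database health check, as this role requires, typically bundles an integrity check with a few sanity queries into one report. The sketch below uses SQLite's `PRAGMA integrity_check` (via stdlib `sqlite3`) purely as a stand-in; on SQL Server the equivalents would be `DBCC CHECKDB` and the `sys.dm_*` dynamic management views, and the table here is hypothetical.

```python
import sqlite3

# Hedged health-check sketch: verify storage integrity and collect a
# basic row-count metric into a single report dictionary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO patients VALUES (1, 'demo')")

integrity = conn.execute("PRAGMA integrity_check").fetchone()[0]
row_count = conn.execute("SELECT COUNT(*) FROM patients").fetchone()[0]

report = {"integrity": integrity, "patients": row_count}
print(report)  # {'integrity': 'ok', 'patients': 1}
```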
Posted 1 week ago
2.0 - 5.0 years
8 - 11 Lacs
Bengaluru
Work from Office
Interested in working for an international and diverse company? Interested in developing your career in the water industry? Interested in working for a company that is dedicated to sustainability? If so, read on!

Protecting water, the most valuable resource, and driving sustainability is very close to our hearts. You will be part of a flexible, family-friendly organization that cares about its people just as it cares about the environment.

We Offer:
Flexible working hours
Professional onboarding and training options
A powerful team looking forward to working with you
Career coaching and development opportunities
Health benefits

In this role as a Data Scrubber, you will be responsible for driving incoming leads from various channels to revenue. This involves technically and commercially qualifying leads and then handing the qualified leads over to the appropriate sales associate. By doing so, sales efficiency improves, as the team can focus on qualified leads while unqualified leads are filtered out at the Lead Qualifier level. This position works closely with the Lead Qualification Manager and Marketing Manager. The Lead Qualifier is responsible for qualifying leads and routing qualified opportunities to the appropriate sales executives for further development and closure. They will also take ownership of low-value opportunities and close them. This role requires close interaction with various functions such as the field sales team, logistics, and finance. Additionally, the Lead Qualifier provides insights into the types of leads received to help plan campaigns better. The Lead Qualifier will collaborate with the Global Lead Qualification team to understand and implement best practices across geographies and suggest improvements where needed. The ideal candidate is energetic and experienced in understanding customer requirements and suggesting the right products.

The Data Scrubber assists the Marketing team with database scrubbing activities. They will scrub the SFDC database and third-party databases to provide the inside sales team with a refined list of contacts for cold/warm calling. The Data Scrubber works collaboratively with the Marketing, Lead Qualification Manager, and Inside Sales teams to assist with database scrubbing. The ideal candidate should have a good telephonic presence, be energetic, and be proactive in handling tasks. This position is part of the Lead Qualifier & Data Scrubber Team located in Bangalore (hybrid mode).

In This Role, a Typical Day Will Look Like:
Handle leads coming from marketing campaigns, the website, inbound calls, and emails.
Monitor open leads through to opportunities and closure; work closely with sales channels.
Manage the leads process, own lead qualification (BANT), qualify leads, and route qualified opportunities to the appropriate sales executives for further development and closure.
Develop an understanding of competitive products.
Develop customer quotations.
Populate and maintain the highest standards of data integrity in Salesforce.com.
Work closely with the field sales team, logistics, and the Finance department.
Collect and provide constructive feedback to cross-functional lead sources to drive continuous improvement of the process.
Demonstrate technical and application knowledge to provide prompt, accurate answers and successfully qualify leads; follow standard work to contact and convert leads to opportunities per the sales cadence.

The Essential Requirements of the Job Include:
Language expertise: English.
Salesforce.com database optimization: manage the contact master database in SFDC, update market codes, and update visibility metrics.
Call the SFDC database, check for data accuracy, and update the data if it is outdated.
Call third-party lists provided by the Marketing department and check whether Hach products fit into their portfolio; if they do, capture this in a document.
Report the progress of calls daily to the Lead Qualification Marketing Manager.

WATER QUALITY PLATFORM
The Water Quality (WQ) Platform is part of the Environmental & Applied Solutions reporting segment and is a global leader in water quality analysis and treatment, providing instrumentation and disinfection systems to help analyze and manage the quality of ultra-pure water, potable water, wastewater, groundwater, and ocean water in residential, commercial, industrial, and natural resource applications. Our water quality business provides products under a variety of brands, including Hach, Trojan Technologies, McCrometer, and ChemTreat. WQ Asia has sales offices in India, Australia, New Zealand, Singapore, South Korea, Thailand, Malaysia, Indonesia, Vietnam, and the Philippines.

At Hach (www.hach.com), we ensure water quality for people around the world, and every associate plays a vital role in that mission. Our founding vision is to make water analysis better: faster, simpler, greener, and more informative. We accomplish this through teamwork, customer partnerships, passionate experts, and reliable, easy-to-use solutions. As part of our team, you'll make an immediate, measurable impact on a global scale by enabling the world's everyday water needs. You'll also belong to a respectful and collaborative community that fosters career growth and professional development. You'll be supported by resources that make a positive difference in your life because, at Hach, we value your authenticity and want your talents to shine. Motivated by the highest possible stakes of climate change and global health, we're working together within a rapidly digitizing industry to find innovative technologies that guarantee the safety of our water and our environment. More about us: https://www.hach.com/about-us

Hach is proud to be a Water Quality company in Veralto (NYSE: VLTO). Imagine a world where everyone has access to clean water, safe food and medicine, and trusted essential goods. That is the tomorrow Veralto is creating today. Veralto is a $5B global leader in essential technology solutions made up of over 16,000 associates across our Water Quality and Product Identification segments, all united by a powerful purpose: Safeguarding the World's Most Vital Resources. At Veralto, we value diversity and the existence of similarities and differences, both visible and not, found in our workforce, workplace, and throughout the markets we serve. Our associates, customers, and shareholders contribute unique and different perspectives as a result of these diverse attributes. If you've ever wondered what's within you, there's no better time to find out.

Unsolicited Assistance: We do not accept unsolicited assistance from headhunters or recruitment firms for any of our job openings. All resumes or profiles submitted by search firms to any employee at any of the Veralto companies (https://www.veralto.com/our-companies/), in any form, without a valid, signed search agreement in place for the specific position, approved by Talent Acquisition, will be deemed the sole property of Veralto and its companies. No fee will be paid in the event a candidate is hired by Veralto and its companies as a result of an unsolicited referral.

Veralto and all Veralto companies are committed to equal opportunity regardless of race, color, national origin, religion, sex, age, marital status, disability, veteran status, sexual orientation, gender identity, or other characteristics protected by law.
Posted 1 week ago
7.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Description

Company Overview: Schneider Electric is a global leader in energy management and automation, committed to providing innovative solutions that ensure Life Is On everywhere, for everyone, and at every moment. We are expanding our team in Gurugram and looking for a Senior Cloud Architect to enhance our cloud capabilities and drive the integration of digital technologies in our operations. As a Principal Technical Expert at Schneider Electric, you will play a crucial role in developing and implementing IoT solutions across our global infrastructure, with a primary focus on edge software. This position requires a blend of strategic architectural design and practical hands-on ability to implement, manage, and optimize edge-based software solutions, ensuring efficient data processing for a large-scale fleet of edge gateways and devices (hundreds of thousands) deployed in the field.

Key Responsibilities:
Architect and develop scalable, high-performance edge computing solutions for IoT applications.
Lead the design and implementation of asynchronous task processing using Python (asyncio, Twisted, Tornado, etc.) for efficient data handling and device communication.
Develop and optimize IoT data pipelines, integrating sensors, edge devices, and cloud-based platforms.
Collaborate with cross-functional teams to define edge computing strategies, system architectures, and best practices.
Work on device-to-cloud communication using MQTT, WebSockets, or other messaging protocols.
Ensure software is secure, reliable, and optimized for resource-constrained edge environments.
Design and optimize Linux-based networking for edge devices, including network configuration, VPNs, firewalls, and traffic shaping.
Implement and manage Linux process management, including systemd services, resource allocation, and performance tuning for IoT applications.
Conduct code reviews, mentor junior developers, and provide technical leadership in edge software development.
Stay updated with emerging IoT, edge computing, and Linux networking technologies.

Technical Requirements:
7-10 years of overall experience in software engineering with a strong focus on Python development.
Expertise in Python, with experience in asynchronous programming, task-processing frameworks, and web frameworks (e.g., asyncio, Twisted, FastAPI, Flask).
Strong knowledge of Linux networking, including TCP/IP, DNS, firewalls (iptables/nftables), VPNs, and network security.
Experience in Linux process management, including systemd, resource limits (cgroups), and performance tuning.
Good understanding of IoT architectures, protocols (MQTT, HTTP/REST), and edge computing frameworks.
Hands-on experience with Docker.
Proficiency with Git or another VCS.
Excellent problem-solving skills and the ability to lead complex technical projects.

Good to have:
Knowledge of Rust, C++, or Golang for performance-critical edge applications.
Prior experience working in IoT.
Understanding of BACnet/Modbus protocols.
Familiarity with cloud IoT platforms (AWS IoT, Azure IoT, Google Cloud IoT) and their integration with edge devices.

Soft Skills:
Excellent problem-solving abilities and strong communication skills.
Advanced verbal and written communication skills, including the ability to explain and present technical concepts to a diverse set of audiences.
Comfortable working directly with both technical and non-technical audiences.
Good judgment, time management, and decision-making skills.
Strong teamwork and interpersonal skills; ability to communicate and thrive in a cross-functional environment.
A guardian against technical debt, ensuring our legacy remains pristine.
Willingness to work outside the documented job description; a "whatever is needed" attitude.

Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Working experience designing robust, scalable, and maintainable asynchronous Python applications.
Prior experience building cloud-connected edge IoT solutions.
Prior experience in the energy sector or industrial automation is advantageous.

Schedule: Full-time
Req: 009C76
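The asynchronous task-processing pattern this role centers on (many devices producing, a worker pool consuming) can be sketched with nothing but `asyncio`. Gateway names and the doubling "transform" below are illustrative stand-ins for real device payloads and processing:

```python
import asyncio

# Producer/consumer sketch: edge devices push readings onto a shared
# queue; a small pool of workers drains it concurrently.
async def device(queue, name, readings):
    for value in readings:
        await queue.put((name, value))

async def worker(queue, processed):
    while True:
        name, value = await queue.get()
        processed.append((name, value * 2))   # stand-in transform
        queue.task_done()

async def main():
    queue = asyncio.Queue()
    processed = []
    workers = [asyncio.create_task(worker(queue, processed)) for _ in range(3)]
    await asyncio.gather(
        device(queue, "gw-1", [1, 2]),
        device(queue, "gw-2", [3]),
    )
    await queue.join()          # wait until every queued item is handled
    for w in workers:
        w.cancel()
    return processed

processed = asyncio.run(main())
print(sorted(processed))  # [('gw-1', 2), ('gw-1', 4), ('gw-2', 6)]
```

In a real edge gateway the producers would be MQTT or WebSocket readers and the workers would batch, filter, and forward to the cloud, but the backpressure and fan-in structure is the same.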
Posted 1 week ago
0.0 years
1 - 1 Lacs
Hyderabad
Work from Office
We are looking for an experienced High-Performance Computing (HPC) Engineer to design, develop, and optimize computational systems and software for large-scale data processing, simulations, and advanced analytics. The ideal candidate will have strong programming skills in C/C++ and experience with parallel computing, performance tuning, and HPC infrastructure.

Key Responsibilities:
Develop and maintain high-performance software using C/C++, focusing on parallel and distributed systems.
Optimize code for multi-core CPUs and GPU architectures using OpenMP, MPI, CUDA, or OpenCL.
Profile, benchmark, and debug performance-critical code using tools like gprof, Valgrind, VTune, or nvprof.
Collaborate with researchers and developers to adapt scientific algorithms for scalable computing.
Work with Linux-based systems to manage HPC environments and integrate job schedulers such as SLURM or PBS.
Contribute to the design and scaling of HPC systems and clusters.
Support users in porting, debugging, and optimizing applications on HPC platforms.

Required Skills:
Proficiency in C/C++ with strong fundamentals in memory management and performance optimization.
Hands-on experience with parallel computing frameworks: OpenMP, MPI, CUDA, or OpenCL.
Knowledge of system-level programming and Linux development environments.
Familiarity with profiling and debugging tools for both CPU and GPU.
Understanding of computer architecture, concurrency, cache optimization, and SIMD/vectorization.
Experience with version control systems such as Git.

Preferred Qualifications:
Familiarity with scientific computing libraries (e.g., Eigen, LAPACK, cuBLAS, PETSc, Trilinos).
Experience working with HPC clusters, file systems (Lustre, GPFS), and InfiniBand or other high-speed interconnects.
Exposure to real-time or robotics simulation environments.
Knowledge of numerical analysis and floating-point computation issues.
Education: Bachelor's or Master's degree in Computer Science, Electrical Engineering, Physics, or a related technical field.
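The fork-join parallel reduction at the heart of OpenMP-style work (`#pragma omp parallel for reduction(+:sum)` in C/C++) can be sketched conceptually in Python. This shows only the decomposition pattern, not real CPU parallelism (the GIL serializes these threads); the chunk count and workload are arbitrary:

```python
from concurrent.futures import ThreadPoolExecutor

# Fork-join reduction sketch: statically decompose the input into
# chunks, reduce each chunk in a worker, then combine partial results.
def partial_sum(chunk):
    return sum(x * x for x in chunk)         # local reduction per worker

data = list(range(1000))
chunks = [data[i::4] for i in range(4)]      # static decomposition, 4 workers

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))   # combine partials

print(total)  # equals sum(x * x for x in range(1000))
```

The same three steps (decompose, reduce locally, combine) underlie OpenMP reductions, MPI_Reduce, and CUDA block-level reductions alike.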
Posted 1 week ago
8.0 - 13.0 years
2 - 7 Lacs
Hyderabad
Work from Office
Required Expertise:
Kernel Programming: Strong knowledge of Linux storage subsystems (block layer, VFS, I/O stack); proficiency in C and kernel debugging techniques.
Storage Protocols & Interfaces: Hands-on experience with eMMC, UFS, NVMe, USB mass storage, SATA, SPI-NAND/NOR, SDIO, etc.; understanding of storage standards (SCSI, AHCI, the NVMe spec, JEDEC).
Filesystems: Deep knowledge of ext4 and f2fs, and familiarity with log-structured or flash-optimized file systems.
Performance & Tuning: Expertise in tuning I/O performance and handling flash-specific issues (latency, endurance, etc.).
Tools: blktrace, iostat, fio, perf, gdb, crash, etc.
Security: Secure storage handling, key management, dm-verity/dm-crypt, rollback protection.
Yocto/Build Systems (optional but useful): Understanding of build flows for embedded Linux using Yocto or Buildroot.

Walk-in interviews are held every day except Sunday.
Posted 1 week ago
3.0 - 8.0 years
12 - 13 Lacs
Mumbai
Work from Office
Build data applications with high accuracy and performance across traditional and distributed computing platforms. Design, build, and maintain high-performance, reusable, and reliable code, ensuring quality and features are delivered efficiently and on time. Document everything. Develop database processes and gather and process raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries in MySQL, and handling data in the cloud). Administer data processing workflows with tools like MySQL, Oozie, ZooKeeper, Sqoop, Hive, and Impala for data processing across the distributed platform. Work closely with our engineering team to integrate your innovations and algorithms into our production systems. Support business decisions with ad hoc analysis as needed, and troubleshoot production issues and identify practical solutions. Perform routine check-ups, backups, and monitoring of the entire MySQL and Hadoop ecosystem. Take end-to-end responsibility for the traditional databases (MySQL) and the Big Data ETL, analysis, and processing life cycle in the organization, and manage deployments of big data clusters across private and public cloud platforms.

Required Skills:
4+ years of experience with SQL (MySQL) is a must.
2+ years of hands-on experience working with the Cloudera Hadoop Distribution platform and Apache Spark.
Strong understanding of the full development life cycle for backend database applications across RDBMS and distributed cloud platforms.
Experience as a database developer writing SQL queries and DDL/DML statements, managing databases, and writing stored procedures, triggers, and functions, plus knowledge of DB internals.
Knowledge of database administration, performance tuning, replication, backup, and data restoration.
Comprehensive knowledge of Hadoop architecture and HDFS, to design, develop, document, and architect Hadoop applications.
Working knowledge of SQL, NoSQL, data warehousing, and DBA tasks, along with MapReduce, Hive, Impala, Kafka, HBase, Pig, and Java.
Experience processing large amounts of structured and unstructured data, and extracting and transforming data from remote data stores such as relational databases or distributed file systems.
Working expertise with Apache Spark, Spark Streaming, Jupyter Notebook, and Python or Scala programming.
Excellent communication skills and the ability to tailor technical information for different audiences.
Excellent teamwork skills; ability to self-start, share insights, ask questions, and report progress.
Working knowledge of general database architectures, trends, and emerging technologies.
Familiarity with caching, partitioning, storage engines, query performance tuning, indexes, and distributed computing frameworks.
Working knowledge and understanding of data analytics and BI tools, such as Looker Studio, Power BI, or other BI tools, is a must.

Additional Desired Skills:
Exposure to advanced technology components, such as caching techniques, load balancers, distributed logging, distributed queries, queueing engines, containerization, HTML/CSS optimization, mobile app and web server optimization, and cloud services, is an added advantage.
Strong attention to detail on every line of code, every unit test, and every commit message.
Comfortable with rapid development cycles and tight schedules.
Experience with Linux, GitHub, and Jira is a plus.
Good experience with benchmarking, optimization, and CI/CD pipelines.
Experience with web paradigms such as REST, Responsive Web Design, Test-Driven Development (TDD), Dependency Injection, and unit testing frameworks such as JUnit.

Education: Bachelor's degree or higher in Computer Science, with relevant skills in mobile application and web development.
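The MapReduce model that Hadoop, Hive, and Pig implement can be shown end to end in a few lines of plain Python: a map phase emits (key, 1) pairs, a shuffle groups values by key, and a reduce phase sums them. A toy word-count sketch (the documents are illustrative; a real job would run over HDFS):

```python
from collections import defaultdict
from itertools import chain

# Toy MapReduce word count: map -> shuffle (group by key) -> reduce.
docs = ["big data etl", "data processing", "big data"]

# Map phase: emit (word, 1) for every word in every document.
mapped = chain.from_iterable(
    ((word, 1) for word in doc.split()) for doc in docs
)

# Shuffle phase: group all emitted values by key.
grouped = defaultdict(list)
for word, count in mapped:
    grouped[word].append(count)

# Reduce phase: sum the grouped counts per key.
counts = {word: sum(vals) for word, vals in grouped.items()}
print(counts["data"])  # 3
```

Hadoop distributes exactly these three phases across a cluster, with the shuffle happening over the network between mapper and reducer nodes.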
Posted 1 week ago