
123 SQL Optimization Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Total Experience: 3 to 4 years. Relevant Experience on Mandatory Skills: 3 years. The resource will work on PL/SQL and should have hands-on experience understanding and writing complex SQL queries and stored procedures, along with working experience on a SQL project. Mandatory Skills: PL/SQL. Good to Have Skills: Control-M or SSRS.

Posted 1 month ago

Apply

3.0 - 4.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Total Experience: 3 to 4 years. Relevant Experience on Mandatory Skills: 3 years. The resource will work on PL/SQL and should have hands-on experience understanding and writing complex SQL queries and stored procedures, along with working experience on a SQL project. Mandatory Skills: PL/SQL. Good to Have Skills: Control-M or SSRS.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a Database Performance & Data Modeling Specialist focused on optimizing schema structures, tuning SQL queries, and ensuring that data models are well prepared for high-volume, real-time systems. Your responsibilities include designing data models that balance performance, flexibility, and scalability; conducting performance benchmarking to identify bottlenecks and propose improvements; analyzing slow queries to recommend indexing, denormalization, or schema revisions; monitoring query plans, memory usage, and caching strategies for cloud databases; and collaborating with developers and analysts to optimize application-to-database workflows.

You must possess strong experience in database performance tuning, especially on GCP platforms like BigQuery, Cloud SQL, and AlloyDB. Proficiency in schema refactoring, partitioning, clustering, and sharding techniques is essential. Familiarity with profiling tools, slow-query logs, and GCP monitoring solutions is required, along with SQL optimization skills including query rewriting and execution-plan analysis. Preferred skills include a background in mutual fund or high-frequency financial data modeling, and hands-on experience with relational databases like PostgreSQL and MySQL, distributed caching, materialized views, and hybrid model structures.

Crucial soft skills for this role include a precision-driven, analytical mindset; clear communication with attention to detail; and strong problem-solving and troubleshooting abilities. In this role, you will have the opportunity to shape high-performance data systems from the ground up, play a critical part in system scalability and responsiveness, and work with high-volume data in a cloud-native enterprise setting.
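To make the slow-query and indexing work above concrete, here is a minimal, hypothetical sketch of execution-plan analysis. It uses Python with SQLite rather than the GCP engines named in the posting, and the trades table, fund column, and data are invented for illustration:

import sqlite3

# Hypothetical single-table example: how an index changes the plan
# the engine chooses for a selective filter.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, fund TEXT, nav REAL)")
conn.executemany("INSERT INTO trades (fund, nav) VALUES (?, ?)",
                 [(f"FUND{i % 100}", i * 0.5) for i in range(10_000)])

query = "SELECT avg(nav) FROM trades WHERE fund = ?"

# Before indexing: the planner falls back to a full table scan.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("FUND7",)):
    print("no index :", row)

conn.execute("CREATE INDEX idx_trades_fund ON trades (fund)")

# After indexing: the planner switches to an index search.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, ("FUND7",)):
    print("indexed  :", row)

conn.close()

The same read-the-plan-first habit carries over to BigQuery or Cloud SQL, only with their own EXPLAIN tooling.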

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

NTT DATA is looking for a .NET Lead Engineer to join their team in Pune, Maharashtra, India. In this role, you will lead the architecture definition, design, and development of web-based applications using .NET C# technology. You will work with a team to deliver high-quality software solutions for clients, focusing on creating fast, resilient, and scalable applications.

You will lead a team of 5-8 people, architect modern cloud-native and microservice applications, and build distributed services in Azure. You will be involved in creating component designs, producing technical documentation, and ensuring the quality delivery of enterprise solutions. Additionally, you will collaborate with product management, development teams, and internal IT departments to meet business requirements and drive innovation.

To be successful in this position, you should have at least 10 years of experience architecting .NET C# web-based applications, 5+ years leading .NET application architecture, and proficiency in web service development, reporting, and analytics. Experience with Visual Studio, ASP.NET, C#, and SQL Server is required, along with a strong understanding of Object-Oriented Design and Service-Oriented Architecture. The ideal candidate is a lifelong learner, a team player, and an effective communicator.

NTT DATA offers a supportive environment for professional growth and provides opportunities for skills development and career advancement. If you are passionate about technology and eager to contribute to a dynamic team, apply now and be a part of NTT DATA's innovative and forward-thinking organization.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Data Warehouse Engineer will be responsible for managing and optimizing data processes in an Azure environment using Snowflake. The ideal candidate should have solid SQL skills and a basic understanding of data modeling. Experience with CI/CD processes and Azure ADF is preferred, and expertise in ETL/ELT frameworks and ER/Studio would be a plus.

As a Senior Data Warehouse Engineer, in addition to the core requirements, you will oversee other engineers while remaining actively involved in data modeling and Snowflake SQL optimization. You will be responsible for conducting design reviews, code reviews, and deployment reviews with the engineering team. Familiarity with medallion architecture and experience in the healthcare or life sciences industry will be highly advantageous.

At Myridius, we are committed to transforming the way businesses operate by offering tailored solutions in AI, data analytics, digital engineering, and cloud innovation. With over 50 years of expertise, we aim to drive organizations through the rapidly evolving landscapes of technology and business. Our integration of cutting-edge technology with deep domain knowledge enables businesses to seize new opportunities, drive significant growth, and maintain a competitive edge in the global market. We go beyond typical service delivery to craft transformative outcomes that help businesses not just adapt, but thrive in a world of continuous change. Discover how Myridius can elevate your business to new heights of innovation by visiting us at www.myridius.com and start leading the change.

Posted 1 month ago

Apply

4.0 - 8.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Key Skills: PL/SQL, SQL Optimization, Data Migration, Stored Procedures, Data Integrity, Performance Tuning, Production Support.

Roles & Responsibilities: Develop, test, and maintain PL/SQL procedures, packages, functions, and triggers. Work closely with business analysts and end users to understand data requirements and deliver scalable solutions. Perform performance tuning and optimization of SQL queries. Participate in data migration and transformation activities. Create and manage complex stored procedures for reporting and integration purposes. Work with large data sets and ensure data quality and integrity. Collaborate with application developers to implement database changes and improvements. Handle production support and resolve database-related issues.

Experience Requirement: 4-8 years of experience in PL/SQL development with a strong understanding of databases. Experience in performance tuning and query optimization. Familiarity with data migration and transformation processes. Hands-on experience in creating stored procedures for reporting and integration. Knowledge of data quality and integrity management.

Education: M.B.A., B.Tech/M.Tech (Dual), MCA, B.E., B.Tech, M.Tech, B.Sc.
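As a rough illustration of the PL/SQL and data-migration duties listed above, here is a hedged sketch using the python-oracledb driver. The connection details, the sales and sales_archive tables, and the 12-month cutoff are all assumptions for the example, not part of the posting:

import oracledb  # python-oracledb thin driver

# Placeholder credentials and object names throughout.
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Anonymous PL/SQL block: a procedural wrapper around set-based SQL,
# with an OUT bind reporting how many rows were moved.
moved = cur.var(int)
cur.execute("""
    BEGIN
        INSERT INTO sales_archive (id, amount, sold_on)
        SELECT id, amount, sold_on
        FROM sales
        WHERE sold_on < ADD_MONTHS(SYSDATE, -12);
        :moved := SQL%ROWCOUNT;
        DELETE FROM sales WHERE sold_on < ADD_MONTHS(SYSDATE, -12);
    END;""", moved=moved)
conn.commit()
print("rows archived:", moved.getvalue())

Doing the move as set-based INSERT ... SELECT rather than a row-by-row loop is the usual performance-tuning choice here.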

Posted 1 month ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Overview

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation, while increasing awareness of available data and democratizing access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users, in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

The candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements.

Responsibilities

Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential. Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies, linked together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Implement best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications

8+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages such as Python, PySpark, and Scala. 2+ years of cloud data engineering experience in Azure; fluent with Azure cloud services (Azure certification is a plus). Experience in Azure Log Analytics. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake. Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment and CI tools. Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields. Candidates are expected to be in the office at the assigned location at least 3 days a week, with the days at work coordinated with the immediate supervisor.

Skills, Abilities, Knowledge

Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management. Proven track record of leading and mentoring data teams. Strong change manager, comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude, adjusting to different needs in an ever-changing environment. Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs. Fosters a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attains/exceeds individual and team goals. Ability to lead others without direct authority in a matrixed environment.
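Since the qualifications above pair SQL optimization with PySpark, a small hedged example may help. This is a generic sketch, not PepsiCo's actual pipeline; the parquet paths, table shapes, and region_id join key are invented:

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

orders = spark.read.parquet("/lake/orders")    # large fact table (assumed path)
regions = spark.read.parquet("/lake/regions")  # small dimension (assumed path)

# Broadcasting the small side avoids shuffling the large table,
# a common first lever in Spark SQL performance tuning.
joined = orders.join(broadcast(regions), "region_id")
joined.explain()  # inspect the physical plan: expect BroadcastHashJoin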

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Overview

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. The team maintains a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company; is responsible for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset; works cross-functionally across the enterprise to centralize and standardize data for use by business, data science, or other stakeholders; and increases awareness of available data while democratizing access to it across the company.

As a data engineer, you will be the key technical expert building PepsiCo's data products to drive a strong vision. You'll be empowered to create data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help develop very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users, in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities

Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies, linked together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Implement best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications

4+ years of overall technology experience, including at least 3+ years of hands-on software development, data engineering, and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 3+ years of experience in SQL optimization and performance tuning, and development experience in programming languages such as Python, PySpark, and Scala. 2+ years of cloud data engineering experience in Azure; fluent with Azure cloud services (Azure certification is a plus). Experience in Azure Log Analytics.

Posted 1 month ago

Apply

5.0 - 10.0 years

27 - 32 Lacs

Hyderabad

Work from Office

Overview

Job Title: Data Engineer (L10). PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Enterprise Data Operations (EDO) team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Enterprise Data Operations (EDO) does: maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company; handle day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset; work cross-functionally across the enterprise to centralize and standardize data for use by business, data science, or other stakeholders; and increase awareness of available data while democratizing access to it across the company.

Responsibilities

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users, in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. The candidate should have at least 5+ years of experience working on cloud platforms, including 2+ years of experience in Azure. S/he will have to front-end technical discussions with leads of different business sectors, and be on top of all support issues pertaining to t

Qualifications

Bachelor's in computer science engineering or a related field.

Skills, Abilities, Knowledge

Excellent communication skills, both verbal and written, and the ability to influence and demonstrate confidence in communications with senior-level management. Comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to coordinate effectively with the team. Positive and flexible attitude, adjusting to different needs in an ever-changing environment. Fosters a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attains/exceeds individual and team goals. Ability to learn quickly and adapt to new skills. Azure Fundamentals certification is preferred.

Nice-to-have experience: proficiency with Git and an understanding of DevOps pipelines; data quality frameworks using the Great Expectations suite; understanding of cloud networking (VNETs, RBACs, etc.); fair understanding of web applications.

Qualifications

14+ years of overall technology experience, including at least 8+ years of hands-on software development and data engineering. 8+ years of experience in SQL optimization and performance tuning, and development experience in programming languages such as Python, PySpark, and Scala. 2+ years of cloud data engineering experience in Azure (Azure certification is a plus). Experience with version control systems like GitHub and deployment and CI tools. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools is a plus. Experience working with large data sets and scaling applications like Kubernetes is a plus. Experience with statistical/ML techniques is a plus. Experience building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BE/BTech/MCA (regular) in Computer Science, Math, Physics, or other technical fields.

The candidate must have thorough knowledge of Spark, SQL, Python, Databricks, and Azure: Spark (joins, upserts, deletes, aggregates, repartitioning, optimizations, working with structured and unstructured data, framework design, etc.); SQL (joins, merge, aggregates, indexing, clustering, functions, stored procedures, optimizations, etc.); Python (functions, modules, classes, tuples, lists, dictionaries, error handling, multi-threading, etc.); Azure (Azure Data Factory, Service Bus, Log Analytics, Event Grid, Event Hub, Logic App, App Services, etc.); Databricks (clusters, pools, workflows, authorization, APIs, DBRs, AQE, optimizations, etc.).
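The Spark topics listed above (repartitioning, optimizations, AQE) can be illustrated with a short, hypothetical PySpark sketch. The configuration keys are standard Spark settings, while the /lake/events path and event_date column are invented for the example:

from pyspark.sql import SparkSession

# Two tuning levers named in the posting: Adaptive Query Execution
# and explicit repartitioning by the downstream grouping key.
spark = (SparkSession.builder
         .appName("aqe-sketch")
         .config("spark.sql.adaptive.enabled", "true")
         .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
         .getOrCreate())

events = spark.read.parquet("/lake/events")

# Repartition by the aggregation key so downstream stages are balanced;
# AQE can then coalesce small shuffle partitions at runtime.
daily = (events.repartition("event_date")
               .groupBy("event_date")
               .count())
daily.explain()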

Posted 1 month ago

Apply

6.0 - 11.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Overview

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. You will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users, in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities

Be an active contributor to code development in projects and services. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Implement best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries.

Qualifications

6+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages such as Python, PySpark, and Scala. 2+ years of cloud data engineering experience in Azure; fluent with Azure cloud services (Azure certification is a plus). Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake. Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment and CI tools. Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Warehouse Engineer at Myridius, you will need solid SQL skills and basic knowledge of data modeling. Your role will involve working with Snowflake in Azure and a CI/CD process using any tooling. Familiarity with Azure ADF and ETL/ELT frameworks would be beneficial, and experience with ER/Studio and a good understanding of the healthcare/life sciences industry would be advantageous. Knowledge of GxP processes is a plus.

For the Senior Data Warehouse Engineer position, you will oversee engineers while actively engaging in the same tasks. Your responsibilities will include conducting design reviews, code reviews, and deployment reviews with engineers. You should have expertise in solid data modeling, preferably using ER/Studio or an equivalent tool. Optimizing Snowflake SQL queries to enhance performance and familiarity with medallion architecture will be key aspects of this role.

At Myridius, we are dedicated to transforming the way businesses operate by offering tailored solutions in AI, data analytics, digital engineering, and cloud innovation. With over 50 years of expertise, we drive a new vision to propel organizations through rapidly evolving technology and business landscapes. Our commitment to exceeding expectations ensures measurable impact and fosters sustainable innovation. Together with our clients, we co-create solutions that anticipate future trends and help businesses thrive in a world of continuous change. If you are passionate about driving significant growth and maintaining a competitive edge in the global market, join Myridius in crafting transformative outcomes and elevating businesses to new heights of innovation. Visit www.myridius.com to learn more about how we lead the change.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You should have a B.Tech/B.E/MSc/MCA qualification along with a minimum of 10 years of experience. As a Software Architect - Cloud, you will architect and implement AI-driven Cloud/SaaS offerings. You will research and design new frameworks and features for various products, ensuring they meet high quality standards and are designed for scale, resiliency, and efficiency. Additionally, you will motivate and assist lead and senior developers in their professional and technical growth, contribute to academic outreach programs, and participate in company branding activities.

To qualify for this position, you must have experience designing and delivering widely used enterprise-class SaaS applications, preferably in marketing technologies. Essential requirements include knowledge of cloud computing infrastructure, AWS certification, and hands-on experience with scalable distributed systems, AI/ML technologies, big data technologies, in-memory databases, caching systems, ETL tools, containerization solutions like Kubernetes, large-scale RDBMS deployments, SQL optimization, Agile and Scrum development processes, Java, Spring technologies, Git, and DevOps practices.

Posted 1 month ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Sr Data Analyst

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII

At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

As a Sr Data Analyst for Target's Merch Data Analytics team you'll: support our world-class Merchandising leadership team at Target with critical data analysis that helps the Merch business team make profitable decisions; enable faster, smarter, and more scalable decision-making to compete and win in the modern retail market; collaborate with stakeholders and understand their priorities/roadmap to drive business strategies using data; interface with Target business representatives to validate business requirements/requests for analysis and present final analytical results; design, develop, and deliver analytical solutions resulting in decision support or models; gather required data and perform data analysis to support needs; communicate the impact of proposed solutions to business partners; evaluate processes and analyze and interpret statistical data; develop business acumen and cultivate client relationships; present results in a manner that business partners can understand; translate scientific methodology to business terms; document analytical methodologies used in the execution of analytical projects; participate in a knowledge-sharing system to support iterative model builds; adhere to corporate information protection standards; and keep up to date on industry trends, best practices, and emerging methodologies.

Requirements / About You:

Experience: overall 5-8 years, with 3-5 years relevant. Qualification: B.Tech/B.E. or a Master's in Statistics/Econometrics/Mathematics or equivalent.

1. Extensive exposure to Structured Query Language (SQL), SQL optimization, and DW/BI concepts.
2. Proven hands-on experience in a BI visualization tool (i.e. Tableau, Domo, MSTR10, Qlik) with the ability to learn additional vendor and proprietary visualization tools.
3. Strong knowledge of structured (i.e. Teradata, Oracle, Hive) and unstructured databases, including the Hadoop Distributed File System (HDFS). Exposure and extensive hands-on work with large data sets.
4. Hands-on experience in R, Python, Hive, or other open-source languages/databases.
5. Hands-on experience in advanced analytical techniques like regression, time-series models, and classification techniques, and a conceptual understanding of all the techniques mentioned above.
6. Git source code management and experience working in an agile environment.
7. Strong attention to detail, excellent diagnostic and problem-solving skills.
8. Highly self-motivated with a strong sense of urgency; able to work both independently and in team settings in a fast-paced environment; capable of managing urgent timelines.
9. Competent and curious; asks questions and learns to fill gaps; desire to teach and learn.
10. Excellent communication, service orientation, and strong relationship-building skills.
11. Experience with retail, merchandising, or marketing would be a strong add-on.

Useful links:
Life at Target - https://india.target.com/
Benefits - https://india.target.com/life-at-target/workplace/benefits
Culture - https://india.target.com/life-at-target/belonging
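For the regression technique named in point 5, a toy example may clarify the kind of analysis involved. This is a generic scikit-learn sketch on synthetic data, not Target's methodology; all variable names and coefficients are invented:

import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic merchandising-style data: weekly sales as a function of
# price and a promotion flag.
rng = np.random.default_rng(0)
price = rng.uniform(10, 20, 200)
promo = rng.integers(0, 2, 200)
sales = 500 - 12 * price + 80 * promo + rng.normal(0, 10, 200)

X = np.column_stack([price, promo])
model = LinearRegression().fit(X, sales)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2:", model.score(X, sales))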

Posted 1 month ago

Apply

9.0 - 14.0 years

4 - 8 Lacs

Chennai, Tamil Nadu, India

On-site

Minimum 9+ years of work experience as a Database Administrator in MSSQL DBA support, with experience in another RDBMS platform such as Azure, MySQL, Sybase, or Postgres (secondary skill). Sound knowledge of SQL Server architecture and concepts. Hands-on experience troubleshooting DB performance issues. Hands-on experience planning, setting up, configuring, and troubleshooting DB HA/DR technologies (Always On, failover clustering, log shipping, replication). Experience in DB installation, upgrade, migration, patching, and other maintenance/optimization tasks. Hands-on experience in database design, database backup and recovery procedures, access security and database integrity, physical data storage design, and data storage management. Experience setting up Azure SQL databases in PaaS/IaaS, migrating databases from on-prem to Azure, and configuring HA/DR solutions in Azure. Experience in database standardization and governance activities. Experience in scripting languages such as PowerShell, and automation tools like Ansible, is required. Experience in SSIS and SSRS configuration, migration, and troubleshooting. Experience handling external audits for databases and working with auditors on data validation and fixes. Experience in NoSQL databases is a plus. Experience in database containerization is a plus. Fix database security vulnerabilities and work closely with the Information/Cyber Security team. Experience with ITIL processes such as Incident, Service Request, Change, and Problem management. Must be willing to work on other RDBMSs such as MySQL and Postgres; required training and knowledge transfer will be provided.
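As a hedged sketch of the day-to-day performance troubleshooting this role describes, the Python snippet below queries SQL Server's sys.dm_exec_requests DMV for blocked sessions via pyodbc. The server name and connection string are placeholders, and the snippet assumes the Microsoft ODBC driver is installed:

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dbhost;DATABASE=master;"
    "Trusted_Connection=yes;")

# Requests currently waiting on another session: the usual starting
# point when diagnosing blocking chains.
rows = conn.execute("""
    SELECT session_id, blocking_session_id, wait_type, wait_time
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;""").fetchall()
for r in rows:
    print(f"session {r.session_id} blocked by {r.blocking_session_id} "
          f"({r.wait_type}, {r.wait_time} ms)")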

Posted 1 month ago

Apply

7.0 - 17.0 years

11 - 27 Lacs

Mumbai City, Maharashtra, India

On-site

Description

We are seeking a Performance Optimization Specialist to join our team in India. The ideal candidate will have extensive experience in analyzing and improving system performance, ensuring that our applications run efficiently and effectively.

Responsibilities

Analyze system performance metrics and identify areas for improvement. Develop and implement optimization strategies for applications and systems. Collaborate with development teams to ensure performance best practices are integrated into the software development lifecycle. Conduct performance testing and benchmarking to validate improvements. Monitor ongoing performance and make recommendations for future enhancements.

Skills and Qualifications

7-17 years of experience in performance optimization or a related field. Strong knowledge of performance tuning tools and methodologies. Proficiency in programming languages such as Java, C#, or Python. Experience with database performance optimization, including SQL tuning. Familiarity with cloud services and architecture, particularly AWS or Azure. Ability to analyze and interpret complex data sets. Strong problem-solving skills and attention to detail.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking an experienced Power BI Developer to join our team. The role involves creating insightful and interactive reports and dashboards using Power BI, optimizing SQL queries, and troubleshooting data-related issues. The ideal candidate will have hands-on experience with complex DAX queries, various data sources, and different reporting modes (Import, Direct Query). Responsibilities include working with PostgreSQL, MS SQL, developing robust SQL code, writing stored procedures, and ensuring high-quality data modeling. Candidates should have expertise in SQL optimization, performance tuning, and working with Common Table Expressions (CTEs) and complex joins. The role requires proficiency in designing engaging visual reports, applying themes/templates, and keeping up with the latest Power BI features and best practices.
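Since this posting emphasizes CTEs and complex joins, here is a minimal, runnable illustration of the pattern in Python with SQLite; the sales table and the share calculation are invented for demonstration, and the same SQL shape applies to the PostgreSQL or MS SQL sources named above:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 30), ('south', 5);
""")

# CTE computes per-region totals once; the outer query joins back to
# express each row as a share of its region's total.
sql = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT s.region, s.amount, s.amount * 1.0 / t.total AS share
FROM sales AS s
JOIN region_totals AS t ON t.region = s.region
ORDER BY s.region, share DESC;
"""
for row in conn.execute(sql):
    print(row)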

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Job opportunity with BCT Consulting Pvt Ltd for the role of Business Intelligence Engineer (client: Amazon), Bangalore, work from office.

Disclaimer: Client company: Amazon. Third-party payroll: BCT Consulting Pvt Ltd (https://www.bct-consulting.com/).

About BCT Consulting: BCT Consulting is a preferred digital transformation services partner for many industry-leading organizations across the globe, delivering dynamic, reliable, and cost-effective digital and enterprise services that accelerate significant growth in revenues, profitability, and bottom lines. An award-winning digital and enterprise services provider, we create value for enterprises leveraging powerful technologies and deep domain expertise. Our strong delivery and innovation engine serves customers across multiple verticals: BFSI, Government, Telecom, Energy, Oil & Gas, Retail & Manufacturing. With a growing employee base of 700+, we have the skills and expertise to scale up for large-scale specialized projects across multiple domains, augmenting your operational efficiency and helping you acquire talent without increasing manpower turnover. BCT Consulting is a group company of the Bahwan CyberTek Group, a global provider of digital transformation solutions in the areas of predictive analytics, digital experience, and digital supply chain across 20 countries. BCT Consulting was a winner in the 2019 Deloitte Technology Fast 50 Awards for the fastest-growing tech companies in India, based on percentage revenue growth over three years.

Job Title: Business Intelligence Engineer, Data Engineering & Cloud Migration. Total years of experience: 3-5. Location: Bangalore, work from office (Monday to Friday, normal shift).

Role Summary: As a Business Intelligence Engineer, you'll bridge the gap between data engineering and business analytics. You'll optimize SQL queries, drive data migrations, and architect scalable AWS solutions to deliver performant, reliable analytics and BI insights to stakeholders.

Key Responsibilities: SQL query optimization and data warehousing; data migration and engineering; AWS cloud; BI reporting and insights enablement.

Basic Qualifications: Bachelor's degree in CS, Engineering, Info Systems, or a related field. 3-5 years in BI/data engineering roles involving large-scale schemas, complex queries, and ETL development. Expert-level SQL skills and hands-on experience optimizing performance (indexes, partitioning, query refactors). Proven record of successful database/data warehouse migrations, especially to AWS (Redshift, RDS, Aurora, S3). Strong AWS proficiency: Redshift, Glue, DMS/DataSync, S3, EC2/Lambda, IAM, CloudFormation/Terraform. Skilled in scripting (Python, Shell, or PowerShell) to automate workflows. Comfortable collaborating across technical and business stakeholders, translating needs into scalable data solutions.

Preferred Qualifications: Experience with streaming frameworks (Kinesis, Kafka) or big data (Spark, Hadoop). Prior experience with BI/dashboard tools like Tableau, QuickSight, Power BI. Strong expertise in cloud cost optimization, monitoring, and security best practices.
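One plausible shape for the S3-staged Redshift migration step this role describes, sketched in Python. Every bucket, cluster, table, and IAM role name here is a placeholder, and the COPY options are only one reasonable choice, not a prescribed setup:

import boto3
import psycopg2

# Stage a local extract in S3 (placeholder bucket and key).
boto3.client("s3").upload_file("orders.parquet", "my-staging-bucket",
                               "stage/orders.parquet")

# Load the staged file into Redshift with COPY, which parallelizes
# the ingest across the cluster.
conn = psycopg2.connect(host="my-cluster.example.redshift.amazonaws.com",
                        port=5439, dbname="dw", user="loader",
                        password="secret")
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY analytics.orders
        FROM 's3://my-staging-bucket/stage/orders.parquet'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
        FORMAT AS PARQUET;""")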

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 6 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities: Data pipeline development: design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools. SQL optimization: write and optimize complex SQL queries to ensure high performance and efficiency. Data integration: integrate data from various sources, ensuring consistency, accuracy, and reliability. Database management: manage and maintain SQL Server and PostgreSQL databases. ETL processes: develop and manage ETL processes to support data warehousing and analytics. Collaboration: work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions. Documentation: maintain comprehensive documentation of data models, data flows, and ETL processes. Troubleshooting: identify and resolve data-related issues and discrepancies. Python scripting: utilize Python for data manipulation, automation, and integration tasks.

Technical Skills: Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory. Strong SQL skills with the ability to write and optimize complex queries. Knowledge of Python for data manipulation and automation. Knowledge of data governance frameworks and best practices.

Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Positive attitude and ability to work well in a team environment.

Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
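As a hedged sketch of the Snowflake side of this pipeline work, the snippet below uses the snowflake-connector-python package to run an incremental MERGE; the credentials, warehouse, and table names are all invented for illustration:

import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="secret",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING")
cur = conn.cursor()

# Incremental-style load: merge staged rows into the target table so
# reruns are idempotent.
cur.execute("""
    MERGE INTO customers AS t
    USING customers_stage AS s ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN INSERT (id, email) VALUES (s.id, s.email);""")
cur.close()
conn.close()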

Posted 1 month ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Develop and maintain web applications using PHP and Laravel. Collaborate with cross-functional teams to define, design, and ship new features. Ensure the performance, quality, and responsiveness of applications.

Requirements: Proven experience as a PHP Developer with a focus on Laravel. Strong understanding of PHP web application development. Solid experience with the Laravel framework. Proficiency in database design and SQL optimization. Experience with front-end technologies (HTML, CSS, JavaScript). Understanding of version control systems, preferably Git. Knowledge of RESTful API development. Familiarity with agile development methodologies. Excellent problem-solving and communication skills. Ability to work both independently and collaboratively in a team environment. Bachelor's degree in Computer Science, Engineering, or a related field (preferred).

Roles and Responsibilities: Write clean, maintainable, and efficient code. Troubleshoot and debug applications. Participate in code reviews and provide constructive feedback. Stay updated with the latest industry trends and technologies. Contribute to all phases of the development lifecycle.

Posted 1 month ago

Apply

10.0 - 12.0 years

18 - 25 Lacs

Hyderabad

Work from Office

Responsibilities:

1. Performance tuning: identify and resolve performance issues using Oracle diagnostic tools such as Oracle Enterprise Manager and SQL Trace.
2. Database administration: perform database administration tasks, including backups, recovery, and maintenance.
3. SQL optimization: optimize SQL queries and indexing strategies to improve database performance.
4. Database design: collaborate with development teams to design and implement database architectures that meet performance and scalability requirements.
5. Capacity planning: monitor and analyze database capacity, including disk space, memory, and CPU usage.
6. Troubleshooting: troubleshoot and resolve database issues, including performance problems, errors, and data corruption.
7. Documentation: maintain technical documentation, including database design documents, performance tuning guides, and troubleshooting procedures.

Requirements:

1. Oracle certification: Oracle Certified Professional (OCP), Oracle Certified Master (OCM), or another certification (any one is mandatory).
2. Oracle experience: 10+ years of experience as an Oracle DBA, with a focus on performance tuning.
3. Performance tuning: strong knowledge of Oracle performance tuning, including SQL optimization, indexing, and caching.
4. Database administration: experience with Oracle database administration, including backups, recovery, and maintenance.
5. Scripting and programming: proficiency in SQL, PL/SQL, and Python.
6. Communication: excellent communication and interpersonal skills, with the ability to work effectively with technical and non-technical stakeholders.
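To illustrate the SQL optimization duty above, here is a minimal Python sketch that captures an optimizer plan with EXPLAIN PLAN and DBMS_XPLAN via the python-oracledb driver; the connection details and the orders table are hypothetical:

import oracledb

conn = oracledb.connect(user="dba", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Ask the optimizer to record its plan for a problem query,
# then read it back with DBMS_XPLAN.
cur.execute("EXPLAIN PLAN FOR SELECT * FROM orders WHERE customer_id = 42")
for (line,) in cur.execute(
        "SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)

A full-table scan here, on a selective predicate, would point toward the indexing or caching remedies the posting mentions.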

Posted 1 month ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII

At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

As a Sr Data Analyst for Target's Merch Data Analytics team you'll: support our world-class Merchandising leadership team at Target with critical data analysis that helps the Merch business team make profitable decisions; enable faster, smarter, and more scalable decision-making to compete and win in the modern retail market; collaborate with stakeholders and understand their priorities/roadmap to drive business strategies using data; interface with Target business representatives to validate business requirements/requests for analysis and present final analytical results; design, develop, and deliver analytical solutions resulting in decision support or models; gather required data and perform data analysis to support needs; communicate the impact of proposed solutions to business partners; evaluate processes and analyze and interpret statistical data; develop business acumen and cultivate client relationships; present results in a manner that business partners can understand; translate scientific methodology to business terms; document analytical methodologies used in the execution of analytical projects; participate in a knowledge-sharing system to support iterative model builds; adhere to corporate information protection standards; and keep up to date on industry trends, best practices, and emerging methodologies.

About You:

Experience: overall 5-8 years, with 3-5 years relevant. Qualification: B.Tech/B.E. or a Master's in Statistics/Econometrics/Mathematics or equivalent.

1. Extensive exposure to Structured Query Language (SQL), SQL optimization, and DW/BI concepts.
2. Proven hands-on experience in a BI visualization tool (i.e. Tableau, Domo, MSTR10, Qlik) with the ability to learn additional vendor and proprietary visualization tools.
3. Strong knowledge of structured (i.e. Teradata, Oracle, Hive) and unstructured databases, including the Hadoop Distributed File System (HDFS). Exposure and extensive hands-on work with large data sets.
4. Hands-on experience in R, Python, Hive, or other open-source languages/databases.
5. Hands-on experience in advanced analytical techniques like regression, time-series models, and classification techniques, and a conceptual understanding of all the techniques mentioned above.
6. Git source code management and experience working in an agile environment.
7. Strong attention to detail, excellent diagnostic and problem-solving skills.
8. Highly self-motivated with a strong sense of urgency; able to work both independently and in team settings in a fast-paced environment; capable of managing urgent timelines.
9. Competent and curious; asks questions and learns to fill gaps; desire to teach and learn.
10. Excellent communication, service orientation, and strong relationship-building skills.
11. Experience with retail, merchandising, or marketing would be a strong add-on.

Posted 1 month ago

Apply

2.0 - 7.0 years

0 - 3 Lacs

Kochi

Remote

Hiring Remote Perl Developers with 5+ years of experience. Strong in Perl, Mason, PostgreSQL, Docker, CI/CD, and debugging. Work on backend systems, automation, and code modernization. Bonus if experienced with GenAI tools.

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 13 Lacs

Bengaluru

Hybrid

Hi all, we are looking for an ETL Developer.

Experience: 3-6 years. Notice period: immediate to 15 days. Location: Bengaluru.

Core Technical Expertise
Data warehousing & migration: deep expertise in ETL tools like Informatica PowerCenter, relational databases, data modeling, data cleansing, SQL optimization, and performance tuning.
Programming & scripting: strong SQL programming skills, shell scripting (Unix), debugging, and handling large datasets.
Toolset: experience with JIRA, Confluence, and Git; working knowledge of scheduling tools and integration of multiple data sources.
Bonus skills: familiarity with Talend Enterprise and Azure/cloud/big data technologies.

Posted 1 month ago

Apply

4.0 - 7.0 years

13 - 17 Lacs

Pune

Work from Office

Join us as a MSSQL Database-Site Reliability Engineering at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality and governance standards You'll spearhead the evolution of our digital landscape, driving innovation and excellence You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences To be successful as a MSSQL Database-Site Reliability Engineering you should have experience with: Willingness to learn and investigate new products Experience with setup/configuration/support of SQL Server 2014,2016, 2017, 2019, 2022 Proven track record of implementing and leading SRE practices across large organizations or complex teams Expert-level knowledge of telemetry, monitoring, and platform observability tooling (e g , ESAAS etc) with experience in customizing and scaling these solutions Experience in implementing modern DevOps and SRE practices in enterprise environments Proven expertise in SQL query optimization and database performance tuning at scale Experience with DevOps automation tools such as Code versioning (git), JIRA, Ansible, Containers and Kubernetes , database CI/CD tools and their implementation Hands-on as DevOps on Ansible, Python, T SQL coding Experience configuring and all components of the MS SQL set Core DB engine and SSRS, SSAS, SSIS, Replication topologies, AlwaysOn, Service Broker, Log Shipping, Database Snapshots etc and Windows Clustering Must have extensive experience as Production support SQL Server DBA strong experience in large, high volume and high criticality environments Database maintenance and troubleshooting Must exhibit in-depth expertise in solving database contention problems (dead locks, blocking etc) Demonstrate extensive expertise in performance tuning including both query and server optimization Expert knowledge of backup and recovery Good knowledge on PowerShell scripting The resource would be handling major escalated incidents, complex problem record This includes periodic review of systems, capacity management, on boarding new services, patching coordination & maintain good health of these Take responsibility for completing tasks, focusing on requirements and planning to meet client needs, which the job holder helps to identify, advise and recommend technical solutions based on experience and Industry knowledge Driving Automations agenda Identifying the possibility of automation in database area and work to deliver it Some Other Highly Valued Skills May Include Desirable experience on database automation will be preferred Strong skills in designing and delivering platforms to hosts SQL server, covering HA and DR, at Enterprise scale Experience with tape backup tools like TSM/TDP/DDBoost/BoostFs/Rubrik etc Knowledge of the ITIL framework, vocabulary and best practices Understanding of Cloud Based Computing particularly RDS\Azure Expert in system configuration management tools such as Chef, Ansible for database server configurations Expert expertise with scripting languages(e g PowerShell,Python) for automation/migration tasks You may be assessed on the key critical skills relevant for success in role, such as risk and controls, change and transformation, business acumen strategic thinking and digital and technology, as well as job-specific technical skills This role is based in Pune Purpose of the role To apply software engineering techniques, automation, and best practices in incident response, to ensure the 
Accountabilities

Availability, performance, and scalability of systems and services through proactive monitoring, maintenance, and capacity planning.
Resolution, analysis and response to system outages and disruptions, implementing measures to prevent similar incidents from recurring.
Development of tools and scripts to automate operational processes, reducing manual workload, increasing efficiency, and improving system resilience.
Monitoring and optimisation of system performance and resource usage, identifying and addressing bottlenecks, and implementing best practices for performance tuning.
Collaboration with development teams to integrate best practices for reliability, scalability, and performance into the software development lifecycle, working closely with other teams to ensure smooth and efficient operations.
Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations

To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to those objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L, Listen and be authentic; E, Energise and inspire; A, Align across the enterprise; D, Develop others. For an individual contributor, they will instead lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, and with business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information (such as procedures and practices in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship (our moral compass, helping us do what we believe is right). They will also be expected to demonstrate the Barclays Mindset of Empower, Challenge and Drive (the operating manual for how we behave).
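For illustration, the contention-diagnosis skill this posting calls out (deadlocks, blocking) often starts with the standard SQL Server blocking DMVs. The sketch below is a minimal, hypothetical example of surfacing blocked sessions with Python and pyodbc; the connection string and function name are placeholders, not part of the posting, and the querying login is assumed to hold VIEW SERVER STATE.

```python
# Minimal sketch: list sessions currently blocked in SQL Server, using
# the sys.dm_exec_requests DMV. Requires VIEW SERVER STATE permission.
import pyodbc

BLOCKING_QUERY = """
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;  -- only sessions that are being blocked
"""

def report_blocking(conn_str: str) -> None:
    """Print each blocked session and the session blocking it."""
    with pyodbc.connect(conn_str) as conn:
        for row in conn.cursor().execute(BLOCKING_QUERY):
            print(f"session {row.session_id} blocked by {row.blocking_session_id} "
                  f"({row.wait_type}, {row.wait_time} ms): {row.sql_text[:80]}")

if __name__ == "__main__":
    # Placeholder connection string: substitute a real server and credentials.
    report_blocking("DRIVER={ODBC Driver 18 for SQL Server};"
                    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;")
```

In an SRE context such as the one described above, the same query would more likely feed the telemetry and observability tooling the posting mentions than print to stdout.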

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Pune

Work from Office

Join us as an Oracle Associate Service Engineer II at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with Oracle DBA, PostgreSQL or MS-SQL, SRE, ESAAS, DevOps, Kubernetes and CI/CD tools, as well as job-specific skillsets.

To be successful as an Oracle Associate Service Engineer II, you should have experience with:

Basic/essential qualifications:

Experience as a senior-level database administrator, with a focus on Oracle and similar database technologies such as PostgreSQL or MS-SQL.
A proven track record of implementing and leading SRE practices across large organisations or complex teams.
Expert-level knowledge of telemetry, monitoring, and platform observability tooling (e.g., ESAAS, Grafana, Prometheus), with experience in customising and scaling these solutions.
Experience implementing modern DevOps and SRE practices in enterprise environments.
Proven expertise in SQL query optimization and database performance tuning at scale.
Experience with DevOps automation tools such as code versioning (git), JIRA, Ansible, containers and Kubernetes (OpenShift), and database CI/CD tools and their implementation.
Recognition as an Oracle subject matter expert, serving as the technical escalation point for complex database issues.
Proven experience executing database migration strategies at enterprise scale.
Exceptional collaboration skills with executives and cross-functional teams.
Extensive experience working with large, distributed, high-performance, high-availability databases with 24/7 uptime.
Expert knowledge of database administration on Oracle, across multiple versions (11g, 12c, 19c).
Expert knowledge of RMAN, Data Guard, database performance tuning and OEM, plus third-party products such as Imperva.
The ability to understand and meet the scale, capacity, security and performance attributes and requirements of the service and technology stack.
Responsibility for debugging code when a bug is reported in the Oracle product, and for fixing infrastructure and database-related issues.
Hands-on DevOps work with Ansible, Python and PL/SQL coding.
Focus on resilience, scaling, reliability, uptime, and robustness.
Strong technical skills in areas such as Oracle system administration, networking, cloud infrastructure, automation, and monitoring tools.
Expert knowledge of Real Application Clusters with a Maximum Availability setup.
SME-level expertise in solving database contention and performance tuning problems (slow SQL response, slow database response, etc.).
Database, RAC, OEM and OID installation, configuration and integration.
Database upgrade, migration with ZDLR, and patching.
Expertise in GoldenGate configuration and troubleshooting.
Database security policies and fine-grained auditing.
Extensive experience as a production-support Oracle database administrator, with strong experience handling large, high-volume and high-criticality environments.
Extensive expertise in performance tuning, covering both query and database optimization.
Knowledge of the ITIL framework, vocabulary and best practices.
Excellent spoken and written English communication skills.

Desirable skillsets/good to have:

Working experience in a financial environment.
Cloud experience (AWS, Azure, GCP).
Oracle Certified Performance Tuning Expert (Oracle 19c, 12c or 11g).
Oracle Certified Maximum Availability Expert (Oracle 19c, 12c or 11g).
Experience of working in a highly regulated environment and with the associated processes and tools.
Expertise in system configuration management tools such as Chef and Ansible for database server configurations.
Expertise with scripting languages (e.g., PowerShell, Python, Bash) for automation and migration tasks.
AWS Certified Database Specialty certification is a plus.

This role will be based out of Pune.

Purpose of the role

To apply software engineering techniques, automation, and best practices in incident response, to ensure the reliability, availability, and scalability of the systems, platforms, and technology through them.

Accountabilities

Availability, performance, and scalability of systems and services through proactive monitoring, maintenance, and capacity planning.
Resolution, analysis and response to system outages and disruptions, implementing measures to prevent similar incidents from recurring.
Development of tools and scripts to automate operational processes, reducing manual workload, increasing efficiency, and improving system resilience.
Monitoring and optimisation of system performance and resource usage, identifying and addressing bottlenecks, and implementing best practices for performance tuning.
Collaboration with development teams to integrate best practices for reliability, scalability, and performance into the software development lifecycle, working closely with other teams to ensure smooth and efficient operations.
Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations

To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise, along with a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L, Listen and be authentic; E, Energise and inspire; A, Align across the enterprise; D, Develop others. For an individual contributor, they instead develop technical expertise in their work area, acting as an advisor where appropriate, and will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulation and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship (our moral compass, helping us do what we believe is right). They will also be expected to demonstrate the Barclays Mindset of Empower, Challenge and Drive (the operating manual for how we behave).
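For illustration, the slow-SQL triage this posting describes commonly begins by ranking statements in the V$SQL dictionary view by cumulative elapsed time. The sketch below is a minimal, hypothetical example using Python and the python-oracledb driver; the DSN, credentials and function name are placeholders, and the querying user is assumed to have dictionary-view privileges (e.g., SELECT_CATALOG_ROLE).

```python
# Minimal sketch: rank the top SQL statements by cumulative elapsed time
# from V$SQL. elapsed_time is reported in microseconds.
import oracledb

TOP_SQL = """
SELECT *
FROM (
    SELECT sql_id,
           executions,
           ROUND(elapsed_time / 1000000, 1) AS elapsed_s,
           SUBSTR(sql_text, 1, 80)          AS sql_snippet
    FROM v$sql
    WHERE executions > 0
    ORDER BY elapsed_time DESC
)
WHERE ROWNUM <= :top_n
"""

def top_sql(dsn: str, user: str, password: str, top_n: int = 10) -> None:
    """Print the top-N statements by cumulative elapsed time."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            for sql_id, execs, elapsed_s, snippet in cur.execute(TOP_SQL, top_n=top_n):
                print(f"{sql_id}: {elapsed_s}s over {execs} execs: {snippet}")

if __name__ == "__main__":
    # Placeholder DSN and credentials: substitute real connection details.
    top_sql("localhost/orclpdb1", "perf_viewer", "change_me")
```

From there, a DBA would typically pull the execution plan for an offending sql_id (for example via DBMS_XPLAN) before deciding on indexing, rewriting, or other tuning actions.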

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies