
707 Query Optimization Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

7 - 9 Lacs

Noida, Hyderabad

Work from Office

Location: Hyderabad / Noida / Gurugram (onsite preferred)
Type: Contract position (6-12 months)
Experience: 9+ years
Apply: Please share your resume with current CTC, expected CTC, location, and notice period to [your-email@example.com]

We are hiring a Senior Java Full Stack Developer for a contract role at one of our client locations. The ideal candidate should have 9+ years of strong hands-on experience in Java-based software development, front-end technologies, and cloud platforms.

Key Responsibilities:
- Design and develop modern full stack applications using Java, Spring Boot, and ReactJS
- Build and maintain single-page web applications using HTML5, CSS3, Bootstrap, and JavaScript
- Develop and integrate RESTful APIs and microservices
- Work on Node.js backend services and support UI integration
- Architect, develop, and deploy applications in cloud environments such as AWS or Azure
- Work with databases: RDBMS (PostgreSQL, SQL Server) and NoSQL (MongoDB, Elasticsearch)
- Ensure security using OAuth 2.0, OpenID Connect, or similar frameworks
- Apply best practices for clean, maintainable, and efficient code
- Conduct code reviews, testing, and optimization for performance and scalability

Required Skills:
- Strong proficiency in Java, Spring, and Spring Boot
- Hands-on experience with ReactJS, Node.js, and frontend architecture
- Cloud deployment experience on AWS, Azure, or other platforms
- Experience with REST API development and integration
- Experience with OAuth 2.0, OpenID Connect, and security implementation
- Strong understanding of SQL, query optimization, and NoSQL
- Good problem-solving and debugging skills
- Experience with automated testing frameworks such as Jest and Mocha

To Apply, please share:
- Full Name
- Total Experience / Relevant Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Work Location (Hyderabad / Noida / Gurugram)
- Notice Period / Availability

Send to: [your-email@example.com] (replace with the actual contact)

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 15 Lacs

Hyderabad

Work from Office

Job Title: Database Developer
Location: Madhapur
Industry: IT Services & Consulting
Department: Engineering - Software & QA
Employment Type: Full-Time
Role Category: DBA / Data Warehousing

Job Description:
We are looking for skilled Database Developers to join our team. In this role, you will work closely with our client to enhance their product and provide essential post-go-live support for users across the US, Bangkok, the Philippines, Shanghai, and Penang. If you are passionate about database development and eager to tackle complex challenges, we invite you to apply!

Key Responsibilities:
- Develop and implement product enhancements.
- Provide post-go-live production support, troubleshooting issues as they arise.
- Write and optimize complex SQL queries using advanced SQL functions.
- Perform query performance tuning, optimization, and debugging.
- Design and maintain database triggers, indexes, and views.
- Manage and understand complex data organization within RDBMS environments.

Required Candidate Profile:
- Database Experience: Proficiency in Oracle, MySQL, or MS SQL Server.
- Stored Procedures Expertise: Strong background in stored procedures, including writing and debugging complex queries.
- Query Optimization: Proven expertise in query performance tuning and optimization.
- Database Design: Competency in writing triggers and creating indexes and views.
- Industry Experience: Experience in the manufacturing domain is a significant advantage.

Educational Requirements:
- Undergraduate Degree: Any Graduate
- Postgraduate Degree: Other Post Graduate - Other Specialization
- Doctorate: Other Doctorate - Other Specialization

Key Skills: Query Optimization, MySQL, SQL Queries, PL/SQL, Data Warehousing, Performance Tuning, Oracle

Role: Database Developer / Engineer

If you are a proactive, detail-oriented database professional with a knack for problem-solving and performance tuning, we would love to hear from you. Apply now to join our dynamic team and make a meaningful impact!
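
The tuning loop this role centers on (inspect the plan, add a matching index, re-check) can be sketched in a few lines. The snippet below uses Python's built-in sqlite3 module and a hypothetical orders table purely for illustration; the role's actual targets (Oracle, MySQL, MS SQL Server) expose the same idea through their own execution-plan tools.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical orders table standing in for a real manufacturing dataset.
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, "
    "status TEXT, total REAL)"
)
cur.executemany(
    "INSERT INTO orders (customer_id, status, total) VALUES (?, ?, ?)",
    [(i % 500, "OPEN" if i % 3 else "CLOSED", i * 1.5) for i in range(10_000)],
)

query = ("SELECT COUNT(*), SUM(total) FROM orders "
         "WHERE customer_id = ? AND status = 'OPEN'")

# Before indexing: the planner can only do a full table scan.
for row in cur.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("before:", row)

# A composite index matching the WHERE clause lets the planner seek instead.
cur.execute("CREATE INDEX idx_orders_cust_status ON orders (customer_id, status)")
for row in cur.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print("after:", row)
```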

Posted 1 month ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Mohali

Work from Office

Senior SQL Cloud Database Administrator (DBA) | CS Soft Solutions

Role and Responsibilities:
- Manage, optimize, and secure our cloud-based SQL databases, ensuring high availability and performance.
- Design and implement scalable and secure SQL database structures in AWS and GCP environments.
- Plan and execute data migration from on-premises or legacy systems to AWS and GCP cloud platforms.
- Monitor database performance, identify bottlenecks, and fine-tune queries and indexes for optimal efficiency.
- Implement and manage database security protocols, including encryption, access controls, and compliance with regulations.
- Develop and maintain robust backup and recovery strategies to ensure data integrity and availability.
- Perform regular maintenance tasks such as patching, updates, and troubleshooting database issues.
- Work closely with developers, DevOps, and data engineers to support application development and deployment.
- Ensure data quality, consistency, and governance across distributed systems.
- Keep up with emerging technologies, cloud services, and best practices in database management.

Required Skills:
- Proven experience as a SQL Database Administrator with expertise in AWS and GCP cloud platforms.
- Strong knowledge of SQL database design, implementation, and optimization.
- Experience with data migration to cloud environments.
- Proficiency in performance monitoring and query optimization.
- Knowledge of database security protocols and compliance regulations.
- Familiarity with backup and disaster recovery strategies.
- Excellent troubleshooting and problem-solving skills.
- Strong collaboration and communication skills.
- Knowledge of DevOps integration.

Posted 1 month ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Requires proficiency in data integration techniques, ETL processes, and data pipeline architectures, and a candidate well versed in data quality rules, principles, and implementation.

Key Result Areas and Activities:
- Data Pipeline Development: Design and implement robust batch and streaming data pipelines using Azure Databricks and Spark.
- Data Architecture Implementation: Apply Medallion Architecture to structure data layers (raw, enriched, curated); see the sketch after this listing.
- Data Quality & Governance: Ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog.
- Performance Optimization: Optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness.
- Collaboration & Delivery: Work closely with analysts, architects, and business teams to deliver end-to-end data solutions.

Technical Experience:
Must Have:
- Hands-on experience with Azure Databricks, Delta Lake, and Data Factory.
- Proficiency in Python, PySpark, and SQL with strong query optimization skills.
- Deep understanding of Lakehouse architecture and Medallion design patterns.
- Experience building scalable ETL/ELT pipelines and data transformations.
- Familiarity with Git, CI/CD pipelines, and Agile methodologies.

Good To Have:
- Knowledge of data quality frameworks and monitoring practices.
- Experience with Power BI or other data visualization tools.
- Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs.
- Awareness of emerging technologies such as Knowledge Graphs.

Qualifications:
- Education: A degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Experience: Proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake); experience building scalable ETL/ELT pipelines; familiarity with data governance and DevOps practices.

Qualities:
- Strong problem-solving and analytical skills
- Attention to detail and commitment to data quality
- Collaborative mindset and effective communication
- Proactive and self-driven
- Passion for learning and staying updated with emerging data technologies
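
As a concrete illustration of the Medallion flow named above, here is a minimal raw-to-enriched (bronze-to-silver) step in PySpark. The paths, column names, and voyage schema are invented for the example, and the Delta write assumes a Databricks or Delta-enabled cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze layer: raw ingest, schema-on-read (path and columns are illustrative).
bronze = spark.read.json("/mnt/raw/voyages/")

# Silver layer: enforce types, drop bad records, deduplicate on the business key.
silver = (
    bronze
    .withColumn("etd", F.to_timestamp("etd"))
    .withColumn("departure_date", F.to_date("etd"))
    .filter(F.col("vessel_id").isNotNull())
    .dropDuplicates(["voyage_id"])
)

# Delta write, partitioned so downstream queries can prune files they don't need.
(silver.write.format("delta")
       .mode("overwrite")
       .partitionBy("departure_date")
       .save("/mnt/silver/voyages/"))
```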

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

JD:
- 5+ years of experience in software engineering
- Strong proficiency in SQL, with a deep understanding of query optimization and performance tuning
- Experience implementing automated SQL code review using AI/ML techniques to identify performance bottlenecks and suggest query optimizations
- Experience working with GCP services
- Solid hands-on experience with Python for scripting
- Experience with automation using GitHub Actions
- Hands-on experience in designing, developing, and deploying microservices
- Experience building APIs in FastAPI/Flask for data services and system integration
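
Since the role calls for FastAPI/Flask data-service APIs, a minimal FastAPI endpoint sketch follows. The slow-query resource, field names, and in-memory store are invented for illustration; a real service would back this with a GCP-hosted database.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class SlowQuery(BaseModel):
    sql: str
    avg_ms: float

# In-memory stand-in for a real query-metrics store; illustration only.
SLOW_QUERIES: dict[int, SlowQuery] = {
    1: SlowQuery(sql="SELECT * FROM orders WHERE status = 'OPEN'", avg_ms=840.0),
}

@app.get("/slow-queries/{query_id}")
def get_slow_query(query_id: int) -> SlowQuery:
    """Return a recorded slow query so a reviewer (or model) can suggest a rewrite."""
    if query_id not in SLOW_QUERIES:
        raise HTTPException(status_code=404, detail="unknown query id")
    return SLOW_QUERIES[query_id]

# Run with, e.g.: uvicorn app:app --reload  (assumes this file is app.py)
```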

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Noida

Work from Office

Proficient in database technologies, with a specific understanding of RDBMS such as PostgreSQL and MySQL, and NoSQL data stores such as HBase, MongoDB, etc. Database query development (writing and optimizing DB queries) and migration.

Posted 1 month ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office

- Number of Openings: 1
- ECMS ID (in sourcing stage): 530186
- Assignment Duration: 12 Months
- Total Yrs. of Experience: 7+
- Relevant Yrs. of Experience: 5 yrs
- Detailed JD (Roles and Responsibilities):
  - Develop and implement logical and physical data models to meet business requirements
  - Experience with data modeling tools (Erwin / PowerDesigner)
  - Develop, optimize, and maintain DB tables, schemas, procedures, etc.
  - Ensure DB performance through query optimization and indexing techniques
  - Strong experience with relational databases
  - Create and maintain documentation for data processes and architectures
- Mandatory skills: Data Modeling, Erwin or PowerDesigner, RDBMS
- Desired/secondary skills: Knowledge of ETL
- Vendor Rate: 8500 INR/day
- Delivery Anchor (for tracking sourcing statistics, technical evaluation, interviews, feedback, etc.): Selvakumar_R
- Work Location (given in ECMS ID): Bangalore
- Complete WFO or hybrid model (specify the days): Hybrid (3 days)
- BG Check (before or after onboarding): Before
- Any working in shifts outside standard daylight hours (to avoid confusion post-onboarding): No

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Experience: 6+ Years
Location: Bangalore, Chennai, Gurgaon

Join our engineering team as a Senior Backend Engineer and lead the development of cloud-native, scalable microservices and RESTful APIs using modern Python frameworks. You'll work with Docker, AWS, and CI/CD tools to build robust backend systems powering next-gen platforms. If you have hands-on experience with FastAPI, Flask, or Django, and are skilled in distributed systems, Kafka, and relational/NoSQL databases, we want to hear from you.

Key Responsibilities:
- Microservices Development: Design, build, and optimize microservices architecture using patterns like Service Discovery, Circuit Breaker, API Gateway, and Saga orchestration.
- REST API Engineering: Develop high-performance RESTful APIs using Python frameworks like FastAPI, Flask, or Django REST Framework.
- Cloud-Native Backend Systems: Build and deploy containerized applications using Docker. Familiarity with Kubernetes (K8s) for orchestration is a plus.
- CI/CD Automation: Create and maintain DevOps pipelines using GitLab CI/CD, GitHub Actions, or Jenkins for automated testing and deployment.
- Source Code Management: Collaborate through Git-based version control, ensuring code quality via pull requests and peer reviews on platforms like GitHub or GitLab.
- Event-Driven Architecture: Implement and manage data streaming and messaging pipelines with Apache Kafka, Amazon Kinesis, or equivalent (see the sketch after this listing).
- Database Engineering: Work with PostgreSQL, MySQL, and optionally NoSQL solutions such as MongoDB, DynamoDB, or Cassandra.
- Cloud Infrastructure: Architect and manage AWS backend services using EC2, ECS, S3, Lambda, RDS, and CloudFormation.
- Big Data Integration (Desirable): Leverage PySpark for distributed data processing and scalable ETL workflows in data engineering pipelines.
- Polyglot Collaboration: Integrate with backend services or data processors developed in Java, Scala, or other enterprise technologies.

Required Skills & Qualifications:
- Bachelor's or Master's in Computer Science, Software Engineering, or a related technical field
- 6+ years in backend development using Python
- Proven expertise in API development, microservices, and cloud-native applications
- Proficiency in SQL, database schema design, and query optimization
- Strong grasp of DevOps best practices, Git workflows, and code quality standards
- Experience with streaming platforms, message queues, or event-driven design

Nice to Have:
- Experience with Kubernetes, Terraform, or CloudWatch
- Exposure to big data tools (e.g., Spark, Airflow, Glue)
- Familiarity with Agile/Scrum methodologies and cross-functional teams

Benefits:
- Competitive salary and performance-based bonuses
- Opportunity to build next-gen backend platforms for global-scale applications
- Work with a team that values engineering best practices, code quality, and continuous learning
- Flexible work model with remote and hybrid options
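
A minimal producer/consumer pair for the event-driven bullet above, using the kafka-python client. The broker address, topic, consumer group, and payload shape are assumptions for the example, not details from the listing.

```python
import json
from kafka import KafkaConsumer, KafkaProducer

# Broker address and topic are placeholders for illustration.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order-events", {"order_id": 42, "status": "CREATED"})
producer.flush()

consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    group_id="billing-service",   # each consumer group keeps its own offsets
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # hand off to the service's domain logic here
    break                 # demo only: consume a single event and stop
```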

Posted 1 month ago

Apply

1.0 - 2.0 years

1 - 2 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: Node.js Backend Developer
Client: Neutrinos
Location: Bangalore

About the Role:
We are seeking experienced Node.js Backend Engineers to develop and optimize microservices for data-intensive applications. This is a hands-on role focused on building scalable systems, integrating with event-driven architectures, and delivering high-performance backend solutions.

Key Responsibilities:
1. Backend Development: Design, build, and maintain microservices using Node.js (Express.js or Nest.js). Ensure clean, modular architecture with scalability and maintainability as core priorities.
2. Performance Optimization: Optimize Node.js runtime performance and reduce API latency. Implement caching strategies (e.g., Redis) for throughput and response time improvements (see the cache-aside sketch after this listing).
3. Kafka Integration: Design and manage Kafka consumers/producers for event-driven microservices. Collaborate with data teams on message schemas and data flow orchestration. Use gRPC for efficient inter-service communication where necessary.
4. Observability & Monitoring: Integrate OpenTelemetry or similar tools for monitoring, tracing, and logging. Implement logging best practices and metrics for production readiness and reliability.
5. Cross-functional Collaboration: Work closely with Data Integration Engineers to ensure smooth pipeline integration. Coordinate with PostgreSQL experts on query optimization and database performance.

Desired Candidate Profile:
- Experience: Minimum 1-2 years of hands-on experience in Node.js backend development. Solid background in microservices, event-driven systems, and high-volume data processing.
- Technical Skills: Proficiency in RESTful APIs, JWT/OAuth, and core Node.js libraries. Strong knowledge of Kafka or similar messaging platforms. Familiarity with gRPC for structured, high-performance communication. Experience with observability tools like OpenTelemetry, Prometheus, or Jaeger.
- Database Expertise: Good command of PostgreSQL: writing optimized queries, indexing, and tuning. Understanding of partitioning, read/write optimization, and data modelling.
- Soft Skills & Team Collaboration: Strong team player with experience in code reviews, mentoring, and guiding juniors. Comfortable working in Agile environments, participating in sprint planning, and collaborating across teams.

Cultural Fit:
We value a high-performance mindset with attention to quality, on-time delivery, efficiency, and accuracy; passion for solving complex data challenges using modern engineering practices; and ownership, continuous learning, and a collaborative approach to problem-solving.

Why Join Us:
Be part of a dynamic team pushing the boundaries of data-intensive software engineering, working with cutting-edge technologies including Node.js microservices, Kafka-driven architectures, gRPC-based service communication, and advanced observability and monitoring.
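
The caching responsibility above is the classic cache-aside pattern, sketched here in Python with the redis client for brevity (the role itself would implement it in Node.js). The key scheme, TTL, and the stand-in database call are assumptions for illustration.

```python
import json
import redis  # pip install redis; assumes a local Redis instance

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_profile_from_postgres(user_id: int) -> dict:
    # Stand-in for a real PostgreSQL query; illustration only.
    return {"id": user_id, "name": "example"}

def fetch_user_profile(user_id: int) -> dict:
    """Cache-aside: serve from Redis when possible, fall back to the database."""
    cache_key = f"user:{user_id}"
    cached = r.get(cache_key)
    if cached is not None:
        return json.loads(cached)

    profile = load_profile_from_postgres(user_id)
    r.setex(cache_key, 300, json.dumps(profile))  # TTL bounds staleness at 5 min
    return profile
```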

Posted 1 month ago

Apply

4.0 - 6.0 years

9 - 13 Lacs

Indore, Pune

Work from Office

What will your role look like:
- Develop requirements, wireframes, and dashboards for varied audiences, covering financial and project-management metrics. These dashboards will pull from multiple data sources and require formatting and aggregating data.
- Produce compelling and informative visualizations using various native and custom chart types.
- Create and maintain relationships between visuals, filters, bookmarks, and numeric/field parameters.
- Design strategic visual interactions that enhance the end user's experience using cross-filtering and cross-highlighting.
- Write and optimize DAX expressions to create measures and calculated columns.
- Create effective data models. Maintain relationship cardinality and cross-filtering between tables.
- Manage online deployment pipelines to test and publish Power BI reports.
- Manage user roles and implement row-level security to restrict data access.
- Clean, transform, reshape, and aggregate data from different sources such as Excel, SQL Server, SharePoint, etc. Create dynamic, reusable queries using parameters.

Why you will love this role:
Besides a competitive package, an open workspace full of smart and pragmatic team members, and ever-growing opportunities for professional and personal growth, you will be part of a learning culture where teamwork and collaboration are encouraged, diversity is valued, and excellence, compassion, openness, and ownership are rewarded.

We would like you to bring along:
- Strong Power BI expertise with an in-depth understanding of dataflows, datasets, and integration with backend databases. Python exposure will be a huge plus.
- M Language: Knowledge of different data types and data structures like values, records, tables, and lists. Familiarity with built-in functions and the ability to write custom functions. Understanding of native query folding to optimize performance.
- SQL: Experience writing and optimizing queries. Strong understanding of relational databases.
- DAX: Familiarity with common filtering functions (CALCULATE, FILTER, etc.) and iteration functions (SUMX, AVERAGEX, etc.). Understanding of the use cases for Import, DirectQuery, Dual, and Live data storage modes.

Posted 1 month ago

Apply

1.0 - 5.0 years

2 - 6 Lacs

Nagercoil

Work from Office

Job Summary:
We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role is responsible for the end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications:
- 1+ years of experience in data migration, ETL, or database development roles.
- Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling).
- Proven experience using Salesforce Data Loader for bulk data operations.
- Solid understanding of Salesforce CRM architecture, including object relationships and schema design.
- Strong background in data transformation and cleansing techniques.

Nice to Have:
- Experience with large-scale data migration projects involving CRM or ERP systems.
- Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts.
- Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus.
- Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities:
- Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL).
- Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation.
- Utilize Salesforce Data Loader and/or Apex Data Loader CLI to manage high-volume data imports and exports.
- Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail).
- Perform data cleansing, de-duplication, and transformation to ensure quality and consistency.
- Troubleshoot and resolve data-related issues, load failures, and anomalies.
- Collaborate with cross-functional teams to gather data mapping requirements and ensure accurate system integration.
- Ensure data integrity, adhere to compliance standards, and document migration processes and mappings.
- Independently analyze, troubleshoot, and resolve data-related issues effectively.
- Follow best practices for data security, performance tuning, and migration efficiency.

Posted 1 month ago

Apply

5.0 - 9.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Educational Qualification: Bachelor of Engineering, Bachelor of Computer Applications, Master of Computer Applications
Service Line: Application Development and Maintenance

Responsibilities:
- Analyze user requirements; envision system features and functionality.
- Design, build, and maintain efficient, reusable, and reliable .NET code, setting expectations and feature priorities throughout the development life cycle.
- Identify bottlenecks and bugs, and recommend system solutions by comparing the advantages and disadvantages of custom development.
- Contribute to team meetings; troubleshoot development and production problems across multiple environments and operating platforms.
- Understand the architecture and ensure effective design, development, validation, and support activities.

Additional Responsibilities:
- Minimum 5 years of relevant .NET experience with team-handling experience
- Design experience using best practices, design patterns, SDLC, OOP, OOD
- Experience leading and mentoring teams
- Experience developing applications using SQL databases, schemas, and SQL queries
- Experience with Git and version control systems
- Skilled in database constructs, schema design, SQL Server or Oracle, SQL queries, and query optimization
- Hands-on experience with MSTest or NUnit, mocking frameworks, Jasmine, Karma, Cucumber
- Solid understanding of object-oriented programming
- Experience with both external and embedded databases
- Creating database schemas that represent and support business processes
- Implementing automated testing platforms and unit tests
- Good verbal and written communication skills
- Ability to communicate effectively with remote teams
- High flexibility to travel
- Strong analytical, logical, and team-leading skills

Technical and Professional Skills: .NET, ASP.NET, MVC, C#, WPF, WCF, SQL Server, Entity Framework
Preferred Skills: Technology - Microsoft Technologies - .NET Frameworks; Technology - ASP.NET - ASP.NET Web API

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Your responsibilities:
- Develop and implement scalable applications using a mix of Microsoft technologies (Power Platform, Power Automate, .NET, SQL Server) and Pega, following best practices for architecture, coding, and design patterns.
- Build high-quality, maintainable, and test-covered solutions, ensuring seamless deployment across cloud (Azure, AWS) and on-premises environments with a focus on security and compliance.
- Design, develop, and integrate APIs and automation workflows leveraging Power Automate, Pega, and cloud-native services to enable seamless interoperability and process automation.
- Troubleshoot and resolve complex implementation, environment, and deployment issues across both Pega and Microsoft stacks, minimizing downtime and ensuring system reliability.
- Develop and automate comprehensive testing frameworks (unit, system) and CI/CD pipelines using tools like Azure DevOps and GitHub to support continuous integration and delivery.
- Analyze business requirements to translate them into robust technical solutions, applying secure development practices, especially in payments processing and enterprise integrations.
- Leverage agentic AI, advanced analytics, and data-driven insights to automate workflows, optimize processes, and enhance system intelligence within Pega and Microsoft environments.
- Stay current with emerging technologies and industry trends in Pega, Microsoft Power Platform, AI, and cloud computing, integrating new best practices into development workflows.
- Collaborate with cross-functional teams, including Solution Architects and SMEs, to prototype, validate, and refine scalable, enterprise-grade solutions.
- Develop, review, and maintain architecture artifacts, reference models, and platform initiatives impacting Pega, Power Platform, Azure, and other cloud ecosystems.
- Apply a solid understanding of payment operations to ensure software solutions support secure, compliant, and efficient transaction processing aligned with business workflows.
- Implement monitoring and observability capabilities to track application performance, detect issues early, and ensure system health across Pega and Microsoft platforms.

Your skills & your experience:
- 7-10 years of experience with Power Platform (Power Automate, Power Apps), .NET, and SQL Server
- Strong expertise in database design, schema development, and query optimization
- Experience developing scalable, secure enterprise applications on cloud and on-premises
- API design, development, and system integration, with some Pega development experience
- Troubleshooting complex deployment and performance issues across Pega, Microsoft, and database layers
- CI/CD pipeline automation using Azure DevOps, GitHub
- Knowledge of secure payments processing, database encryption, and compliance standards
- Monitoring and observability tools for database and system health management
- Experience with AI, advanced analytics, and workflow automation
- Collaboration with cross-functional teams and stakeholders
- Understanding of payment operations, business workflows, and data security best practices

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Your responsibilities:
- Develop and implement scalable applications using a mix of Microsoft technologies (Power Platform, Power Automate, .NET, SQL Server) and Pega, following best practices for architecture, coding, and design patterns.
- Build high-quality, maintainable, and test-covered solutions, ensuring seamless deployment across cloud (Azure, AWS) and on-premises environments with a focus on security and compliance.
- Design, develop, and integrate APIs and automation workflows leveraging Power Automate, Pega, and cloud-native services to enable seamless interoperability and process automation.
- Troubleshoot and resolve complex implementation, environment, and deployment issues across both Pega and Microsoft stacks, minimizing downtime and ensuring system reliability.
- Develop and automate comprehensive testing frameworks (unit, system) and CI/CD pipelines using tools like Azure DevOps and GitHub to support continuous integration and delivery.
- Analyze business requirements to translate them into robust technical solutions, applying secure development practices, especially in payments processing and enterprise integrations.
- Leverage agentic AI, advanced analytics, and data-driven insights to automate workflows, optimize processes, and enhance system intelligence within Pega and Microsoft environments.
- Stay current with emerging technologies and industry trends in Pega, Microsoft Power Platform, AI, and cloud computing, integrating new best practices into development workflows.
- Collaborate with cross-functional teams, including Solution Architects and SMEs, to prototype, validate, and refine scalable, enterprise-grade solutions.
- Develop, review, and maintain architecture artifacts, reference models, and platform initiatives impacting Pega, Power Platform, Azure, and other cloud ecosystems.
- Apply a solid understanding of payment operations to ensure software solutions support secure, compliant, and efficient transaction processing aligned with business workflows.
- Implement monitoring and observability capabilities to track application performance, detect issues early, and ensure system health across Pega and Microsoft platforms.

Your skills & your experience:
- 5-7 years of experience with Power Platform (Power Automate, Power Apps), .NET, and SQL Server
- Strong expertise in database design, schema development, and query optimization
- Experience developing scalable, secure enterprise applications on cloud and on-premises
- API design, development, and system integration, with some Pega development experience
- Troubleshooting complex deployment and performance issues across Pega, Microsoft, and database layers
- CI/CD pipeline automation using Azure DevOps, GitHub
- Knowledge of secure payments processing, database encryption, and compliance standards
- Monitoring and observability tools for database and system health management
- Experience with AI, advanced analytics, and workflow automation
- Collaboration with cross-functional teams and stakeholders
- Understanding of payment operations, business workflows, and data security best practices

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Educational Qualification: Bachelor of Computer Applications, Bachelor of Engineering, Bachelor of Technology, Master of Technology, Master of Engineering, Master of Science
Service Line: Cloud & Infrastructure Services

Roles and Responsibilities:
- Set up and support HA/DR solutions and replication.
- Lead efforts related to system and SQL performance tuning, and index/partition creation and management.
- Set up log shipping and mirroring/log forwarding, and analyze traces.
- Architect, design, implement, and administer database consolidation platforms.
- Perform duties including monitoring, software installs and upgrades, scripting, automation, incident response, and documentation.
- Perform DB restores and point-in-time recovery.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Experience troubleshooting and resolving database integrity issues, performance issues, blocking and deadlocking issues, replication issues, log shipping issues, connectivity issues, security issues, etc.
- Hands-on experience with performance tuning, query optimization, and monitoring and troubleshooting tools.
- Solid understanding of indexes, index management, integrity checks, configuration, and patching: how statistics work, how indexes are stored, and how they can be created and managed effectively.

Technical and Professional Skills: Technology | Database Administration | MS SQL Server; Technology | Database Administration | Oracle DBA; Technology | Database Administration | PostgreSQL
Preferred Skills: Technology - Database - ALL; Technology - Database - Oracle Database; Technology - Database Administration - MS SQL Server - SQL Server; Technology - Database Administration - PostgreSQL

Posted 1 month ago

Apply

12.0 - 17.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP HANA DB Administration
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: BTech

Key duties:
- Install, configure, and administer SAP HANA databases across various environments.
- Perform database upgrades, patches, and system refreshes.
- Monitor and optimize SAP HANA memory, CPU, and disk utilization to ensure high availability and reliability.
- Implement and manage SAP HANA high availability (HA) and disaster recovery (DR) solutions.
- Conduct performance tuning and query optimization to enhance system efficiency.
- Manage backup, restore, and disaster recovery procedures for SAP HANA databases.
- Troubleshoot and resolve database-related issues, working closely with application teams and SAP support.
- Ensure database security, user management, and authorization compliance in accordance with IT policies.
- Collaborate with SAP BASIS, infrastructure, and security teams to support SAP applications running on HANA.
- Document best practices, troubleshooting steps, and operational procedures for SAP HANA administration.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Collaborate with stakeholders to understand application requirements.
- Integrate functional, security, integration, performance, quality, and operations requirements.
- Review and integrate technical architecture requirements.
- Provide input into final decisions regarding hardware, network products, system software, and security.
- Ensure successful integration of application and technical architecture.
- Analyze requirements and provide solutions to problems.
- Manage and coordinate with team members.
- Stay updated with industry trends and advancements.
- Conduct research and make recommendations for improvements.
- Ensure compliance with coding standards and best practices.
- Identify and resolve technical issues and bugs.
- Collaborate with cross-functional teams to deliver high-quality solutions.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP HANA DB Administration.
- Good-To-Have Skills: Experience with SAP BASIS Administration.
- Strong understanding of database administration principles.
- Experience in managing and optimizing SAP HANA databases.
- Knowledge of SAP HANA architecture and components.
- Experience in performance tuning and troubleshooting.
- Familiarity with SAP HANA security and authorization concepts.
- Experience in backup and recovery strategies for SAP HANA databases.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP HANA DB Administration.
- This position is based at our Pune office.
- A 15-year full-time education is required.

Qualification: BTech

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 16 Lacs

Hyderabad

Work from Office

- Bachelor's Degree in Computer Science, Information Technology, or an equivalent degree and/or experience
- 10+ years of software development experience

Primary Skill:
- 8+ years of experience as an MS SQL DB Developer
- Should know SSRS (report creation and server configuration)
- Hands-on experience with the following DB development skills: SQL querying, join operations, SP writing, SQL functions / triggers / indexing / temp tables, CTEs, cursors, query optimization, tables/views (creation/update), system objects, SQL Server security, synonyms, SQL Profiler, Query Analyzer
- GitLab CI/CD pipeline
- PowerShell scripting
- Google Cloud Firestore and Firebase Hosting skills and experience will be an added advantage

Posted 1 month ago

Apply

12.0 - 17.0 years

6 - 10 Lacs

Mumbai

Work from Office

Role Overview:
We are looking for an experienced Denodo SME to design, implement, and optimize data virtualization solutions using Denodo as the enterprise semantic and access layer over a Cloudera-based data lakehouse. The ideal candidate will lead the integration of structured and semi-structured data across systems, enabling unified access for analytics, BI, and operational use cases.

Key Responsibilities:
- Design and deploy the Denodo Platform for data virtualization over Cloudera, RDBMS, APIs, and external data sources.
- Define logical data models, derived views, and metadata mappings across layers (integration, business, presentation).
- Connect to Cloudera Hive, Impala, Apache Iceberg, Oracle, and other on-prem/cloud sources.
- Publish REST/SOAP APIs and JDBC/ODBC endpoints for downstream analytics and applications.
- Tune virtual views, caching strategies, and federation techniques to meet performance SLAs for high-volume data access.
- Implement Denodo smart query acceleration, usage monitoring, and access governance.
- Configure role-based access control (RBAC) and row/column-level security, and integrate with enterprise identity providers (LDAP, Kerberos, SSO).
- Work with data governance teams to align Denodo with enterprise metadata catalogs (e.g., Apache Atlas, Talend).

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 8-12 years in data engineering, with 4+ years of hands-on experience in the Denodo Platform.
- Strong experience integrating RDBMS (Oracle, SQL Server), Cloudera CDP (Hive, Iceberg), and REST/SOAP APIs.
- Denodo Admin Tool, VQL, Scheduler, Data Catalog; SQL, shell scripting, basic Python (preferred).
- Deep understanding of query optimization, caching, memory management, and federation principles.
- Experience implementing data security, masking, and user access control in Denodo.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

- Manage and optimize Azure Cosmos DB, ensuring efficient partitioning, indexing, and performance tuning (see the sketch after this listing).
- Maintain .NET Core applications, ensuring seamless database connectivity and high performance.
- Monitor and troubleshoot Azure database infrastructure, including Cosmos DB, Redis Cache, and Azure SQL.
- Implement backup, disaster recovery, and high availability strategies across multiple regions.
- Automate database operations, provisioning, and monitoring using Azure DevOps (CI/CD) and IaC (Terraform, Bicep, ARM).
- Work with APIM, App Services, Function Apps, and Logic Apps for cloud-native database solutions.
- Optimize Azure Storage Containers, Cognitive Search, and Form Recognizer for data processing and retrieval.
- Ensure database security, authentication (OAuth, JWT), and compliance with PMI standards.
- Strong expertise in query optimization, performance troubleshooting, and RU cost management in Cosmos DB.
- Hands-on experience with Azure Monitor, Log Analytics, and Application Insights for proactive monitoring and performance insights.
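
One concrete lever for the RU cost management mentioned above is keeping queries single-partition. A minimal sketch with the azure-cosmos Python SDK follows; the account endpoint, database/container names, and the customer-42 partition key are placeholders, not details from the listing.

```python
from azure.cosmos import CosmosClient  # pip install azure-cosmos

# Endpoint, key, and names are placeholders for illustration.
client = CosmosClient("https://myaccount.documents.azure.com:443/",
                      credential="<account-key>")
container = client.get_database_client("appdb").get_container_client("orders")

# Scoping the query to one partition key avoids a cross-partition fan-out,
# which is the main lever for keeping RU charges low and predictable.
items = container.query_items(
    query="SELECT c.id, c.total FROM c WHERE c.status = @status",
    parameters=[{"name": "@status", "value": "OPEN"}],
    partition_key="customer-42",
)
for item in items:
    print(item)
```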

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Hyderabad

Work from Office

- Manage and optimize Azure Cosmos DB, ensuring efficient partitioning, indexing, and performance tuning.
- Maintain .NET Core applications, ensuring seamless database connectivity and high performance.
- Monitor and troubleshoot Azure database infrastructure, including Cosmos DB, Redis Cache, and Azure SQL.
- Implement backup, disaster recovery, and high availability strategies across multiple regions.
- Automate database operations, provisioning, and monitoring using Azure DevOps (CI/CD) and IaC (Terraform, Bicep, ARM).
- Work with APIM, App Services, Function Apps, and Logic Apps for cloud-native database solutions.
- Optimize Azure Storage Containers, Cognitive Search, and Form Recognizer for data processing and retrieval.
- Ensure database security, authentication (OAuth, JWT), and compliance with PMI standards.
- Strong expertise in query optimization, performance troubleshooting, and RU cost management in Cosmos DB.
- Hands-on experience with Azure Monitor, Log Analytics, and Application Insights for proactive monitoring and performance insights.

Posted 1 month ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Gurugram

Work from Office

Urgent Opening for Sr. Database Administrator
Posted On: 04th Jul 2019 12:14 PM
Location: Gurgaon
Role / Position: Senior Database Administrator
Experience (required): 15 plus years

Description:
Designation: Senior Database Administrator
Location: Gurgaon

Primary responsibilities of this role include owning, tracking, and resolving database-related incidents and requests, and participating in the design of database architecture for current and future products. The SQL Server DBA will be responsible for the implementation, configuration, maintenance, and performance of critical SQL Server RDBMS systems, to ensure the availability and consistent performance of our applications. This is a hands-on position requiring solid technical skills, as well as excellent interpersonal and communication skills. The successful candidate will be responsible for the development and sustainment of the SQL Server, ensuring its operational readiness (security, health, and performance), executing data loads, and performing data modelling in support of multiple development teams. The OLTP databases and data warehouse support an enterprise application suite of program management tools. Must be capable of working independently and collaboratively, troubleshooting and resolving database integrity, performance, blocking, and HA issues, with knowledge of SQL Server tools (Profiler, DTA, SSMS, PerfMon, DMVs, etc.).

Responsibilities:
- Respond to database-related alerts and escalations, and work with research/development teams to implement strategic solutions.
- Conduct SQL Server lunch-and-learn sessions for application developers to share domain and technical expertise.
- Multi-task and work with a variety of people.
- Provide data to business users for analysis.
- Create complex analytic queries on large data sets.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Provide hands-on support of large databases, including, but not limited to, monitoring, re-indexing, general database maintenance, extract-transformation-load, backup/recovery, documentation, and configuration.

Qualification:
- 12+ years of experience as a Microsoft SQL Server database administrator, with development using T-SQL and SSIS under MS SQL Server 2005/2008/2012/2014.
- Experience upgrading databases from 2005/2008 to 2012/2014.
- Strong knowledge of high-availability architecture, including clustering and AlwaysOn, with hands-on knowledge of the MS SQL Business Intelligence products: Analysis Services, Reporting Services, and Integration Services.
- Experience designing logical and physical databases for OLTP and data warehouses.
- Experience in performance tuning and query optimization, using Performance Monitor, SQL Profiler, and other related monitoring and troubleshooting tools.
- Ability to identify and troubleshoot SQL Server related CPU, memory, I/O, disk space, and other resource contention.
- SQL development: ability to write and troubleshoot SQL code and designs (stored procedures, functions, tables, views, triggers, indexes, constraints).
- SQL development: experience creating database architecture with associated schemas as part of a software design process.
- Familiarity with Windows Server, security delegation, SPNs, and storage components.
- Ability to network with documentation and testing teams to accomplish the creation of processes and procedures.
- A highly self-motivated individual with the ability to work effectively in a collaborative, team-oriented IT environment.
- Must have 2-3 years of experience as a .NET developer with a strong understanding of database structures, theories, principles, and practices.

Send resumes to: girish.expertiz@gmail.com

Posted 1 month ago

Apply

6.0 - 9.0 years

27 - 42 Lacs

Pune

Work from Office

About the role:
As a Big Data Engineer, you will make an impact by identifying and closing consulting services in the major UK banks. You will be a valued member of the BFSI team and work collaboratively with your manager, primary team, and other stakeholders in the unit.

In this role, you will:
- Collaborate with cross-functional teams to improve data ingestion, transformation, and validation workflows
- Work closely with Data Engineers, Architects, and Analysts to understand data reconciliation requirements
- Develop and implement PySpark programs to process large datasets in Big Data platforms
- Analyze and comprehend existing data ingestion and reconciliation frameworks
- Perform complex transformations, including reconciliation and advanced data manipulations
- Fine-tune Spark jobs for performance optimization, ensuring efficient data processing at scale

Work model:
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role's business requirements, this is a hybrid position requiring 3 days a week in a client or Cognizant office in the Pune/Hyderabad location. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.

What you must have to be considered:
- Design and implement data pipelines, ETL processes, and data storage solutions that support data-intensive applications
- Extensive hands-on experience with Python and PySpark
- Strong grasp of data warehousing concepts; well versed in structured and semi-structured (JSON, XML, Avro, Parquet) data processing using Spark/PySpark data pipelines
- Experience working with large-scale distributed data processing, and a solid understanding of Big Data architecture and distributed computing frameworks
- Proficiency in Python and the Spark DataFrame API, and strong experience in complex data transformations using PySpark

These will help you stand out:
- Able to leverage Python libraries such as cryptography or pycryptodome along with PySpark's user-defined functions (UDFs) to encrypt and decrypt data within your Spark workflows (see the sketch after this listing)
- Experience with data risk metrics in PySpark; excellent at data partitioning, Z-value generation, query optimization, spatial data processing, and optimization
- Experience with CI/CD for data pipelines
- Working experience in any of the cloud environments: AWS/Azure/GCP
- Proven experience in an Agile/Scrum team environment
- Experience in the development of loosely coupled API-based systems

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
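
The encrypt-inside-Spark idea from the stand-out list can be sketched as follows, using cryptography's Fernet inside a PySpark UDF. The column names and inline key are illustrative only; a real pipeline would fetch the key from a secret store rather than generate it in the job.

```python
from cryptography.fernet import Fernet
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("encrypt-demo").getOrCreate()

# Illustrative only: in production, load the key from a secret manager.
key = Fernet.generate_key()

def encrypt_value(plain: str) -> str:
    # Each executor rebuilds a Fernet from the key bytes captured in the closure.
    return Fernet(key).encrypt(plain.encode("utf-8")).decode("utf-8")

encrypt_udf = udf(encrypt_value, StringType())

df = spark.createDataFrame([("alice", "GB29NWBK60161331926819")], ["name", "iban"])
df.withColumn("iban_enc", encrypt_udf("iban")).show(truncate=False)
```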

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

Skill: Snowflake Developer with dbt (Data Build Tool), ADF, and Python

Job Description:
We are looking for a Data Engineer with experience in data warehouse projects, strong expertise in Snowflake, and hands-on knowledge of Azure Data Factory (ADF) and dbt (Data Build Tool). Proficiency in Python scripting will be an added advantage.

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
- Work extensively with Snowflake, ensuring efficient data modeling and query optimization (see the sketch after this listing).
- Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
- Implement data transformations, testing, and documentation using dbt.
- Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
- Troubleshoot data-related issues.
- (Optional) Utilize Python for scripting, automation, and data processing tasks.

Required Skills & Qualifications:
- Experience in data warehousing with a strong understanding of best practices.
- Hands-on experience with Snowflake (data modeling, query optimization).
- Proficiency in Azure Data Factory (ADF) for data pipeline development.
- Strong working knowledge of dbt (Data Build Tool) for data transformations.
- (Optional) Experience in Python scripting for automation and data manipulation.
- Good understanding of SQL and query optimization techniques.
- Experience in cloud-based data solutions (Azure).
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Experience with CI/CD pipelines for data engineering.
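
For the Snowflake query-optimization responsibility, a common first step is pulling the slowest recent statements from QUERY_HISTORY. A minimal sketch with the snowflake-connector-python client follows; the connection parameters are placeholders, and the query assumes a role with access to the information schema.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders for illustration.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="ETL_USER",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # List statements that ran longer than 60 seconds, worst first.
    cur.execute(
        """
        SELECT query_text, total_elapsed_time
        FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
        WHERE total_elapsed_time > 60000
        ORDER BY total_elapsed_time DESC
        LIMIT 10
        """
    )
    for query_text, elapsed_ms in cur.fetchall():
        print(elapsed_ms, query_text[:80])
finally:
    cur.close()
    conn.close()
```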

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Administer PostgreSQL databases, ensuring optimal performance, security, and backup processes. Work on data migration, query optimization, and troubleshooting issues to ensure database reliability.
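
For the query-optimization half of this role, the usual starting point is reading real execution plans. A minimal sketch using psycopg2 follows; the DSN and the orders table are placeholders for illustration.

```python
import psycopg2  # pip install psycopg2-binary

# DSN is a placeholder for illustration.
conn = psycopg2.connect("dbname=appdb user=dba host=localhost")
with conn, conn.cursor() as cur:
    # EXPLAIN (ANALYZE, BUFFERS) runs the statement and reports the actual
    # plan, row counts, and buffer hits -- the raw material of query tuning.
    cur.execute(
        "EXPLAIN (ANALYZE, BUFFERS) "
        "SELECT * FROM orders WHERE customer_id = %s",
        (42,),
    )
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```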

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Chennai

Work from Office

Manage and maintain MySQL databases, ensuring optimal performance, security, and data integrity. You will handle database backups, troubleshooting, performance tuning, and ensure high availability. Strong experience with MySQL database administration, SQL, and optimization is required.

Posted 1 month ago

Apply