869 Indexes Jobs - Page 9

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: AI/GenAI Engineer
Job ID: POS-13731
Primary Skill: Databricks, ADF
Secondary Skills: Python, LLM, LangChain, Vectors, and AWS
Location: Hyderabad
Mode of Work: Work from Office
Experience: 2-3 years

About The Job
We are seeking a highly motivated and innovative Generative AI Engineer to join our team and drive the exploration of cutting-edge AI capabilities. You will be at the forefront of developing solutions using Generative AI technologies, primarily focusing on Large Language Models (LLMs) and foundation models, deployed on either AWS or Azure cloud platforms. This role involves rapid prototyping, experimentation, and collaboration with various stakeholders to assess the feasibility and potential impact of GenAI solutions on our business challenges. If you are passionate about the potential of GenAI and enjoy hands-on building in a fast-paced environment, this is the role for you.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
Develop GenAI Solutions: Develop and rapidly iterate on GenAI solutions leveraging LLMs and other foundation models available on AWS and/or Azure platforms.
Cloud Platform Implementation: Utilize relevant cloud services (e.g., AWS SageMaker, Bedrock, Lambda, Step Functions; Azure Machine Learning, Azure OpenAI Service, Azure Functions) for model access, deployment, and data processing.
Explore GenAI Techniques: Experiment with and implement techniques like Retrieval-Augmented Generation (RAG), evaluating the feasibility of model fine-tuning or other adaptation methods for specific PoC requirements.
API Integration: Integrate GenAI models (via APIs from cloud providers, OpenAI, Hugging Face, etc.) into prototype applications and workflows.
Data Handling for AI: Prepare, manage, and process data required for GenAI tasks, such as data for RAG indexes, datasets for evaluating fine-tuning feasibility, or example data for few-shot prompting.
Documentation & Presentation: Clearly document PoC architectures, implementation details, findings, limitations, and results for both technical and non-technical audiences.

Requirements
Overall 2-3 years of experience.
Expert in Python with advanced programming concepts.
Solid understanding of Generative AI concepts, including LLMs, foundation models, prompt engineering, embeddings, and common architectures (e.g., RAG).
Demonstrable experience working with at least one major cloud platform (AWS or Azure).
Hands-on experience using cloud-based AI/ML services relevant to GenAI (e.g., AWS SageMaker, Bedrock; Azure Machine Learning, Azure OpenAI Service).
Experience interacting with APIs, particularly AI/ML model APIs.
Bachelor's degree in Computer Science, AI, Data Science, or equivalent practical experience.
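For a sense of what a RAG proof-of-concept involves, a minimal sketch in plain Python is shown below; the embed_text and call_llm helpers are hypothetical placeholders for whichever provider SDK (Bedrock, Azure OpenAI, etc.) the PoC targets.

    from math import sqrt

    def embed_text(text: str) -> list[float]:
        # Hypothetical placeholder: call your embedding model here (e.g. via a cloud SDK).
        raise NotImplementedError

    def call_llm(prompt: str) -> str:
        # Hypothetical placeholder: call your LLM endpoint here.
        raise NotImplementedError

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

    def answer_with_rag(question, documents):
        # Embed the question and every candidate document, then keep the closest matches.
        q_vec = embed_text(question)
        ranked = sorted(documents, key=lambda d: cosine(q_vec, embed_text(d)), reverse=True)
        context = "\n\n".join(ranked[:3])
        # Ground the model's answer in the retrieved context -- the essence of RAG.
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return call_llm(prompt)

In a real PoC the document embeddings would be precomputed and stored in a vector index rather than recomputed per question.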
About The Company Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT Services and Solutions to Insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers’ initiatives. ValueMomentum is amongst the top 10 insurance-focused IT services firms in North America by number of customers. Leading Insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives. Benefits We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some benefits that are available to you are: Competitive compensation package. Career Advancement: Individual Career Development, coaching and mentoring programs for professional and leadership skill development. Comprehensive training and certification programs. Performance Management: Goal Setting, continuous feedback and year-end appraisal. Reward & recognition for the extraordinary performers.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Job Description
Perform DBA tasks such as SQL Server installation, backups, and configuring HADR, clustering, Availability Groups (AG), and log shipping.
Install, configure, and maintain PostgreSQL database systems.
Install, configure, and maintain MongoDB (NoSQL) database systems.
AWS and Azure knowledge; database server performance analysis.
Audit and compliance knowledge of databases.
Design database solutions using tables, stored procedures, functions, views, and indexes.
Data transfer from the Dev environment to Production and other related environments.
Schema comparison, bulk operations, and server-side coding.
Understanding of normalization, denormalization, primary keys, foreign keys and constraints, transactions, ACID, indexes as an optimization tool, and views.
Work with the Database Manager in creating physical tables from logical models.
ETL, data migration (using CSV, Excel, TXT files), and ad-hoc reporting.
Migration of databases from older versions of SQL Server to new versions.
Distributed DBs, remote servers, and configuring linked servers.
Integrating SQL Server with Oracle using open queries (OPENQUERY).

Job Requirements/Qualifications
Good knowledge of MongoDB and PostgreSQL, both on-prem and cloud-native servers.
Good knowledge of all SQL Server versions, including the SaaS and PaaS models.
SSIS, SSRS, and bulk copy tools like BCP, DTS, etc.
Good knowledge of DML, DDL, ETL, table-level backups, system objects, and backend data upload techniques.
Performance tuning.
Good in Excel (pivoting and analysis).
Familiarity with ISO 20000, ISO 27001, PCI/DSS, and other related standards.
Excellent communication skills (interpersonal, verbal & written).
Ability to multi-task and manage multiple priorities.
Should have high energy and a passion for helping people; customer focused.
Should be able to work 24/7, including night shifts.
Demonstrate ownership, commitment, and accountability.
Ability to build and maintain efficient working relationships, with strong people management skills.
Ability to interact at all levels.
Candidate should be a Computer Science graduate or equivalent.
The candidate should have completed 5 years of database administration on MSSQL or in a similar role.
The candidate should have completed 2 years of database administration on PostgreSQL & MongoDB. (ref:hirist.tech)
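As a small illustration of the MongoDB side of this work, the following sketch creates a compound index and lists existing indexes with pymongo; the connection string, database, and collection names are hypothetical.

    from pymongo import MongoClient, ASCENDING, DESCENDING

    # Hypothetical connection string; in practice this would point at your replica set.
    client = MongoClient("mongodb://localhost:27017")
    orders = client["sales"]["orders"]

    # Compound index supporting queries that filter by customer and sort by newest order.
    orders.create_index(
        [("customer_id", ASCENDING), ("created_at", DESCENDING)],
        name="ix_customer_created",
    )

    # List the indexes on the collection to verify what the query planner can use.
    for ix in orders.list_indexes():
        print(ix["name"], ix["key"])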

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Hands-on experience in Informatica Intelligent Cloud Services (IICS).
Should have expertise in performance tuning of Informatica mappings and sessions and in moving objects from the development to the production environment.
Hands-on experience in all aspects of the project development life cycle in Data Warehousing.
Expertise in Informatica PowerCenter (10.x/9.x/8.x), Mappings, Mapplets, Transformations, Workflow Manager, Workflow Monitor, Repository Manager, FACT tables, Dimension tables, Star Schema modelling, OLTP, and OLAP.
Extensive experience with data extraction, transformation, and loading from disparate data sources into a common reporting and analytical data model using Informatica.
Should have experience loading data from heterogeneous sources such as SAP, Oracle, XML, and flat files.
Hands-on experience in code migration via export and import of objects and Deployment Groups.
Experience with PL/SQL coding and normalized database creation in relational databases like Oracle 12c/11g/10g.
Experience with UNIX shell scripting, PL/SQL, Oracle tuning and indexes, function-based indexes, and optimizing SQL to improve performance. (ref:hirist.tech)
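The Oracle index and tuning items above usually come down to matching an index to how a query actually filters; a minimal sketch, assuming the python-oracledb driver and a hypothetical employees table, is shown below.

    import oracledb

    # Hypothetical credentials and DSN.
    conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # A function-based index lets WHERE UPPER(last_name) = ... use an index
    # instead of forcing a full table scan.
    cur.execute("CREATE INDEX emp_upper_name_ix ON employees (UPPER(last_name))")

    # Check the optimizer's plan for the query the index is meant to serve.
    cur.execute("EXPLAIN PLAN FOR SELECT * FROM employees WHERE UPPER(last_name) = 'KING'")
    for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
        print(line)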

Posted 2 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Udaipur, Tripura, India

On-site

Location: Udaipur/Hybrid.

Key Responsibilities
Database Development: Design, develop, and maintain PostgreSQL databases within Kubernetes and Dockerized environments.
Data Pipeline Management: Implement and manage data pipelines using Azure Data Factory and Azure Synapse Analytics.
Performance Optimization: Optimize and maintain stored procedures, functions, views, and indexes for efficient data access and performance.
Data Modeling: Develop and document logical and physical data models, including Star and Snowflake schemas, for enterprise-scale solutions.
Schema Analysis: Analyze existing schemas and create ER diagrams and technical documentation for future-state database designs.
Collaboration: Collaborate with application developers and customer teams to understand data requirements and translate them into scalable database solutions.
CI/CD Implementation: Implement CI/CD pipelines and manage infrastructure as code (IaC) to automate environment provisioning and deployment.
Performance Tuning: Conduct performance tuning and apply best practices for database optimization and high availability.

Requirements
Experience: 6-8 years of experience in database development with a strong focus on PostgreSQL.
Enterprise Implementation: Proven experience working on enterprise-level implementations.
Containerization: Hands-on experience with PostgreSQL in containerized environments (Kubernetes/Docker).
Azure Services: Proficient in Azure services, especially Azure Synapse, Data Factory, and related data solutions.
SQL Programming: Strong SQL programming skills with expertise in writing complex stored procedures and functions.
Performance Tuning: Experience in database performance tuning, indexing strategies, and query optimization.
CI/CD Tools: Understanding of CI/CD tools and infrastructure automation using scripts or tools like Terraform, ARM templates, etc.
Collaboration Skills: Ability to work collaboratively and independently, with strong problem-solving and communication skills. (ref:hirist.tech)

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role Overview
We are seeking an experienced and highly skilled PostgreSQL Database Administrator (DBA) to manage, maintain, and optimize our PostgreSQL database systems. The ideal candidate will be responsible for ensuring database availability, security, performance, and scalability. You will work closely with application developers, system engineers, and DevOps teams to provide high-quality data solutions and troubleshoot complex issues in a mission-critical environment.

Responsibilities:
Install, configure, and upgrade PostgreSQL databases in high-availability environments.
Design and implement database architecture, including replication, partitioning, and sharding.
Perform daily database administration tasks including backups, restores, monitoring, and tuning.
Optimize queries, indexes, and overall performance of PostgreSQL systems.
Ensure high availability and disaster recovery by configuring replication (streaming, logical) and backup solutions (pgBackRest, Barman, WAL archiving).
Implement and maintain security policies, user access control, and encryption.
Monitor database health using tools such as pgAdmin, Nagios, or Zabbix.
Troubleshoot database-related issues in development, test, and production environments.
Automate routine tasks using shell scripting, Python, or Ansible.
Work with DevOps/SRE teams to integrate PostgreSQL into CI/CD pipelines and cloud environments.

Technical Skills:
PostgreSQL Expertise: Proven experience with PostgreSQL 11+ (latest version experience preferred). Strong knowledge of SQL, PL/pgSQL, database objects, and data types. Experience with PostgreSQL replication: streaming, logical, and hot standby. Deep understanding of VACUUM, ANALYZE, and autovacuum configuration and tuning. Knowledge of PostGIS, pgBouncer, and pg_stat_statements is a plus.
Tuning & Monitoring: Query optimization and slow query analysis using EXPLAIN and ANALYZE. Experience with database performance monitoring tools (e.g., pg_stat_activity, pgBadger). Strong debugging and troubleshooting of locking, deadlocks, and resource contention.
Cloud & DevOps Integration: PostgreSQL experience on AWS RDS, Azure Database for PostgreSQL, or GCP Cloud SQL. Familiarity with IaC tools like Terraform or CloudFormation is a plus. Experience with CI/CD integration and containerization tools (Docker, Kubernetes) for databases.
Security & Compliance: Implement role-based access control, data masking, and audit logging. Ensure compliance with standards like GDPR, ISO 27001, or SOC 2 where applicable.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Minimum 4+ years of experience in PostgreSQL database administration.
PostgreSQL certification (e.g., EDB Certified Associate/Professional) is a plus.
Experience in 24x7 production environments supporting high-volume systems.

Preferred Experience:
Exposure to multi-tenant architectures.
Experience migrating from Oracle/MySQL to PostgreSQL.
Knowledge of NoSQL systems (MongoDB, Redis) is a plus.
Understanding of data warehousing and ETL processes. (ref:hirist.tech)
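To make the query-optimization duty concrete, a minimal sketch (assuming psycopg2 and a hypothetical orders table) that captures an execution plan and adds a supporting index is shown below.

    import psycopg2

    # Hypothetical DSN; point this at the instance under investigation.
    conn = psycopg2.connect("dbname=appdb user=dba host=localhost")
    conn.autocommit = True

    with conn.cursor() as cur:
        # EXPLAIN ANALYZE actually runs the statement and reports timing per plan node,
        # which is the usual starting point for slow-query analysis.
        cur.execute(
            "EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM orders WHERE customer_id = %s",
            (42,),
        )
        for (line,) in cur.fetchall():
            print(line)

        # A sequential scan on a large table here usually points at a missing index, e.g.:
        cur.execute("CREATE INDEX IF NOT EXISTS ix_orders_customer ON orders (customer_id)")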

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-72171-1

Job Description
Role Title: AVP, Enterprise Logging & Observability (L11)

Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work for by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation for All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, including Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.

Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization's centralized logging and observability platform. This role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. This role leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. This position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance, audit and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.

Key Responsibilities
Splunk Development & Platform Management: Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.
Splunk ITSI Implementation & Management: Develop and configure ITSI services, entities, and correlation searches. Implement notable events aggregation policies and automate response actions.
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring.
Security & Compliance Enablement: Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI). Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability.
Stakeholder Engagement: Act as a strategic advisor and point of contact for business units, application, infrastructure, and security stakeholders and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization.
Process & Governance: Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices.

Required Skills/Knowledge
Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms.
Splunk Subject Matter Expert (SME).
Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations.
Experience supporting security use cases, encryption visibility, secrets management, and compliance logging.
Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement, and Process & Governance.
Experience with Splunk Premium Apps - ITSI and Enterprise Security (ES) at a minimum.
Experience with data streaming platforms and tools like Cribl and Splunk Edge Processor.
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align.
Strong communication, leadership, and stakeholder management skills.
Familiarity with security, risk, and compliance standards relevant to BFSI.
Proven experience leading product development teams and managing cross-functional initiatives using Agile methods.
Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud.
Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking.
Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm.
Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration.
Develop new applications leveraging Splunk's analytics and Machine Learning tools to maximize performance, availability and security, improving business insight and operations.
Support senior engineers in analyzing system issues and performing root cause analysis (RCA).
Desired Skills/Knowledge Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g. AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage . Awareness of data classification, retention, and masking/anonymization strategies. Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty) Experience with Version Control tools – Git, Bitbucket Eligibility Criteria Bachelor's degree with Minimum of 6+ years of experience in Technology ,or in lieu of a degree 8+ years of Experience in Technology Minimum of 3+ years of experience in leading development team or equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus. WORK TIMINGS : 01:00 PM to 10:00 PM IST This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details. For Internal Applicants Understand the criteria or mandatory skills required for the role, before applying Inform your manager and HRM before applying for any role on Workday Ensure that your professional profile is updated (fields such as education, prior experience, other skills) and it is mandatory to upload your updated resume (Word or PDF format) Must not be any corrective action plan (First Formal/Final Formal, PIP) L9+ Employees who have completed 18 months in the organization and 12 months in current role and level are only eligible. L09+ Employees can apply. Level / Grade : 11 Job Family Group Information Technology
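As a small illustration of the kind of automation this platform team builds, the sketch below submits a one-shot search to Splunk's REST search endpoint using the requests library; the host, service account, index, and SPL query are all hypothetical.

    import requests

    SPLUNK = "https://splunk.example.com:8089"      # hypothetical search head
    AUTH = ("svc_observability", "change-me")        # hypothetical service account

    # exec_mode=oneshot runs the search synchronously and returns results in one response.
    resp = requests.post(
        f"{SPLUNK}/services/search/jobs",
        auth=AUTH,
        data={
            "search": "search index=app_logs sourcetype=access_combined status>=500 "
                      "| stats count by host",
            "exec_mode": "oneshot",
            "output_mode": "json",
            "earliest_time": "-15m",
        },
        verify=False,  # assumes a self-signed cert in a lab; use a proper CA bundle in production
    )
    resp.raise_for_status()
    for row in resp.json().get("results", []):
        print(row.get("host"), row.get("count"))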

Posted 2 weeks ago

Apply

1.0 - 31.0 years

3 - 3 Lacs

Andheri East, Mumbai/Bombay

On-site

We are looking for a proactive and detail-oriented PL/SQL Developer with around 1 year of hands-on experience in database development using Oracle PL/SQL. The ideal candidate will be responsible for developing, testing, and maintaining database components and stored procedures for our applications and data workflows. Develop and maintain PL/SQL packages, procedures, functions, triggers, and views. Write efficient, reusable, and well-documented PL/SQL code to support business logic and reporting needs. Debug and optimize SQL queries to improve performance and ensure scalability. Collaborate with front-end/backend developers to design and integrate database components. Assist in data migration, transformation, and cleansing tasks. Participate in unit testing and provide support during QA/UAT phases. Understand business requirements and translate them into technical specifications. Perform basic performance tuning and troubleshooting of database issues. Proficiency in Oracle PL/SQL development (1 year). Good understanding of RDBMS concepts. Experience with SQL writing, joins, subqueries, indexes, and data types. Basic understanding of database performance tuning techniques. Familiarity with tools like SQL Developer, Toad, or DBeaver. Basic knowledge of version control systems (e.g., Git) is a plus. Strong analytical and problem-solving skills. Good verbal and written communication.
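Much of this work is exercised from application scripts; a minimal sketch, assuming the python-oracledb driver and a hypothetical update_order_status procedure with an OUT parameter, shows how such a stored procedure is typically invoked.

    import oracledb

    # Hypothetical connection details.
    conn = oracledb.connect(user="app", password="secret", dsn="dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Call a (hypothetical) procedure:
    #   PROCEDURE update_order_status(p_order_id IN NUMBER,
    #                                 p_status IN VARCHAR2,
    #                                 p_rows_updated OUT NUMBER)
    rows_updated = cur.var(int)
    cur.callproc("update_order_status", [1001, "SHIPPED", rows_updated])
    print("rows updated:", rows_updated.getvalue())

    conn.commit()
    cur.close()
    conn.close()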

Posted 2 weeks ago

Apply

0 years

1 - 2 Lacs

India

On-site

Position Overview: An ASP.NET developer is responsible for designing and monitoring applications. The responsibilities include writing code, designing solutions for applications, and improving functionality.

Roles and Responsibilities:
● Creating top-quality applications.
● Designing and building application layers.
● Ensuring a required level of performance.
● Writing test-backed server-side code.
● Analysing requirements and designing new functionality.
● Supporting and fixing existing functionality.
● Selecting and using the most appropriate and efficient technologies for a particular application.

Requirements and Qualifications:
● A keen eye for detail.
● Strong communication skills.
● Extensive working knowledge of coding using .NET languages (C#, VB.NET).
● Familiarity with the .NET framework.
● Experience in HTML, CSS, JavaScript, AJAX, and jQuery.
● Experience in Web API is preferred.
● Knowledge of Bootstrap.
● Sound knowledge of SQL Server.
● Good understanding of multi-tier architecture application development.
● Database design including indexes and data integrity, if an RDBMS is required.
● Understanding of HTML, JS, and CSS.
● Qualifications: M.Tech, B.Tech, MCA, BCA.

Note: The job is subject to the condition that the candidate either lives within a 15 km radius of the office or commits to relocating within this distance.

Job Types: Full-time, Permanent
Pay: ₹15,000.00 - ₹20,000.00 per month
Benefits: Health insurance, Leave encashment, Provident Fund
Schedule: Day shift, Morning shift
Work Location: In person

Posted 2 weeks ago

Apply

3.5 years

6 - 7 Lacs

Chennai

Remote

Job Title: PL/SQL Developer
Location: Chennai, OMR
CTC: 6.5 to 7 LPA
Interview Mode: 1) Virtual interview 2) Telephonic interview 3) Face-to-face interview
Gender: Male
Mandatory Skillset: ETL, data migration, SQL queries, Oracle database tool – Toad / SQL Developer

Key Responsibilities
Develop, test, and maintain complex PL/SQL packages, procedures, functions, and triggers for data processing and ETL tasks.
Design and implement database schemas and objects such as tables, indexes, and views.
Analyze business requirements and translate them into technical solutions using PL/SQL.
Optimize SQL queries and database performance for high efficiency.
Perform data analysis to support report generation and modify existing reports as needed.
Develop migration scripts for data transfer between systems.
Ensure compliance with security standards to protect sensitive data.
Provide technical support for production systems, including troubleshooting and resolving issues.
Document technical specifications and create reusable code for scalability.

Required Skills
Technical Skills:
Proficiency in Oracle PL/SQL programming with experience in developing stored procedures, functions, and triggers.
Strong understanding of relational database concepts (RDBMS) and performance tuning techniques.
Experience with ETL processes and data warehouse integration.
Knowledge of advanced PL/SQL features like collections, ref cursors, dynamic SQL, and materialized views.
Familiarity with tools like SQL Developer, Toad, or similar IDEs.
Exposure to Unix/Linux scripting is a plus.
Soft Skills:
Strong analytical and problem-solving abilities.
Excellent communication skills to interact with stakeholders and team members effectively.
Attention to detail with a focus on accuracy in coding and testing.
Ability to work both independently and in a team environment.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience (3.5 to 4+ years) in Oracle PL/SQL development.

Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹700,000.00 per year
Benefits: Paid sick time, Paid time off, Work from home
Schedule: Day shift
Experience: PL/SQL: 3 years (Required); Data migration: 2 years (Required); ETL: 2 years (Required)
Location: Chennai, Tamil Nadu (Required)
Work Location: In person

Posted 2 weeks ago

Apply

2.0 years

3 - 6 Lacs

Erode

On-site

Position Overview: We are seeking a skilled Database Administrator with DevOps expertise to manage, optimize, and maintain our database systems while contributing to our cloud infrastructure on AWS or Azure. The ideal candidate will have over 2 years of experience in database management (SQL) and server management (AWS), ensuring seamless integration, high availability, and scalability.

Key Responsibilities:
Manage, maintain, and optimize database systems, ensuring data integrity and performance.
Implement and manage database backup, recovery, and replication strategies.
Automate deployment, scaling, and monitoring of database solutions.
Monitor database performance and troubleshoot issues to ensure minimal downtime.
Work with cloud platforms (AWS) to design, implement, and maintain infrastructure.
Optimize SQL queries, indexes, and database schema for performance and efficiency.
Implement security measures to protect data against unauthorized access and breaches.
Maintain comprehensive documentation for database architecture, processes, and best practices.
Collaborate with development and operations teams to align database strategies with project goals.

Required Skills and Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
2+ years of experience in database management with SQL.
Proficiency in AWS server management.
Hands-on experience with database tools and technologies (e.g., MySQL, PostgreSQL, SQL Server).
Knowledge of DevOps tools like Docker, Kubernetes, Jenkins, and Terraform.
Strong understanding of database design, normalization, and performance tuning.
Experience with monitoring and logging tools (e.g., CloudWatch, Prometheus, Grafana).
Excellent problem-solving and analytical skills.
Strong communication and teamwork abilities.

Preferred Skills:
Certification in AWS, Azure, or Database Management.
Experience with NoSQL databases like MongoDB or DynamoDB.
Familiarity with CI/CD pipelines and Infrastructure as Code (IaC).

Job Types: Full-time, Permanent
Pay: ₹300,000.00 - ₹600,000.00 per year
Benefits: Health insurance, Leave encashment, Paid sick time, Paid time off, Provident Fund
Schedule: Day shift, Fixed shift
Application Question(s):
Can you relocate to Erode for work?
Are you in a support role?
Do you have the ability to create an AWS environment from scratch?
How many years of experience do you have?
Education: Bachelor's (Preferred)
License/Certification: Do you have an AWS certificate? (Preferred)
Work Location: In person
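For the AWS side of the role, a minimal backup-automation sketch using boto3 is shown below; the region, DB instance identifier, and naming scheme are hypothetical.

    import boto3
    from datetime import datetime, timezone

    # Hypothetical region and DB instance name.
    rds = boto3.client("rds", region_name="ap-south-1")
    instance_id = "prod-mysql-01"

    # Take a manual snapshot with a timestamped identifier, e.g. before a risky schema change.
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    snapshot_id = f"{instance_id}-manual-{stamp}"
    rds.create_db_snapshot(DBSnapshotIdentifier=snapshot_id, DBInstanceIdentifier=instance_id)

    # Block until the snapshot is available before proceeding with the change window.
    rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier=snapshot_id)
    print("snapshot ready:", snapshot_id)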

Posted 2 weeks ago

Apply

8.0 years

6 - 6 Lacs

Noida

On-site

R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve patients' experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better.

R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing and inclusion and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to 'make healthcare work better for all' by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.

We are looking for a Senior Database Engineer to administer and maintain NoSQL and relational SQL Server/MySQL databases. The candidate will be part of the team providing operations support on multiple NoSQL clusters running on Azure and will be responsible for installing, configuring, monitoring, designing, implementing and supporting our mission-critical MongoDB, Cosmos, Couchbase and SQL Server environments. The ideal candidate should be a fast learner, eager, passionate about automating development and production environments, and enjoy the challenge of working in a highly distributed and dynamic hybrid-cloud environment. As part of a service-oriented team, the role will require the individual to collaborate effectively with other internal engineering teams to gather requirements and deliver on various database platforms. There will be plenty of opportunities for developing your skills, as we look to improve constantly with the latest technologies.

Essential Responsibilities
Create, administer, monitor, and maintain multiple Elasticsearch, MongoDB, Cosmos and Couchbase environments.
Work with development teams to design and implement optimized NoSQL databases.
Implement relational databases, tables, and table changes.
Support application development for problem solving and performance tuning.
Assist in administering, monitoring, and maintaining SQL Server environments, including for disaster recovery.
Work on new and existing logical/physical database designs for applications and infrastructure.
Provide after-hours support for database emergencies, routine scheduled maintenance, and database server patching.
Work closely with the business and engineering teams to understand and plan for storage and database needs.
Handle implementation, configuration, maintenance, and performance of SQL Server RDBMS systems, to ensure the availability and operational readiness (security, health, and performance) of our corporate applications in the cloud (managing cloud infrastructure related to SQL Data Services in Azure).
Assist app dev teams with complex query tuning and schema refinement.
Utilize various tools to evaluate performance and implement remedies to improve performance, including tuning database parameters and SQL statements.

Required Qualifications
8+ years of experience working in Database, Data Management, or Engineering roles.
6+ years of progressive experience in high-volume/high-transaction data administration, with at least 3 years working with Microsoft Azure Cloud technologies.
6+ years of experience managing NoSQL databases such as Couchbase, MongoDB, CosmosDB.
2+ years of experience in Elasticsearch.
6+ years of experience in performance tuning and database monitoring, utilizing techniques such as query analysis, indexes, statistics, and execution plans.
Prior experience working with large (2 TB+) transactional databases and across a large environment with hundreds to thousands of databases in scope.

Desired Technical Skills
Ability to troubleshoot performance issues with NoSQL databases (Elasticsearch, MongoDB, Cosmos and Couchbase).
Accurately recommend configuration changes for optimal performance of NoSQL databases (Elasticsearch, MongoDB, Cosmos and Couchbase).
Experience in the design, testing, implementation, maintenance, and control of the organization's NoSQL databases across multiple platforms, technologies (for example physical, relational and object oriented) and computing environments.
Ability to develop queries to extract information based on compounded search criteria.
Strong expertise with relational databases (Microsoft SQL Server; MySQL is a plus) with enhanced troubleshooting and performance tuning skills.
Fundamental proficiency in data modeling in practical applications of a moderate nature.
Firm understanding of the most prominent Azure database technologies such as Azure SQL Database and Azure SQL Managed Instance.
Backup, restore, secure, scale, monitor and tune an Azure SQL Database.
Experience translating environments into Azure Managed Instance and other Azure technologies will be given a strong preference.

Nice to Haves
Certifications in Azure/SQL Server/NoSQL.
Experience with Postgres and MySQL is a big plus but not mandatory.
Knowledge of SQL monitoring tools (SolarWinds DPA, Redgate, etc.).
ServiceNow and Azure DevOps experience.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit: r1rcm.com

Visit us on Facebook

Posted 2 weeks ago

Apply

0 years

8 - 20 Lacs

Noida

On-site

JOB TITLE: DB2 DBA
LOCATION: Noida
SKILL: DB2 DBA
WORK TIMING: Rotational Shifts

JOB DESCRIPTION
A. Database Object Creation and Maintenance: 1. Create and alter various DB2 objects such as tablespaces, tables, stored procedures, indexes, etc. 2. Knowledge and expertise in DB2 utility execution.
B. Data Load and Migration: 1. Knowledge and understanding of the data migration process from VSAM to DB2. 2. Unload data from DB2 tables using the UNLOAD utility. 3. Load data from files to tables using the LOAD utility. 4. Unload data from an image copy using UNLOAD or other utilities. 5. Migrate data from one environment to another.
C. Database Backup and Recovery: 1. Set up and maintain image copy and other DB2 housekeeping jobs. 2. Able to recover data from an image copy using the RECOVER utility.
D. Database Access and Security Control: 1. Follow the security and audit control procedures to GRANT/REVOKE permissions on database objects. 2. Understand and provide group-level access on DB2 objects to RACF user groups.
E. Support to Application Team: 1. Monitor and troubleshoot batch job failures. 2. Handle and resolve all DB2 package-related consistency token issues. 3. Assist the application team in resolving performance issues for SQLs and/or stored procedures.
F. Performance Monitoring and Tuning: 1. Monitor current active and historical threads using a DB2 monitoring tool. 2. Identify potential bottlenecks and lock contention in the active or historical threads and troubleshoot them. 3. Check overall mainframe CPU utilization using SYSVIEW online panels. 4. SQL tuning by rewriting SQLs or designing appropriate indexes.

Job Type: Full-time
Pay: ₹811,840.25 - ₹2,059,289.75 per year
Benefits: Paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Supplemental Pay: Yearly bonus
Work Location: In person

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description and Requirements

Job Responsibilities
Manages design, distribution, performance, replication, security, availability, and access requirements for large and complex IBM DB2/LUW databases from version 10.1 to 11.5/12.1 on AIX and RedHat Linux.
Designs and develops physical layers of databases to support various application needs; implements backup, recovery, archiving, conversion strategies, and performance tuning; manages job scheduling, application release, database change and compliance.
Participates in the design, implementation and maintenance of automated infrastructure solutions using Infrastructure as Code tools like Ansible, Elastic and Terraform.
Participates in the development and management of Azure DevOps CI/CD pipelines to automate infrastructure deployments using Ansible and Elastic.
Cross-platform database migration and upgrades.
Identifies and resolves problems utilizing structured tools and techniques.
Provides technical assistance and mentoring to staff in all aspects of database management; consults and advises application development teams on database security, query optimization and performance.
Writes scripts for automating DBA routine tasks and documents database maintenance processing flows per standards.

Education, Technical Skills & Other Critical Requirements
Education: Bachelor's degree in Computer Science, Information Systems, or another related field with 7+ years of IT and Infrastructure engineering work experience.
Experience (in years): 7+ years total IT experience & 4+ years relevant experience in UDB databases.

Technical Skills
4+ years of related work experience with database design, installation, configuration and implementation; knowledge of all key IBM DB2/LUW utilities such as HADR, reorg, runstats, and load on Linux/Unix/Windows.
3+ years Unix and Linux operating systems and 2+ years shell scripting.
Extensive experience in database upgrades and patching.
Working experience in cloud computing (Azure, AWS RDS, IBM Cloud Pak).
Experience administering IBM Informix databases is a big plus.
Working knowledge of backup and recovery utilities like Rubrik and Networker.
Management of database elements, including creation, alteration, deletion and copying of schemas, databases, tables, views, indexes, stored procedures, triggers, and declarative integrity constraints.
Working knowledge of IBM DB2 LUW replication (DB2 SQL replication and Q Replication, a queue-based replication) as well as using third-party tools for replication.
Working knowledge of DB2 tools: explain plan, DB2 reorg, DB2 runstats.
Knowledge of data security (user access, groups and roles).
Should have the ability to work closely with IBM PMR to resolve any ongoing production issues.
Knowledge of ITSM processes including Change, Incident, Problem, and Service Management using ServiceNow tools.
Strong database analytical skills to improve application and database performance.
Understanding of modern IT infrastructure such as cloud architecture as well as the Agile DevOps framework.

Other Critical Requirements
Automation tools and programming such as Ansible, shell scripting and MS PowerShell.
Database monitoring with observability tools (Elastic).
Intermediate certification for IBM certified administrator (11.1+) is preferable Experience managing geographically distributed and culturally diverse workgroups with strong team management, leadership and coaching skills Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders. Project management experience in creating and delivering Business presentations. Demonstrate ability to work independently and in a team environment Ability to work 24*7 rotational shift to support for production, development, and test databases About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible . Join us!

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

The Role: In this role, you will be part of the development of enterprise and web applications within the Search Services team. The candidate needs to be a solid individual contributor with experience in agile development who has worked on complex architectures such as cloud, load-balanced systems, RESTful APIs, search engines, NoSQL databases, and high-performing systems. An ideal candidate can create scalable, flexible technical solutions, understand and support existing systems, study their enterprise complexities, and develop state-of-the-art systems with modern software development practices. The candidate needs to have good grasping abilities to pick up new technologies and frameworks.

Responsibilities:
Design & develop web and enterprise solutions to be flexible, scalable & extensible using Java/J2EE in an AWS cloud environment.
Good working experience in OO analysis & design using common design patterns.
Enforce good agile practices like test-driven development, continuous integration and improvement.
Implement enhancements to improve the reliability, performance, and usability of our applications.
Motivation to learn the innovative trade of programming, debugging and deploying.
Self-starter, with excellent self-study skills and growth aspirations.
A good team player with the ability to meet tight deadlines in a fast-paced environment.
Excellent written and verbal communication skills. Flexible attitude; perform under pressure.

Requirements: These are the most important skills, qualities, etc. that we'd like for this role.
Hands-on in Java / web services / Spring / Spring Boot.
Very strong knowledge of databases and hands-on MS SQL/MySQL/PostgreSQL and NoSQL DBs; understanding of OpenSearch is a big plus.
Good understanding of object-oriented design, design patterns, and enterprise integration patterns.
Experience with troubleshooting and debugging techniques.
Hands-on experience with AWS services.
Has done development or debugging on Linux/Unix platforms.
Ability to work independently and as part of a team.
Experience with DevOps practices and tools.
Minimum 2 years of experience in software development or a related field.

Good to Have:
Machine Learning knowledge.
Exposure to the Capital Markets domain preferred (indexes, equity, etc.).
Knowledge of RabbitMQ and Kafka.

Morningstar is an equal opportunity employer.

Morningstar's hybrid work environment gives you the opportunity to work remotely and collaborate in-person each week. We've found that we're at our best when we're purposely together on a regular basis, at least three days each week. A range of other benefits are also available to enhance flexibility as needs change. No matter where you are, you'll have tools and resources to engage meaningfully with your global colleagues.

I10_MstarIndiaPvtLtd Morningstar India Private Ltd. (Delhi) Legal Entity

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Company: The healthcare industry is the next great frontier of opportunity for software development, and Health Catalyst is one of the most dynamic and influential companies in this space. We are working on solving national-level healthcare problems, and this is your chance to improve the lives of millions of people, including your family and friends. Health Catalyst is a fast-growing company that values smart, hardworking, and humble individuals. Each product team is a small, mission-critical team focused on developing innovative tools to support Catalyst's mission to improve healthcare performance, cost, and quality.

POSITION OVERVIEW: We are looking for a highly skilled Senior Database Engineer & Storage Expert with 5+ years of hands-on experience in managing and optimizing large-scale, high-throughput database systems. The ideal candidate will possess deep expertise in handling complex ingestion pipelines across multiple data stores and a strong understanding of distributed database architecture. The candidate will play a critical technical leadership role in ensuring our data systems are robust, performant, and scalable to support massive datasets ingested from various sources without bottlenecks. You will work closely with data engineers, platform engineers, and infrastructure teams to continuously improve database performance and reliability.

KEY RESPONSIBILITIES: • Query Optimization: Design, write, debug and optimize complex queries for RDS (MySQL/PostgreSQL), MongoDB, Elasticsearch, and Cassandra. • Large-Scale Ingestion: Configure databases to handle high-throughput data ingestion efficiently. • Database Tuning: Optimize database configurations (e.g., memory allocation, connection pooling, indexing) to support large-scale operations. • Schema and Index Design: Develop schemas and indexes to ensure efficient storage and retrieval of large datasets. • Monitoring and Troubleshooting: Analyze and resolve issues such as slow ingestion rates, replication delays, and performance bottlenecks. • Performance Debugging: Analyze and troubleshoot database slowdowns by investigating query execution plans, logs, and metrics. • Log Analysis: Use database logs to diagnose and resolve issues related to query performance, replication, and ingestion bottlenecks. • Data Partitioning and Sharding: Implement partitioning, sharding, and other distributed database techniques to improve scalability. • Batch and Real-Time Processing: Optimize ingestion pipelines for both batch and real-time workloads. • Collaboration: Partner with data engineers and Kafka experts to design and maintain robust ingestion pipelines. • Stay Updated: Stay up to date with the latest advancements in database technologies and recommend improvements.

REQUIRED SKILLS AND QUALIFICATIONS: • Database Expertise: Proven experience with MySQL/PostgreSQL (RDS), MongoDB, Elasticsearch, and Cassandra. • High-Volume Operations: Proven experience in configuring and managing databases for large-scale data ingestions. • Performance Tuning: Hands-on experience with query optimization, indexing strategies, and execution plan analysis for large datasets. • Database Internals: Strong understanding of replication, partitioning, sharding, and caching mechanisms. • Data Modeling: Ability to design schemas and data models tailored for high throughput use cases. • Programming Skills: Proficiency in at least one programming language (e.g., Python, Java, Go) for building data pipelines.
• Debugging Proficiency: Strong ability to debug slowdowns by analyzing database logs, query execution plans, and system metrics. • Log Analysis Tools: Familiarity with database log formats and tools for parsing and analyzing logs. • Monitoring Tools: Experience with monitoring tools such as AWS CloudWatch, Prometheus, and Grafana to track ingestion performance. • Problem-Solving: Analytical skills to diagnose and resolve ingestion-related issues effectively. PREFERRED QUALIFICATIONS: • Certification in any of the mentioned database technologies. • Hands-on experience with cloud platforms such as AWS (preferred), Azure, or GCP. • Knowledge of distributed systems and large-scale data processing. • Familiarity with cloud-based database solutions and infrastructure. • Familiarity with large scale data ingestion tools like Kafka, Spark or Flink. EDUCATIONAL REQUIREMENTS: • Bachelor’s degree in computer science, Information Technology, or a related field. Equivalent work experience will also be considered
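To make the batch-ingestion concern concrete, a minimal sketch (assuming psycopg2 and a hypothetical events table) that loads rows in pages rather than one INSERT per row is shown below.

    import psycopg2
    from psycopg2.extras import execute_values

    # Hypothetical DSN and table; the point is the batching pattern, not the schema.
    conn = psycopg2.connect("dbname=ingest user=etl host=localhost")

    def load_events(rows, page_size=1000):
        # Insert (event_id, payload, created_at) tuples in pages to cut round-trips.
        with conn, conn.cursor() as cur:
            execute_values(
                cur,
                "INSERT INTO events (event_id, payload, created_at) VALUES %s",
                rows,
                page_size=page_size,
            )

    # Usage: stream batches from the upstream pipeline instead of row-at-a-time inserts.
    load_events([(1, '{"k": "v"}', "2024-01-01T00:00:00Z"),
                 (2, '{"k": "w"}', "2024-01-01T00:00:01Z")])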

Posted 2 weeks ago

Apply

5.0 years

15 - 25 Lacs

Mumbai Metropolitan Region

On-site

Data Engineer – On-Site, India Industry: Enterprise Data Analytics & Digital Transformation Consulting. We architect and operationalize large-scale data platforms that power BI, AI, and advanced reporting for global clients across finance, retail, and manufacturing. Leveraging modern cloud services and proven ETL frameworks, our teams turn raw data into trusted, analytics-ready assets that accelerate business decisions. Role & Responsibilities Design, build, and optimize end-to-end ETL pipelines that ingest, cleanse, and transform high-volume datasets using SQL and ELT best practices. Create scalable data models and dimensional schemas to support reporting, dashboarding, and machine-learning use-cases. Develop and maintain batch and near-real-time workflows in Airflow or similar orchestration tools, ensuring fault tolerance and SLA compliance. Collaborate with analysts, data scientists, and product owners to translate business requirements into performant data solutions. Implement rigorous data quality checks, lineage tracking, and metadata management to guarantee trust and auditability. Tune queries, indexes, and storage partitions for cost-efficient execution across on-premise and cloud data warehouses. Skills & Qualifications Must-Have 5+ years hands-on experience as a Data Engineer or similar. Advanced SQL proficiency for complex joins, window functions, and performance tuning. Proven expertise in building ETL/ELT pipelines with tools such as Informatica, Talend, or custom Python. Solid understanding of dimensional modeling, star/snowflake schemas, and data-vault concepts. Experience with workflow orchestration (Airflow, Luigi, or equivalent) and version control (Git). Strong grasp of data quality frameworks and error-handling strategies. Preferred Exposure to cloud platforms (AWS Redshift, Azure Synapse, or Google BigQuery). Knowledge of containerization and CI/CD pipelines for data workloads. Familiarity with streaming technologies (Kafka, Kinesis) and real-time ETL patterns. Working knowledge of BI tools (Tableau, Power BI) and their data connectivity. Benefits & Culture Highlights Work with high-calibre data practitioners and cutting-edge cloud tech. Merit-driven growth path, certification sponsorships, and continuous learning stipends. Inclusive, innovation-first culture that rewards problem-solving and ownership. Skills: kafka,data warehouse,containerization,airflow,elt,luigi,error-handling strategies,git,aws redshift,talend,star schema,power bi,informatica,data vault,ci/cd,azure synapse,etl,sql,kinesis,performance tuning,data modeling,data quality frameworks,python,dimensional modeling,snowflake schema,tableau,google bigquery
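As a sketch of the orchestration piece, a minimal Airflow DAG chaining extract, transform, and load tasks is shown below; it assumes Airflow 2.4+ (for the schedule parameter), and the task bodies and DAG name are hypothetical placeholders.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Pull raw records from the source system (placeholder).
        return [{"id": 1, "amount": "42.50"}]

    def transform(ti, **context):
        # Cleanse and type-cast the extracted rows (placeholder).
        rows = ti.xcom_pull(task_ids="extract")
        return [{**r, "amount": float(r["amount"])} for r in rows]

    def load(ti, **context):
        # Write analytics-ready rows to the warehouse (placeholder).
        print(ti.xcom_pull(task_ids="transform"))

    with DAG(
        dag_id="daily_sales_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load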

Posted 2 weeks ago

Apply

0.0 - 2.0 years

6 - 7 Lacs

Chennai, Tamil Nadu

Remote

Job Title: PL/SQL Developer
Location: Chennai, OMR
CTC: 6.5 to 7 LPA
Interview Mode: 1) Virtual interview 2) Telephonic interview 3) Face-to-face interview
Gender: Male
Mandatory Skillset: ETL, data migration, SQL queries, Oracle database tool – Toad / SQL Developer

Key Responsibilities
Develop, test, and maintain complex PL/SQL packages, procedures, functions, and triggers for data processing and ETL tasks.
Design and implement database schemas and objects such as tables, indexes, and views.
Analyze business requirements and translate them into technical solutions using PL/SQL.
Optimize SQL queries and database performance for high efficiency.
Perform data analysis to support report generation and modify existing reports as needed.
Develop migration scripts for data transfer between systems.
Ensure compliance with security standards to protect sensitive data.
Provide technical support for production systems, including troubleshooting and resolving issues.
Document technical specifications and create reusable code for scalability.

Required Skills
Technical Skills:
Proficiency in Oracle PL/SQL programming with experience in developing stored procedures, functions, and triggers.
Strong understanding of relational database concepts (RDBMS) and performance tuning techniques.
Experience with ETL processes and data warehouse integration.
Knowledge of advanced PL/SQL features like collections, ref cursors, dynamic SQL, and materialized views.
Familiarity with tools like SQL Developer, Toad, or similar IDEs.
Exposure to Unix/Linux scripting is a plus.
Soft Skills:
Strong analytical and problem-solving abilities.
Excellent communication skills to interact with stakeholders and team members effectively.
Attention to detail with a focus on accuracy in coding and testing.
Ability to work both independently and in a team environment.

Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Proven experience (3.5 to 4+ years) in Oracle PL/SQL development.

Job Types: Full-time, Permanent
Pay: ₹600,000.00 - ₹700,000.00 per year
Benefits: Paid sick time, Paid time off, Work from home
Schedule: Day shift
Experience: PL/SQL: 3 years (Required); Data migration: 2 years (Required); ETL: 2 years (Required)
Location: Chennai, Tamil Nadu (Required)
Work Location: In person

Posted 2 weeks ago

Apply

0.0 - 3.0 years

4 - 10 Lacs

Tardeo, Mumbai, Maharashtra

On-site

Role: Senior Dot Net Developer Job type: Full time. Role type: Technical. Location: Mumbai Mid-level resource with 3+ years of experience. Extensive experience with Microsoft technologies including .NET, ASP.NET Core MVC, C#, MS SQL Server, WPF, WCF, ASP.NET, XML, XSL, and scripting languages including jQuery/JavaScript and HTML. Working knowledge of ASP.NET Core 2 MVC is an added advantage. Good to have knowledge of SQL Server 2012, indexing and queries, and SSIS/SSRS. Has implemented Ajax controls in C#/.NET projects. Complete understanding of MS SQL databases. Data modelling to visualize database structure. Good understanding of reviewing query performance and optimizing code. Designing and coding database tables to store the application’s data. Creating database triggers, stored procedures & functions. Creating table indexes to improve database performance. Has experience in writing unit tests & performing unit tests on own code. About Andesoft Consulting: Andesoft is a boutique interactive services shop strategically combining business analytics and design. The primary domain expertise covers web architecture, CMS, and CRM technologies; market and business analytics to achieve better market segmentation and campaign management; and custom offline and online interactive applications. Some of the business verticals we specialize in include Health Care, Financial Services, and Public and Non-profit Sectors. Company Profile: http://www.andesoftconsulting.com Qualification & Experience: ● Engineering Graduate or Post Graduate. ● BS degree in Information Technology, Computer Science or equivalent ● 3 Years of Professional Experience. Job Types: Full-time, Permanent Pay: ₹400,000.00 - ₹1,000,000.00 per year Location Type: In-person Schedule: Day shift Ability to commute/relocate: Tardeo, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Required) Experience: relevant work: 3 years (Required) Work Location: In person Expected Start Date: 10/07/2025
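As a brief, hedged illustration of the index-creation responsibility mentioned above, a minimal SQL Server sketch; dbo.Orders and its columns are hypothetical example names, not the employer's schema:

-- A nonclustered, covering index to support frequent lookups by CustomerId
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId)
    INCLUDE (OrderDate, TotalAmount);   -- included columns avoid key lookups

-- This query can now be served by an index seek instead of a full table scan
SELECT OrderDate, TotalAmount
FROM dbo.Orders
WHERE CustomerId = 42;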

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Requirements Description and Requirements Education, Technical Skills & Other Critical Requirements Education Bachelor’s degree in computer science, Information Systems, or another related field with 7+ years of IT and infrastructure engineering work experience. Experience (In Years) 7+ years total IT experience & 4+ years relevant experience in SQL Server + Sybase databases Technical Skills Database Management: Proficient in managing and administering SQL Server, Azure SQL Server, and Sybase databases, ensuring high availability and optimal performance. Data Infrastructure & Security: Expertise in designing and implementing robust data infrastructure solutions, with a strong focus on data security and compliance. Backup & Recovery: Skilled in developing and executing comprehensive backup and recovery strategies to safeguard critical data and ensure business continuity. Performance Tuning & Optimization: Adept at performance tuning and optimization of databases, leveraging advanced techniques to enhance system efficiency and reduce latency. Cloud Computing & Scripting: Experienced in cloud computing environments and proficient in operating system scripting, enabling seamless integration and automation of database operations. Management of database elements, including creation, alteration, deletion and copying of schemas, databases, tables, views, indexes, stored procedures, triggers, and declarative integrity constraints. Strong database analytical skills to improve application performance. Strong knowledge of ITSM processes and tools (ServiceNow). Ability to work 24x7 rotational shifts to support the database and Splunk platforms. Other Critical Requirements Automation tools and programming such as Ansible and Python Excellent analytical and problem-solving skills Experience managing geographically distributed and culturally diverse workgroups with strong team management, leadership and coaching skills Excellent written and oral communication skills, including the ability to clearly communicate/articulate technical and functional issues with conclusions and recommendations to stakeholders. Prior experience in handling stateside and offshore stakeholders Experience in creating and delivering business presentations. Demonstrated ability to work independently and in a team environment About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
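Since the posting highlights backup and recovery strategies, here is a minimal, hedged T-SQL sketch; the database name and file paths (SalesDb, D:\Backups\...) are hypothetical placeholders, and a real strategy would also cover scheduling, retention, and verification:

-- Full backup plus a transaction log backup (hypothetical database and paths)
BACKUP DATABASE SalesDb
    TO DISK = N'D:\Backups\SalesDb_full.bak'
    WITH COMPRESSION, CHECKSUM;

BACKUP LOG SalesDb
    TO DISK = N'D:\Backups\SalesDb_log.trn';

-- Restore sequence: full backup left in NORECOVERY, then the log, then recover
RESTORE DATABASE SalesDb
    FROM DISK = N'D:\Backups\SalesDb_full.bak'
    WITH NORECOVERY, REPLACE;

RESTORE LOG SalesDb
    FROM DISK = N'D:\Backups\SalesDb_log.trn'
    WITH RECOVERY;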

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Azure Data Engineer Experience Range: 5 to 10 years Location: Pune Skills: Azure Data Engineering, C#, Fabric, Python, SQL, Synapse/ADF. Should have good hands-on experience building ETL flows using any ETL tool such as ADF. Should be able to understand the requirement and design the data flow diagram of the ETL process end to end. Should have good hands-on experience writing complex SQL queries and advanced SQL concepts such as indexes, partitions, filegroups, and transactions. Should be able to understand the business requirement and develop end-to-end data pipelines using the required tools/technologies. Excellent troubleshooting and good communication skills with good attention to detail. Should have knowledge of designing optimized data processing based on the volume of data. Able to create documentation that clearly explains the purpose of the data flow and its intended use. Able to make regular modifications to existing production code for error correction and adding new features. Experience using Visual Studio and SQL Server Management Studio. Strong understanding of data warehousing concepts such as dimensions, facts, schemas, data loading processes, dimensional modeling, and data mining. Flexible to learn and adopt the tools/technologies used in the project.
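As a hedged illustration of the advanced SQL concepts listed above (indexes, partitions, filegroups, transactions), a minimal SQL Server-flavored sketch; all object names are hypothetical and the filegroups are assumed to already exist:

-- Range partitioning by year, mapped onto filegroups (hypothetical names)
CREATE PARTITION FUNCTION pf_sales_year (date)
    AS RANGE RIGHT FOR VALUES ('2023-01-01', '2024-01-01', '2025-01-01');

CREATE PARTITION SCHEME ps_sales_year
    AS PARTITION pf_sales_year
    TO (fg_2022, fg_2023, fg_2024, fg_2025);

CREATE TABLE dbo.FactSales (
    sale_id   bigint        NOT NULL,
    sale_date date          NOT NULL,
    amount    decimal(18,2) NOT NULL
) ON ps_sales_year (sale_date);

-- An explicit transaction keeps a multi-statement load atomic
BEGIN TRANSACTION;
    INSERT INTO dbo.FactSales (sale_id, sale_date, amount)
    SELECT sale_id, sale_date, amount FROM staging.Sales;
COMMIT TRANSACTION;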

Posted 2 weeks ago

Apply

0 years

2 - 9 Lacs

Gurgaon

On-site

About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About the role: Gartner is looking for a well-rounded and driven leader to become a part of its Conferences Technology & Insight Analytics team, which is tasked with creating the reporting and analytics to support its Conference reporting operations. What you will do: Provide technical leadership and guidance to software development teams, guaranteeing alignment with project objectives and adherence to industry best practices. Leading and mentoring a team of software engineers, delegating responsibilities, offering support, and promoting a collaborative environment. Collaborate with business stakeholders to design and build advanced analytic solutions for the Gartner Conference Technology business. Execution of our data strategy through design and development of data platforms to deliver Reporting, BI and Advanced Analytics solutions. Design and development of key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL & ADF on the Azure platform. Consistently improving and optimizing T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions utilizing Azure Data Factory. Analysing and solving complex business problems, breaking down the work into actionable tasks. Develop, maintain, and document the data dictionary and data flow diagrams. Responsible for building and enhancing the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams to support a fast-paced Agile/Scrum environment. Build optimized solutions and designs to handle Big Data. Follow coding standards, build appropriate unit tests, integration tests, deployment scripts and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies. What you will need Strong IT professional with high-end knowledge of designing and developing E2E BI & Analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability. Must have: Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance and tuning, troubleshooting and query optimization. Strong technical experience with Azure Data Factory on the Azure platform. Create and manage complex ETL pipelines to extract, transform, and load data from various sources using Azure Data Factory. Monitor and troubleshoot data pipeline issues to ensure data integrity and availability.
Enhance data workflows to improve performance, scalability, and cost-effectiveness. Establish best practices for data governance and security within data pipelines. Experience in cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, and data warehousing concepts and Data Lake. Ensure thorough documentation of data processes, configurations, and operational procedures. Good to Have: Experience working with dataset ingestion, data model creation, reports, and dashboards using Power BI. Experience with Python and Azure Functions for data processing. Experience in other reporting tools like SSRS, Tableau, Power BI, etc. Demonstrated ability to use Git, Jenkins and other change management tools. Good knowledge of database performance and tuning, troubleshooting and query optimization. Who you are: Graduate/Post-graduate in BE/BTech, ME/MTech or MCA is preferred. IT professional with 7-10 yrs of experience in data analytics, cloud technologies and ETL development. Excellent communication and prioritization skills. Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment. Strong desire to improve upon their skills in software development, frameworks, and technologies. Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. #LI-NS4 Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com . Job Requisition ID:101327 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
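Since the role calls out constructing hierarchical queries in T-SQL, here is a minimal, hedged sketch of a recursive CTE; dbo.Employees and its columns are hypothetical example names, not Gartner's schema:

-- Walks a reporting hierarchy from the top-level manager downwards
WITH OrgTree AS (
    SELECT employee_id, manager_id, employee_name, 0 AS depth
    FROM dbo.Employees
    WHERE manager_id IS NULL                 -- anchor member: top of the tree

    UNION ALL

    SELECT e.employee_id, e.manager_id, e.employee_name, t.depth + 1
    FROM dbo.Employees AS e
    JOIN OrgTree AS t ON e.manager_id = t.employee_id   -- recursive member
)
SELECT employee_id, employee_name, depth
FROM OrgTree
ORDER BY depth, employee_name;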

Posted 2 weeks ago

Apply

4.0 years

4 - 6 Lacs

Chennai

On-site

Company Profile: Founded in 1976, CGI is among the largest independent IT and business consulting services firms in the world. With 94,000 consultants and professionals across the globe, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI Fiscal 2024 reported revenue is CA$14.68 billion and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com. Position - SQL Server DBA with Postgres Knowledge Job Title: SQL Server Database Administrator (DBA) Position: Software Engineer Experience: 4-7 Years Category: Software Development/Engineering Main location: Mumbai, Chennai, Bangalore Employment Type: Full Time Job Description - We are seeking a skilled SQL Server Database Administrator (DBA) with hands-on experience in PostgreSQL to manage, maintain, and optimize our database infrastructure. The ideal candidate will play a critical role in ensuring the reliability, performance, and security of our SQL Server and PostgreSQL database environments, working closely with development and DevOps teams. Your future duties and responsibilities Key Responsibilities: Administer, monitor, and maintain Microsoft SQL Server and PostgreSQL databases across development, test, and production environments. Perform database performance tuning, query optimization, and capacity planning. Manage backups, restores, disaster recovery, and high availability strategies (e.g., Always On, Log Shipping, Replication, etc.). Support database migrations and conversions between SQL Server and PostgreSQL. Work with development teams to assist in the design of database schemas, indexes, and stored procedures. Automate regular DBA tasks using PowerShell, T-SQL, pgAdmin, or scripting languages. Monitor and analyze database performance using tools like SQL Server Profiler, Performance Monitor, and pg_stat_statements. Stay up to date with the latest database technologies, security patches, and industry best practices. Required qualifications to be successful in this role Required Qualifications: 4+ years of hands-on experience with SQL Server. 1+ years of experience working with PostgreSQL in a production environment. Strong proficiency in T-SQL and PL/pgSQL. Experience with database migration tools and techniques (e.g., AWS DMS, ora2pg, pgloader). Familiarity with cloud platforms like Azure, AWS, or GCP and managed database services (e.g., Amazon RDS, Azure SQL, Cloud SQL). Solid understanding of indexing, partitioning, and query execution plans. Preferred Skills: Experience with Linux and Windows Server environments. Knowledge of CI/CD pipelines and integration of database changes in DevOps workflows. Familiarity with monitoring tools. Certification(s) in Microsoft SQL Server or PostgreSQL. Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value.
You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
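As a hedged illustration of the query-plan and monitoring skills the PostgreSQL side of this role mentions, a minimal sketch; the orders table is a hypothetical example, and the pg_stat_statements column names vary by PostgreSQL version (total_exec_time and mean_exec_time are the PostgreSQL 13+ names):

-- Inspect the actual execution plan and buffer usage of a slow query
EXPLAIN (ANALYZE, BUFFERS)
SELECT *
FROM   orders
WHERE  customer_id = 42
ORDER  BY created_at DESC
LIMIT  20;

-- With the pg_stat_statements extension enabled, list the heaviest statements
SELECT query, calls, total_exec_time, mean_exec_time
FROM   pg_stat_statements
ORDER  BY total_exec_time DESC
LIMIT  10;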

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About Gartner IT Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About The Role Gartner is looking for a well-rounded and driven leader to become a part of its Conferences Technology & Insight Analytics team, which is tasked with creating the reporting and analytics to support its Conference reporting operations. What You Will Do Provide technical leadership and guidance to software development teams, guaranteeing alignment with project objectives and adherence to industry best practices. Leading and mentoring a team of software engineers, delegating responsibilities, offering support, and promoting a collaborative environment. Collaborate with business stakeholders to design and build advanced analytic solutions for the Gartner Conference Technology business. Execution of our data strategy through design and development of data platforms to deliver Reporting, BI and Advanced Analytics solutions. Design and development of key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL & ADF on the Azure platform. Consistently improving and optimizing T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions utilizing Azure Data Factory. Analysing and solving complex business problems, breaking down the work into actionable tasks. Develop, maintain, and document the data dictionary and data flow diagrams. Responsible for building and enhancing the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams to support a fast-paced Agile/Scrum environment. Build optimized solutions and designs to handle Big Data. Follow coding standards, build appropriate unit tests, integration tests, deployment scripts and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies. What You Will Need Strong IT professional with high-end knowledge of designing and developing E2E BI & Analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability. Must Have Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance and tuning, troubleshooting and query optimization. Strong technical experience with Azure Data Factory on the Azure platform. Create and manage complex ETL pipelines to extract, transform, and load data from various sources using Azure Data Factory. Monitor and troubleshoot data pipeline issues to ensure data integrity and availability.
Enhance data workflows to improve performance, scalability, and cost-effectiveness. Establish best practices for data governance and security within data pipelines. Experience in cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, and data warehousing concepts and Data Lake. Ensure thorough documentation of data processes, configurations, and operational procedures. Good To Have Experience working with dataset ingestion, data model creation, reports, and dashboards using Power BI. Experience with Python and Azure Functions for data processing. Experience in other reporting tools like SSRS, Tableau, Power BI, etc. Demonstrated ability to use Git, Jenkins and other change management tools. Good knowledge of database performance and tuning, troubleshooting and query optimization. Who You Are Graduate/Post-graduate in BE/BTech, ME/MTech or MCA is preferred. IT professional with 7-10 yrs of experience in data analytics, cloud technologies and ETL development. Excellent communication and prioritization skills. Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment. Strong desire to improve upon their skills in software development, frameworks, and technologies. Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles. Who are we? At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here. What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work. What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com . Job Requisition ID:101327 By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
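To complement the ETL and data-integration duties above, here is a minimal, hedged T-SQL sketch of an upsert from a staging table; dbo.DimAttendee and staging.Attendee are hypothetical example names, not Gartner's actual tables:

-- Upsert staged rows into a reporting dimension table
MERGE dbo.DimAttendee AS tgt
USING staging.Attendee AS src
    ON tgt.attendee_id = src.attendee_id
WHEN MATCHED THEN
    UPDATE SET tgt.full_name  = src.full_name,
               tgt.company    = src.company,
               tgt.updated_at = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (attendee_id, full_name, company, updated_at)
    VALUES (src.attendee_id, src.full_name, src.company, SYSUTCDATETIME());

A pattern like this is commonly scheduled as the final step of an ADF pipeline run, though the actual load strategy would depend on data volumes and concurrency requirements.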

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

Remote

Role: Lead Quantitative Analyst, Quantitative Research The Group: Morningstar’s Quantitative Research Group is an integral part of the Data & Analytics Team with the aim of creating independent investment research and data-driven analytics designed to help investors and Morningstar achieve better outcomes by making better decisions. We utilize statistical rigor and large data sets to inform the methodologies we develop. Our research encompasses hundreds of thousands of securities across a broad range of asset classes, including equities, fixed income, structured credit, and funds. Morningstar is one of the largest independent sources of fund, equity, and credit data and research in the world, and our advocacy for investors’ interests is the foundation of our company. The Role: As a Lead Quantitative Analyst, you will work in the Calc Services and Portfolio Analytics Group, which is part of the Data & Analytics team dedicated to researching data-intensive products for the investment management industry. Most of the research is integrated into Morningstar’s core products (Direct, Advisor Workstation, etc.) and teams such as Morningstar Investment Management, Morningstar Indexes, Morningstar Credit Ratings, PitchBook, Sustainalytics, etc. The ideal candidate will blend financial knowledge, investment and portfolio construction expertise, quantitative modeling skills, and operational know-how. This position reports to the Manager of Quantitative Research. Responsibilities: Support methodology development, quantitative model builds, and enhancements for core quantitative products and calculations such as Risk Model, Asset Flows Forecast, Quant Ratings, Equity Style Box, Portfolio Construction, etc. Independently lead key projects with limited hand-holding, taking full ownership of their delivery. Drive independent research and publish research papers in asset allocation analysis, portfolio optimization, risk models, ESG, fund flows, etc., using principles of modern portfolio theory and statistics. Leverage new structured and unstructured datasets to build new quantitative frameworks that will help investors make informed decisions. Highly organized and efficient, with the ability to multi-task and meet tight deadlines. Ensure compliance with regulatory and company policies and procedures. Participate in client conversations to understand ongoing investor issues while increasing the reach of Morningstar's quantitative offerings. Requirements: 5+ years of relevant investment/quantitative research experience with an emphasis on quantitative finance, mutual fund analysis, asset allocation, and/or portfolio construction. CFA, FRM, CQF, or postgraduate degrees in finance, economics, mathematics, or statistics are preferable. Good experience in developing finance/statistics-based applications using proven technologies such as R, Python, and PySpark, and comfort in using Jupyter Notebooks. Understanding of both business and technical requirements, and the ability to serve as a conduit between product, research, technology, and external clients. Knowledge of statistical models (e.g., regression, forecasting, optimization, Monte Carlo simulations, etc.). Experience developing financial engineering/statistical applications on the cloud (AWS). Morningstar is an equal opportunity employer. Morningstar’s hybrid work environment gives you the opportunity to work remotely and collaborate in-person each week.
We’ve found that we’re at our best when we’re purposely together on a regular basis, at least three days each week. A range of other benefits are also available to enhance flexibility as needs change. No matter where you are, you’ll have tools and resources to engage meaningfully with your global colleagues. Legal Entity: Morningstar India Private Ltd. (Delhi)

Posted 2 weeks ago

Apply

0 years

25 - 30 Lacs

Pune, Maharashtra, India

On-site

Senior Database Developer Experience: 8-10 yrs Location: Pune Work Model: 5 days Work from Office Responsibilities Detailed JD: Design and implement the database and data modelling. Creating indexes, views, complex triggers, effective functions, and appropriate stored procedures to facilitate efficient data manipulation and data consistency. Designing databases, writing sequences, jobs, and complex and dynamic queries in SQL, MySQL, and Oracle. Should have a good grip on keys, constraints, indexes, joins, CTEs, partitioning, ROW_NUMBER() and window functions, temporary tables, UDTs, types of UNION, materialized views, etc. Strong analytical and problem-solving skills. Database normalization & de-normalization. Good command of writing complex logic. Troubleshoot, optimize, and tune SQL processes and complex SQL queries. Should know how to write unit test cases. Understanding of database transactions and states. Design, implement and monitor queries and stored procedures and their performance. Able to debug programs and integrate the applications with third-party web services. Identify opportunities for improved performance in SQL operations and implementations. Understanding of database backup, restore & maintenance. Should have knowledge about the development, testing, UAT and production environments and their databases. Knowledge of working with SSRS and SSIS. Experience in SSAS will be a plus. Good communication skills. Skills: normalization/de-normalization, stored procedures, data modeling, SSAS, SQL, database backup, tuning SQL, performance tuning, SSRS, triggers, functions, unit testing, SQL development, de-normalization, database maintenance, SSIS, database design, database normalization, views, indexes, MySQL, Oracle PL/SQL, Oracle, partitioning, query optimization
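As a hedged illustration of the trigger work described above, a minimal SQL Server-flavored sketch; dbo.Orders, dbo.OrdersAudit, and their columns are hypothetical example names, and equivalent triggers in MySQL or Oracle would use slightly different syntax:

-- AFTER UPDATE trigger that records status changes in an audit table
CREATE TRIGGER trg_orders_audit
ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.OrdersAudit (order_id, old_status, new_status, changed_at)
    SELECT d.order_id, d.status, i.status, SYSUTCDATETIME()
    FROM deleted AS d
    JOIN inserted AS i ON i.order_id = d.order_id
    WHERE ISNULL(d.status, '') <> ISNULL(i.status, '');  -- log real changes only
END;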

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies