3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Join our “Finance – Procure to Pay Team” at DHL Global Forwarding, Freight (DGFF) GSC – Global Service Centre!
Job Title: Associate – Finance (P2P)
Job Grade: N
Job Location: Chennai

Are you dynamic and results-oriented with a passion for logistics? Join our high-performing Global Shared Services Team (GSC) at DHL Global Forwarding, Freight (DGFF); a Great Place to Work certified organization and one of the “Top 20 most admired Shared Services Organizations in 2022” by the independent global Shared Services & Outsourcing Network (SSON). We are the captive Shared Service Provider for DHL Global Forwarding and DHL Freight (DGFF). We are an organization of more than 4,600 colleagues complemented by approximately 500 virtual FTE (i.e., bots applied in process automation). Our colleagues are based across six service delivery centers in Mumbai, Chennai, Chengdu, Manila, Bogota & Budapest. You will interact with people from all over the world and get the chance to work in a truly international organization.

In this role, you will have the opportunity to deliver exceptional service within the Finance – Procure to Pay (P2P) service line, supporting our DGFF regions and countries globally. The role will involve training to handle various activities including invoice processing, payment processing, query management, scanning and indexing, and managing month-end close activities.

Key Responsibilities: Understand the requirements of the station's/country's documentation and ensure jobs are executed as per standard operating procedures. Ensure department SLAs and all Key Performance Indicators are met as per the agreed delivery guidelines. Deliver a high level of service quality through timely and accurate completion of services. Collaborate with colleagues within the business to identify solutions, best practices, and opportunities to improve the service to our business partners. Flag any challenges in operations to the immediate supervisor and business partner in a timely manner. Coordinate with the relevant stakeholders for regular communication and flow of information as defined for the respective service.

Required Skills/Abilities: Bachelor's degree; a degree in logistics, industrial engineering, or management will be an advantage. 0–3 years of experience in the BPO or logistics domain (preferred). Good knowledge of MS Office. Effective English communication skills, written and verbal. Exposure to working with Enterprise Resource Planning (ERP) systems. Detail oriented, with good logical reasoning skills and a high level of customer centricity.

Apply now and embark on an exciting journey with us! We offer: We recognize and reward your hard work through competitive compensation and performance-based incentives. We empower you to learn and grow through training that gives you the knowledge, skills, and abilities to develop into your role, and a great range of resources to support your future career aspirations and personal development. Flexible work arrangements to support work/life balance. Generous paid time off: Privilege (earned leave). Comprehensive medical insurance coverage, including voluntary parental cover (applicable for IN only). Recognition & engagement culture. By joining one of the world's leading logistics companies, you have a chance to explore a wide range of interesting job challenges and opportunities across our GSC service lines and in our different divisions around the globe.
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Hey There 👋

At Saleshandy, we're building the Cold Email Outreach platform of the future: a product that eliminates manual processes and helps companies generate more replies, book more meetings, and generate leads faster. Since our founding in 2016, we've grown to become a profitable, 100% geographically dispersed team of 65+ high-performing, happy people who are dedicated to building a product that our customers love.

What’s the Role About? Ever wondered how Saleshandy schedules millions of emails and still feels lightning-fast? Behind that magic is performance engineering. We’re hiring a Performance Engineer who thrives on making systems faster, leaner, and more reliable across backend, frontend, and infrastructure. Your mission: eliminate latency, fix CPU/memory bottlenecks, optimize queries, tame queues, and guide teams to build with performance in mind. This isn't just about fire-fighting; it's about owning speed as a product feature. You’ll work across the stack and use deep diagnostics, smart tooling, and system intuition to make things fly.

Why Join Us? Purpose: Your work will directly impact page speeds, email throughput, and scale. At Saleshandy, performance isn't a luxury; it's part of our premium promise. Growth: You’ll operate across multiple teams and tech layers (Node.js, MySQL, Redis, React, Kafka, ClickHouse, AWS), with the freedom to shape how we build fast systems. Motivation: If you’ve ever celebrated shaving 500ms off a page load, or chased a memory leak across 3 services just for fun, this is your home. We celebrate engineers who care about P99s, flamegraphs, and cache hits.

Your Main Goals:
Identify and Eliminate Backend Bottlenecks (within 90 days): Run deep diagnostics using Clinic.js, heap snapshots, GC logs, and flamegraphs. Tackle high CPU/memory usage, event loop stalls, and async call inefficiencies in Node.js. Goal: Cut backend P95 response times by 30–40% for key APIs.
Optimize MySQL Query Performance & Configuration (within 60 days): Use slow query logs, EXPLAIN, Percona Toolkit, and indexing strategies to tune queries and schema. Tune server-level configs like innodb_buffer_pool_size. Target: Eliminate the top 10 slow queries and reduce DB CPU usage by 25%.
Improve Frontend Performance & Load Time (within 90 days): Audit key frontend flows using Lighthouse, Core Web Vitals, and asset audits. Drive improvements via lazy loading, tree-shaking, and code splitting. Goal: Get homepage and dashboard load times under 1.5s for 95% of users.
Make Infra & Monitoring Observability-First (within 120 days): Set up meaningful alerts and dashboards using Grafana, Loki, Tempo, and Prometheus. Lead infra-level debugging: thread stalls, IO throttling, network latency. Goal: Reduce time-to-detect and time-to-resolve for perf issues by 50%.

Important Tasks:
First 30 Days – System Performance Audit: Do a full audit of backend, DB, infra, and frontend performance. Identify critical pain points and quick wins.
Debug a Live Performance Incident: Catch and resolve a real-world performance regression. It could be a Node.js memory leak, a slow MySQL join, or Redis job congestion. Share a full RCA and fix.
Create and Share Performance Playbooks (by Day 45): Build SOPs for slow query debugging, frontend perf checks, Redis TTL fixes, or Node.js memory leaks. Turn performance tuning into a team sport.
Guide Teams on Performance-Aware Development (within 90 days): Create internal micro-trainings or async reviews to help devs write faster APIs, reduce DB load, and spot regressions earlier.
Use AI or Smart Tooling in Diagnostics: Try out tools like Copilot for test coverage, or use AI-powered observability tools (e.g., Datadog AI, Loki queries, etc.) to accelerate diagnostics.
Build Flamegraph/Profiling Baselines: Set up and maintain performance profiling baselines (using Clinic.js, 0x, etc.) so regressions can be caught before they ship.
Review Queues and Caching Layer: Identify performance issues in Redis queues (retries, TTL delays, locking) and tune caching strategies across app and DB.
Contribute to Performance Culture: Encourage tracking of real metrics: TTI, DB query time, API P95s. Collaborate with product and engineering to define what “fast enough” means.

Experience Level: 3–5 years
Tech Stack: Node.js, MySQL, Redis, Grafana, Prometheus, Clinic.js, Percona Toolkit

Culture Fit – Are You One of Us? We're a fast-moving, globally distributed SaaS team where speed matters not just in product, but in how we work. We believe in ownership, system thinking, and real accountability. If you like solving hard problems, value simplicity, and hate regressions, you’ll thrive here.
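For illustration alongside this listing: a minimal sketch of the kind of EXPLAIN-driven query triage the MySQL goals above describe, assuming a local MySQL instance and the mysql-connector-python package; the connection details, table names, and queries are hypothetical.

```python
# Minimal sketch: flag candidate slow queries whose EXPLAIN plan shows a full
# table scan. Assumes mysql-connector-python and a reachable MySQL instance;
# the credentials, schema, and queries below are hypothetical.
import mysql.connector

CANDIDATE_QUERIES = [
    "SELECT * FROM email_events WHERE campaign_id = 42",
    "SELECT recipient, status FROM sends WHERE sent_at > NOW() - INTERVAL 1 DAY",
]

conn = mysql.connector.connect(
    host="127.0.0.1", user="perf", password="secret", database="outreach"
)
cur = conn.cursor(dictionary=True)

for sql in CANDIDATE_QUERIES:
    cur.execute("EXPLAIN " + sql)
    for row in cur.fetchall():
        # type 'ALL' means a full table scan; a missing 'key' means no index
        # was chosen -- both are signals to add or adjust an index.
        if row.get("type") == "ALL" or row.get("key") is None:
            print(f"Review: {sql}\n  -> scans ~{row.get('rows')} rows, key={row.get('key')}")

cur.close()
conn.close()
```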
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Specialist, Software Development Engineering

What does a successful Database Developer - Professional do at Fiserv? A successful Database Developer - Professional at Fiserv designs databases, optimizes database performance, and mentors junior developers, demonstrating deep expertise in SQL, PL/SQL, and database design principles.

What Will You Do: Develop, maintain, and optimize PL/SQL code for efficient database operations. Design and implement database solutions to support business requirements. Troubleshoot and debug complex database issues. Collaborate with cross-functional teams to gather requirements and provide technical guidance. Perform code reviews and ensure adherence to coding standards. Optimize database performance through query optimization and indexing strategies. Create and maintain technical documentation for developed solutions.

What Will You Need To Have: Bachelor's degree in Computer Science, Information Technology, or a related field. 5 years of hands-on experience in SQL and PL/SQL development. Proficiency in Oracle database technologies. Strong understanding of database design principles and normalization. Work on security scans and pipelines for code deployment. Excellent problem-solving and analytical skills. Ability to work independently and collaboratively in a team environment. Effective communication skills to interact with stakeholders at all levels.

What Would Be Great To Have: PowerBuilder knowledge. Oracle certification (e.g., Oracle PL/SQL Developer Certified Associate). Experience with performance tuning and optimization techniques. Knowledge of other programming languages such as Shell Scripting, Java, Python/Perl, or C/C++. Familiarity with Agile development methodologies. Expertise in version control systems (e.g., Git) and CI/CD tools (GitHub Actions, Harness, etc.).

Thank You For Considering Employment With Fiserv. Please: Apply using your legal name. Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
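As a small, hedged illustration of the PL/SQL development work this role centres on: a sketch of invoking a stored procedure and an anonymous block from Python, assuming the python-oracledb driver; the procedure name, table, and connection details are hypothetical.

```python
# Minimal sketch of exercising PL/SQL from Python, assuming python-oracledb
# and a hypothetical procedure
# process_settlement(p_batch_id IN NUMBER, p_rows_processed OUT NUMBER).
import oracledb

conn = oracledb.connect(user="app", password="secret", dsn="localhost/FREEPDB1")
cur = conn.cursor()

rows_processed = cur.var(int)            # bind variable for the OUT parameter
cur.callproc("process_settlement", [1001, rows_processed])
print("rows processed:", rows_processed.getvalue())

# An anonymous block works the same way when there is no named procedure yet.
cur.execute(
    """
    BEGIN
        UPDATE payments SET status = 'POSTED' WHERE batch_id = :batch_id;
    END;
    """,
    batch_id=1001,
)
conn.commit()
cur.close()
conn.close()
```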
Posted 1 week ago
200.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: Driving excellence in client account services through strategic innovation and transformational change.

Job Summary: As a Reference Data Analyst within the Client Account Services team, you will support in-flight documentation remediation projects across various products. You will work with client agreements and related documentation, ensuring they meet specific requirements and are accurately maintained. Your role involves contributing to a cohesive team, providing regular updates, and applying a control mindset to all tasks.

Job Responsibilities: Support Client Account Services activities for documentation remediation projects. Work across several products and remediation projects, focusing on regulatory and risk-related issues. Perform functions across reference data setups, including searching, scanning, and indexing legal documents. Maintain client records and ensure documents meet specific outlined requirements. Contribute to the wider team and provide regular progress updates. Approach work with a control mindset and demonstrate an understanding of policies and procedures. Maintain an understanding of client documents and their requirements.

Required Qualifications, Skills, And Capabilities: Strong verbal and written communication skills. Good team player and self-motivated. Strong analytical skills. Proven skills in time management, organization, and attention to detail.

Preferred Qualifications, Skills, And Capabilities: Desire to work in a fast-paced environment with multiple deliverables. Proficiency in the Microsoft Office suite of applications. Experience in working with client agreements and documentation. Ability to apply a control mindset to tasks and projects. Experience with Reference Data, client documentation, and JPM systems is preferred.

ABOUT US: JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team: J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.
Posted 1 week ago
5.0 - 10.0 years
4 - 7 Lacs
Hyderabad
Work from Office
JD:
Capability to conduct and coordinate SAP PPM/PS-specific design workshops with various business stakeholders.
Capability to drive system-requirement discussions and conclude key decisions to help and support SAP PPM/PS design.
Capability to configure SAP PPM/PS modules with functionality/features such as: characteristics of Portfolio Items and Initiatives; Portfolio and Classification hierarchies with Financial and Capacity planning views; Portfolio, Initiative, and Item related field configuration and custom enhancements; Project and Investment profile configuration; Network profile and relevant configuration; CATS profile configuration based on various scenarios (internal and external resource timesheets); Progress Analysis configuration; Settlement profiles and strategies.
Capability to identify WRICEF items with respect to SAP PPM/PS in a timely manner, prepare functional specifications (with good insight into SAP-provided BADIs/BAPIs/user exits, etc., in PPM/PS), and coordinate development objects with the technical team.
Testing of SAP PPM/PS configuration and integration testing with Finance/SCM/EAM/HR.
Capability to coordinate cutover activities (guidance on TR sequencing, red book entries, and preparing and executing LSMW).
Posted 1 week ago
6.0 - 11.0 years
7 - 11 Lacs
Gurugram
Hybrid
Skills: Oracle Database, Postgres, Database design
Secondary Skills: Data modelling, Performance Tuning, ETL processes, Automating Backup and Purging Processes

Skill Justification:
Database Designing, Data Modelling, and Core Component Implementation: These are fundamental skills for a DBA. Database designing involves creating the structure of the database, data modelling is about defining how data is stored, accessed, and related, and core component implementation ensures that the database is set up correctly and efficiently.
Data Integration and Relational Data Modelling: Data integration is crucial for combining data from different sources into a unified view, which is essential for accurate reporting and analysis. Relational data modelling helps in organizing data into tables and defining relationships, which is a core aspect of managing relational databases.
Optimization and Performance Tuning: Optimization and performance tuning are critical for ensuring that the database runs efficiently. This involves analyzing and improving query performance, indexing strategies, and resource allocation to prevent bottlenecks and ensure smooth operation.
Automating Backup and Purging Processes: Automating backup and purging processes is vital for data integrity and storage management. Regular backups protect against data loss, while purging old or unnecessary data helps maintain database performance and manage storage costs.
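To illustrate the "Automating Backup and Purging Processes" skill listed above, here is a minimal sketch, assuming pg_dump is available on PATH and credentials come from the environment or a .pgpass file; the paths, database name, and retention window are hypothetical.

```python
# Minimal backup-and-purge automation sketch for PostgreSQL. Assumes pg_dump
# is installed and authentication is handled via .pgpass or PGPASSWORD; the
# directory, database, and retention period are hypothetical.
import subprocess
import time
from datetime import datetime
from pathlib import Path

BACKUP_DIR = Path("/var/backups/pg")
RETENTION_DAYS = 14
DB_NAME = "appdb"

def take_backup() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / f"{DB_NAME}_{datetime.now():%Y%m%d_%H%M%S}.dump"
    # -Fc = custom format, which supports pg_restore and per-object restores.
    subprocess.run(
        ["pg_dump", "-h", "localhost", "-U", "backup_user", "-Fc", "-f", str(target), DB_NAME],
        check=True,
    )
    return target

def purge_old_backups() -> None:
    cutoff = time.time() - RETENTION_DAYS * 86400
    for dump in BACKUP_DIR.glob("*.dump"):
        if dump.stat().st_mtime < cutoff:
            dump.unlink()   # purge anything older than the retention window

if __name__ == "__main__":
    print("wrote", take_backup())
    purge_old_backups()
```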
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
SQL DEVELOPER: Design and implement relational database structures optimized for performance and scalability. Develop and maintain complex SQL queries, stored procedures, triggers, and functions. Optimize database performance through indexing, query tuning, and regular maintenance. Ensure data integrity, consistency, and security across multiple environments. Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools. Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation. Document database architecture, processes, and procedures for future reference. Stay updated with the latest SQL best practices and database technologies.

Data Retrieval: SQL Developers must be able to query large and complex databases to extract relevant data for analysis or reporting.
Data Transformation: They often clean, join, and reshape data using SQL to prepare it for downstream processes like analytics or machine learning.
Performance Optimization: Writing queries that run efficiently is key, especially when dealing with big data or real-time systems.
Understanding of Database Schemas: Knowing how tables relate and how to navigate normalized or denormalized structures is essential.

Minimum Experience: 5 years
Mandatory Skills: SQL, ETL, Data Retrieval, Data Transformation, Performance Optimization, Machine Learning, Schema, Data Integration, PyTest, Custom Python/SQL scripts, Python or another scripting language for test automation, Cloud Platform, Data Warehousing solutions.

Share your resume at aarushi.shukla@coforge.com if you are a core SQL Developer and an immediate joiner.
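As a compact illustration of the data retrieval and transformation duties above, here is a minimal extract-transform-load sketch using only Python's standard-library sqlite3 module; the staging table, sample rows, and cleaning rules are hypothetical.

```python
# Minimal ETL sketch: extract from a raw staging table, clean and cast the
# values, and load them into a typed target table. Uses stdlib sqlite3 so it
# runs anywhere; all names and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a raw staging table, as it might arrive from an upstream feed.
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, amount TEXT, region TEXT)")
cur.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, " 120.50 ", "north"), (2, "80", "NORTH"), (3, None, "south")],
)

# Transform + load: trim, cast, standardise case, and drop unusable rows.
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
cur.execute(
    """
    INSERT INTO orders (order_id, amount, region)
    SELECT order_id, CAST(TRIM(amount) AS REAL), UPPER(region)
    FROM staging_orders
    WHERE amount IS NOT NULL
    """
)
conn.commit()

print(cur.execute("SELECT * FROM orders ORDER BY order_id").fetchall())
conn.close()
```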
Posted 1 week ago
8.0 years
15 - 22 Lacs
India
Remote
Job Description For PostgreSQL Lead Job Title: PostgreSQL Lead Company: Mydbops About Us As a seasoned industry leader for 8 years in open-source database management, we specialise in providing unparalleled solutions and services for MySQL, MariaDB, MongoDB, PostgreSQL, TiDB, Cassandra, and more. At Mydbops, we are committed to providing exceptional service and building lasting relationships with our customers. Our Customer Account Management team is vital in ensuring client satisfaction and loyalty. Role Overview As the PostgreSQL Lead , you will own the design, implementation, and operational excellence of PostgreSQL environments. You’ll lead technical decision-making, mentor the team, interface with customers, and drive key initiatives covering performance tuning, HA architectures, migrations, and cloud deployments. Key Responsibilities Lead PostgreSQL production environments: architecture, stability, performance, and scalability Oversee complex troubleshooting, query optimization, and performance analysis Architect and maintain HA/DR systems (e.g., Streaming Replication, Patroni, repmgr) Define backup, recovery, replication, and failover protocols Guide DB migrations, patches, and upgrades across environments Collaborate with DevOps and cloud teams for infrastructure automation Use monitoring (pg_stat_statements, PMM, Nagios or any monitoring stack) to proactively resolve issues Provide technical mentorship—conduct peer reviews, upskill, and onboard junior DBAs Lead customer interactions: understand requirements, design solutions, and present proposals Drive process improvements and establish database best practices Requirements Experience: 4-5 years in PostgreSQL administration, with at least 2+ years in a leadership role Performance Optimization: Expert in query tuning, indexing strategies, partitioning, and execution plan analysis. Extension Management: Proficient with critical PostgreSQL extensions including: pg_stat_statements – query performance tracking pg_partman – partition maintenance pg_repack – online table reorganization uuid-ossp – UUID generation pg_cron – native job scheduling auto_explain – capturing costly queries Backup & Recovery: Deep experience with pgBackRest, Barman, and implementing Point-in-Time Recovery (PITR). High Availability & Clustering: Proven expertise in configuring and managing HA environments using Patroni, repmgr, and streaming replication. Cloud Platforms: Strong operational knowledge of AWS RDS and Aurora PostgreSQL, including parameter tuning, snapshot management, and performance insights. Scripting & Automation: Skilled in Linux system administration, with advanced scripting capabilities in Bash and Python. Monitoring & Observability: Familiar with pg_stat_statements, PMM, Nagios, and building custom dashboards using Grafana and Prometheus. Leadership & Collaboration: Strong problem-solving skills, effective communication with stakeholders, and experience leading database reliability and automation initiatives. Preferred Qualifications Bachelor’s/Master’s degree in CS, Engineering, or equivalent PostgreSQL certifications (e.g., EDB, AWS) Consulting/service delivery experience in managed services or support roles Experience in large-scale migrations and modernization projects Exposure to multi-cloud environments and DBaaS platforms What We Offer Competitive salary and benefits package. Opportunity to work with a dynamic and innovative team. Professional growth and development opportunities. Collaborative and inclusive work environment. 
Job Details Work time: General shift Working days: 5 Days Mode of Employment - Work From Home Experience - 4-5 years Skills:- PostgreSQL, Linux/Unix, MongoDB and Amazon Web Services (AWS)
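A minimal sketch of the pg_stat_statements-based triage this role mentions, assuming psycopg2 and PostgreSQL 13 or newer (earlier versions name the timing columns total_time/mean_time); the DSN is hypothetical.

```python
# Minimal sketch: list the ten queries consuming the most total execution
# time from pg_stat_statements. Assumes the extension is installed and the
# connection details below are hypothetical.
import psycopg2

conn = psycopg2.connect("host=localhost dbname=appdb user=dba password=secret")
cur = conn.cursor()

cur.execute(
    """
    SELECT query, calls, total_exec_time, mean_exec_time
    FROM pg_stat_statements
    ORDER BY total_exec_time DESC
    LIMIT 10
    """
)
print(f"{'calls':>8} {'total ms':>12} {'mean ms':>10}  query")
for query, calls, total_ms, mean_ms in cur.fetchall():
    print(f"{calls:>8} {total_ms:>12.1f} {mean_ms:>10.2f}  {query[:80]}")

cur.close()
conn.close()
```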
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
For one of my customers, we are looking for a Senior Data Engineer.
Location: India (100% remote)
Duration: 6+ months (renewable)

Description: We are looking for a Senior Data Engineer with strong experience in SQL, Talend, and cloud platforms such as Google Cloud (BigQuery) and Microsoft Azure. You will be responsible for designing and managing ETL pipelines, optimizing SQL performance, and building cloud-based data solutions.

Deliverables: Develop and optimize complex SQL queries, stored procedures, and indexing strategies for large datasets. Design and maintain ETL/ELT data pipelines using Talend, integrating data from multiple sources. Architect and optimize data storage solutions on GCP BigQuery and Azure SQL/Synapse Analytics. Implement best practices for data governance, security, and compliance in cloud environments. Work closely with data analysts, scientists, and business teams to deliver scalable solutions. Monitor, troubleshoot, and improve data pipeline performance and reliability. Automate data workflows and scheduling using orchestration tools (e.g., Apache Airflow, Azure Data Factory). Lead code reviews, mentoring, and best practices for junior engineers.

Required Qualifications: 7+ years of hands-on experience in SQL development, database performance tuning, and ETL processes. Expert-level proficiency in SQL, including query optimization, stored procedures, indexing, and partitioning. Strong experience with Talend for ETL/ELT development. Hands-on experience with GCP BigQuery and Azure SQL/Synapse Analytics. Solid understanding of data modeling (relational and dimensional) and cloud-based data architectures. Proficiency in Python or Shell scripting for automation and workflow management. Familiarity with CI/CD, Git, and DevOps best practices for data engineering.

Nice to Have: Experience with Apache Airflow or Azure Data Factory for workflow automation. Knowledge of real-time data streaming (Kafka, Pub/Sub, Event Hubs). Cloud certifications in GCP or Azure (e.g., Google Professional Data Engineer, Azure Data Engineer Associate).
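To illustrate the workflow orchestration mentioned above, here is a minimal Apache Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are hypothetical placeholders for a real Talend or BigQuery load step.

```python
# Minimal Airflow 2.x DAG sketch: a daily two-step pipeline where the load
# task runs only after extraction succeeds. The callables only print, as
# placeholders for real extract/load logic; all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_warehouse(**context):
    # In a real pipeline this would trigger a Talend job or a BigQuery load;
    # here it only logs the logical date of the run.
    print("loading partition for", context["ds"])

with DAG(
    dag_id="daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=lambda: print("extract"))
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load   # enforce ordering: load depends on extract
```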
Posted 1 week ago
3.0 years
0 Lacs
Manesar, Haryana, India
On-site
Job Title: Sales & Marketing Executive Experience: 3+ Years CTC: ₹11–12 LPA Qualification: B.Tech (Mandatory) Job Summary: We are looking for a result-oriented Sales & Marketing professional with strong expertise in Product Costing, RFQ Management, RM/FE Indexing, Customer Claims, and Proposal Handling . The ideal candidate should have a solid understanding of budgeting and long-term sales planning , and be capable of managing customer relationships with a strategic and customer-centric approach. Key Responsibilities: Prepare & manage RFQs, customer proposals, and product costing Analyze RM/FE trends & implement cost optimization strategies Handle customer claims & coordinate for resolutions Support budget planning & long-term sales forecasting Collaborate with cross-functional teams for business growth Key Skills: Strong in RFQ & costing Knowledge of RM/FE indexing Excellent customer handling & communication skills Proficient in Excel, data analysis, and budgeting Kindly share CV on hrsuperconsultants@gmail.com
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Key Responsibilities: Design, develop, and maintain complex PL/SQL code and stored procedures to support business logic and data processing requirements. Develop and maintain data models for new and existing applications, ensuring scalability, flexibility, and integrity. Perform database performance tuning and query optimization to enhance system efficiency. Ensure data quality standards are met through robust validation, transformation, and cleansing processes. Administer and support Oracle databases, ensuring high availability, backup and recovery, and proactive monitoring. Provide database maintenance and support, including patching, upgrades, and troubleshooting production issues. Monitor system health and implement proactive alerts and diagnostics for performance and capacity management. Support and implement solutions using NoSQL databases (e.g., MongoDB, Cassandra) where appropriate for specific use cases. Collaborate with development teams to ensure code quality and best practices in PL/SQL development. Develop and maintain documentation related to data architecture, data flow, and database processes.

Required Skills and Qualifications: 8-12 years of experience in Oracle database development and administration, and MongoDB development and administration. Strong proficiency in PL/SQL, SQL, indexes, triggers, sequences, procedures, functions, and Oracle database internals. Proven experience with data modeling (logical and physical), normalization, and schema design. Expertise in performance tuning, query optimization, and indexing strategies. Hands-on experience with relational databases (Oracle, PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid understanding of database monitoring tools and tuning methodologies. Experience in data quality management practices and tools. Strong exposure to code quality standards and tools for PL/SQL development (e.g., TOAD, SQL Developer, SonarQube). Knowledge of backup and recovery strategies, high availability, and disaster recovery planning. Familiarity with DevOps practices, including database CI/CD integration. Familiarity with Liquibase for database versioning and deployment automation.

Preferred Skills: Experience with cloud-based databases (e.g., Oracle Cloud, AWS RDS, Azure SQL). Exposure to ETL tools and data integration platforms. Familiarity with regulatory compliance standards (e.g., GDPR, HIPAA) related to data management.

Soft Skills: Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Ability to work independently and as part of a cross-functional team. Strong attention to detail and commitment to data integrity.

Education: Bachelor’s degree/University degree or equivalent experience

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 1 week ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
DXFactor is a US-based tech company working with customers across the globe. We are a Great Place to Work certified company. We are looking for candidates for the Data Engineer role (4 to 6 years of experience).
We have our presence in: US and India (Ahmedabad, Bangalore)
Location: Ahmedabad
Website: www.DXFactor.com
Designation: Data Engineer (expertise in Snowflake, AWS & Python)

Key Responsibilities: Design, develop, and maintain scalable data pipelines for batch and streaming workflows. Implement robust ETL/ELT processes to extract data from various sources and load it into data warehouses. Build and optimize database schemas following best practices in normalization and indexing. Create and maintain documentation for data flows, pipelines, and processes. Collaborate with cross-functional teams to translate business requirements into technical solutions. Monitor and troubleshoot data pipelines to ensure optimal performance. Implement data quality checks and validation processes. Build and maintain CI/CD workflows for data engineering projects. Stay current with emerging technologies and recommend improvements to existing systems.

Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 4+ years of experience in data engineering roles. Strong proficiency in Python programming and SQL query writing. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Experience with data warehousing technologies (e.g., Snowflake, Redshift, BigQuery). Proven track record in building efficient and scalable data pipelines. Practical knowledge of batch and streaming data processing approaches. Experience implementing data validation, quality checks, and error handling mechanisms. Working experience with cloud platforms, particularly AWS (S3, EMR, Glue, Lambda, Redshift) and/or Azure (Data Factory, Databricks, HDInsight). Understanding of different data architectures including data lakes, data warehouses, and data mesh. Demonstrated ability to debug complex data flows and optimize underperforming pipelines. Strong documentation skills and ability to communicate technical concepts effectively.
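As a hedged illustration of the batch-load pattern described above (land a file in S3, then load it into Snowflake), assuming boto3 and snowflake-connector-python; the bucket, stage, table, and credentials are hypothetical.

```python
# Minimal batch ingestion sketch: upload an extracted file to S3, then run a
# COPY INTO against a Snowflake external stage that points at the bucket.
# Credentials are expected to come from the environment; all names are
# hypothetical.
import boto3
import snowflake.connector

# 1. Land the extracted file in the raw zone bucket.
s3 = boto3.client("s3")
s3.upload_file("/tmp/orders_2024_06_01.csv", "dxf-raw-zone", "orders/2024-06-01.csv")

# 2. Load it into Snowflake via the stage configured over that bucket.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="secret",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()
cur.execute(
    """
    COPY INTO raw.orders
    FROM @raw_zone_stage/orders/2024-06-01.csv
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """
)
cur.close()
conn.close()
```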
Posted 1 week ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
To manage, store, and standardize commissioning-related documentation on the Autodesk Construction Cloud and CxAlloy platforms and support the HO team in various administrative tasks.

Qualifications and Experience: Bachelor of Engineering (B.E) in any domain. Post Graduation/MBA preferred. Certified/proficient in handling Autodesk Construction Cloud. Proficiency in handling MS Excel. Aptitude to learn commissioning tools and software like CxAlloy. Minimum 4 years of experience.

Key Responsibilities of Role:
1. Centralized Document Repository Management: Create, organize, and maintain a centralized repository for all Testing & Commissioning (T&C) documentation across ACX projects. Ensure version control, proper indexing, and secure access protocols for all stored documents. Archive obsolete documents per ACX retention policies and ensure traceability for future reference.
2. Autodesk Construction Cloud (ACC) Proficiency & Aptitude to Learn Other Commissioning Software: Manage document workflows within ACC, including uploading, tagging, and submitting documents for review. Ensure documents are stored in the correct folders (e.g., Commissioning Folder) and follow the ACC submittal process tailored for ACX India projects. Collaborate with BCEI and third-party CxA teams to align ACC usage with global and local standards.
3. Commissioning Documentation Oversight: Understand and implement the ACX version of the BCEI Book of Rules for documentation, including naming conventions, cover sheets, and discipline-specific workflows (Electrical, Mechanical, Fire, Plumbing). Manage submittals such as Method of Statement (MoS), Inspection Test Plans (ITP), FAT/FWT scripts, Cx scripts (L2–L5), Energization Plans, QAQC Plans, and calibration certificates. Ensure all Cx documentation is reviewed and approved through the designated workflow involving all stakeholders.
4. Administrative Support: Assist in filing claims, booking travel tickets, and managing training budgets for the HO Testing & Commissioning team. Coordinate with internal stakeholders and external vendors to ensure timely execution of administrative tasks.
5. Compliance & Quality Assurance: Ensure all documentation complies with ACX Integrated Management System (IMS) procedures and quality standards. Support audits by maintaining accurate records and facilitating document retrieval for review.
6. Communication & Coordination: Liaise with consultants, contractors, and internal teams to ensure timely document submissions and approvals. Provide updates to stakeholders on document status, revisions, and access protocols.
7. Training & Process Improvement: Support onboarding and training of site teams on ACC workflows and documentation standards. Identify gaps in documentation practices and recommend improvements to enhance efficiency and compliance.
Posted 1 week ago
1.0 - 3.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
About This Role About BlackRock BlackRock is one of the world’s preeminent asset management firms and a premier provider of global investment management, risk management and advisory services to institutional, intermediary, and individual investors around the world. BlackRock offers a range of solutions — from rigorous fundamental and quantitative active management approaches aimed at maximizing outperformance to highly efficient indexing strategies designed to gain broad exposure to the world’s capital markets. Our clients can access our investment solutions through a variety of product structures, including individual and institutional separate accounts, mutual funds and other pooled investment vehicles, and the industry-leading iShares® ETFs. About BlackRock US Municipals Group The Municipal Fixed Income business in BlackRock’s Portfolio Management Group is one of the largest managers of municipal securities in the industry with over $180 billion of assets under management. The platform offers a broad array of investment choices including mandates dedicated to both taxable and tax-exempt bonds. BlackRock seeks an analyst to join a newly established Mumbai-based team working with dedicated Municipal Credit Research staff currently located in both Gurgaon and Princeton, New Jersey, USA. The analyst will have close daily interaction with a variety of senior investment professionals on the Municipals team. Position Overview The responsibilities of a Municipal Credit Research Analyst include: Acquire and manage financial and economic data sets to build out the firm’s municipal credit database and complete timely surveillance of existing bond positions Conduct fundamental credit analysis and provide written opinions on public municipal debt issuers across a variety of sectors – including US State and Local Governments, Utilities, Transportation, Healthcare, and Higher Education Establish strong connections and collaborations with analysts, portfolio managers, data scientists, and other internal BlackRock stakeholders Develop connections with rating agencies and sell-side traders/analysts focused on the municipal market Qualifications Undergraduate degree in Finance, Accounting, or a quantitative field Excellent written and oral communication skills 1-3 years fixed income market or credit research experience; knowledge of the municipal bond market preferred Ability to work during times that at least partially overlap with financial markets hours in eastern US (approximately 12:00-21:00 IST) Technical skills – SQL, Python, VBA, or other programming languages are preferred Demonstrated knowledge of Microsoft Office suite Progress towards CFA designation Key Competencies Proven experience working both independently and as part of a team in a highly collaborative, global environment Understanding of financial statements Solid communication and analytical skills Passion for financial markets Familiarity with statistical modeling, data visualizations, and project management is a plus We Are Looking For People Who Are Tech-Savvy: You want to find new ways to outsmart the problem Curious: You love learning, and you have a self-starting personality Passionate: You own the work you do Open: You value and respect input from others and want to be part of a team Experimental: You learn from mistakes Our Benefits To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents 
and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about. Our hybrid work model BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock. About BlackRock At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 1 week ago
7.0 years
0 Lacs
India
Remote
Senior Data Warehouse Engineer
Asian Hires, India (Remote)
POSITION REPORTS TO: Senior Manager, Application Development
DEPARTMENT: Information Technology
POSITION LOCATION: INDIA (REMOTE)
COMPANY BUDGET: 18,00,000 - 23,00,000

Job Summary: We are seeking a highly skilled Senior Data Warehouse Engineer to manage a single version of the truth of data, convert business entities into facts and dimensions, and integrate data from various sources. The role involves developing ETL processes, optimizing performance, and collaborating with cross-functional teams to support business analysis and decision-making.

Key Responsibilities:
Data Warehouse Development: Responsible for managing a single version of the truth and turning data into critical information and knowledge that can be used to make sound business decisions.
Dimensional Modeling: Convert business entities into facts and dimensions to provide a structured and efficient data model that supports accurate and insightful business analysis.
Data Integration: Collaborate with cross-functional teams to integrate data from various source systems such as Oracle NetSuite, Salesforce, Ragic, SQL Server, MySQL, and Excel files.
Data Transformation: Develop and maintain ETL processes using Microsoft SSIS, Python, or similar ETL tools to load data into the data warehouse.
Performance Optimization: Optimize queries, indexes, and database structures to improve efficiency.
Requirement Analysis: Work closely with key users and business stakeholders to define requirements.
Documentation: Maintain comprehensive technical documentation for data warehouse processes, data integration, and configurations.
Team Collaboration: Mentor and guide junior team members, fostering a collaborative environment.

Required Skills & Qualifications:
Must have: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 7+ years of experience with Microsoft SQL Server. Expertise in building data warehouses using SQL Server. Hands-on experience with dimensional modeling using facts and dimensions. Expertise in SSIS and Python for ETL development. Strong experience in Power BI for reporting and data visualization. Strong understanding of relational database design, indexing, and performance tuning. Ability to write complex SQL scripts, stored procedures, and views. Experience with Git and JIRA. Problem-solving mindset and analytical skills. Excellent communication and documentation abilities.
Nice to have: Experience with cloud-based SQL databases (e.g., Azure SQL, Azure Synapse). Experience with cloud-based ETL solutions (Azure Data Factory, Azure Databricks). Familiarity with CI/CD for database deployments and automation tools. Knowledge of big data technologies like Snowflake.
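A minimal star-schema sketch, one dimension and one fact, to illustrate the facts-and-dimensions modelling described above, assuming pyodbc against a SQL Server instance; the table and column names are hypothetical.

```python
# Minimal dimensional-modelling sketch: a customer dimension with a surrogate
# key, a sales fact referencing it, and a load that resolves the surrogate key
# from the business key. Assumes pyodbc and a local SQL Server; all names are
# hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=dw;UID=etl;PWD=secret"
)
cur = conn.cursor()

cur.execute("""
    CREATE TABLE dim_customer (
        customer_key  INT IDENTITY(1,1) PRIMARY KEY,   -- surrogate key
        customer_id   NVARCHAR(20) NOT NULL,           -- business (natural) key
        customer_name NVARCHAR(200) NOT NULL
    )
""")
cur.execute("""
    CREATE TABLE fact_sales (
        date_key     INT NOT NULL,
        customer_key INT NOT NULL REFERENCES dim_customer(customer_key),
        amount       DECIMAL(18, 2) NOT NULL
    )
""")

# Loading a fact row means resolving the surrogate key from the dimension first.
cur.execute("INSERT INTO dim_customer (customer_id, customer_name) VALUES (?, ?)",
            "C-001", "Acme Ltd")
cur.execute("""
    INSERT INTO fact_sales (date_key, customer_key, amount)
    SELECT 20240601, customer_key, 1250.00
    FROM dim_customer WHERE customer_id = ?
""", "C-001")
conn.commit()
conn.close()
```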
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Bengaluru
Remote
We’re seeking a SQL Developer to build and manage scalable, high-performance queries and procedures. Key Responsibilities : Design, write, and maintain complex SQL queries and stored procedures. Optimize database performance and indexing. Ensure data integrity and support ETL workflows. Collaborate with analysts and backend engineers. Required Qualifications: 2+ years of experience in SQL development. Deep knowledge of RDBMS concepts and performance tuning. Experience with PostgreSQL, MySQL, or MS SQL Server.
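As a small illustration of the indexing and performance work this role describes, here is a sketch using Python's standard-library sqlite3 module to compare a query plan before and after adding an index; the schema and query are hypothetical.

```python
# Minimal index-and-verify sketch: the same query is planned as a full scan
# before indexing and as an index search afterwards. Uses stdlib sqlite3 so
# it runs anywhere; all names and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 500, i * 1.5) for i in range(10_000)])

query = "SELECT COUNT(*), SUM(total) FROM orders WHERE customer_id = ?"

# Before indexing: the planner reports a table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# After indexing: the same query is satisfied via the index.
print(cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
conn.close()
```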
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Kolkata
Remote
We’re seeking a SQL Developer to build and manage scalable, high-performance queries and procedures. Key Responsibilities : Design, write, and maintain complex SQL queries and stored procedures. Optimize database performance and indexing. Ensure data integrity and support ETL workflows. Collaborate with analysts and backend engineers. Required Qualifications: 2+ years of experience in SQL development. Deep knowledge of RDBMS concepts and performance tuning. Experience with PostgreSQL, MySQL, or MS SQL Server.
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Mumbai
Remote
We’re seeking a SQL Developer to build and manage scalable, high-performance queries and procedures. Key Responsibilities : Design, write, and maintain complex SQL queries and stored procedures. Optimize database performance and indexing. Ensure data integrity and support ETL workflows. Collaborate with analysts and backend engineers. Required Qualifications: 2+ years of experience in SQL development. Deep knowledge of RDBMS concepts and performance tuning. Experience with PostgreSQL, MySQL, or MS SQL Server.
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Hyderabad
Remote
We’re seeking a SQL Developer to build and manage scalable, high-performance queries and procedures. Key Responsibilities : Design, write, and maintain complex SQL queries and stored procedures. Optimize database performance and indexing. Ensure data integrity and support ETL workflows. Collaborate with analysts and backend engineers. Required Qualifications: 2+ years of experience in SQL development. Deep knowledge of RDBMS concepts and performance tuning. Experience with PostgreSQL, MySQL, or MS SQL Server.
Posted 1 week ago
15.0 years
0 Lacs
India
Remote
About Us MyRemoteTeam, Inc is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and better. Job Title: AWS Cloud Architecture Experience: 15+ Years Mandatory Skills ✔ 15+ years in Java Full Stack (Spring Boot, Microservices, ReactJS) ✔ Cloud Architecture: AWS EKS, Kubernetes, API Gateway (APIGEE/Tyk) ✔ Event Streaming: Kafka, RabbitMQ ✔ Database Mastery: PostgreSQL (performance tuning, scaling) ✔ DevOps: GitLab CI/CD, Terraform, Grafana/Prometheus ✔ Leadership: Technical mentoring, decision-making About the Role We are seeking a highly experienced AWS Cloud Architect with 15+ years of expertise in full-stack Java development , cloud-native architecture, and large-scale distributed systems. The ideal candidate will be a technical leader capable of designing, implementing, and optimizing high-performance cloud applications across on-premise and multi-cloud environments (AWS). This role requires deep hands-on skills in Java, Microservices, Kubernetes, Kafka, and observability tools, along with a strong architectural mindset to drive innovation and mentor engineering teams. Key Responsibilities ✅ Cloud-Native Architecture & Leadership: Lead the design, development, and deployment of scalable, fault-tolerant cloud applications (AWS EKS, Kubernetes, Serverless). Define best practices for microservices, event-driven architecture (Kafka), and API management (APIGEE/Tyk). Architect hybrid cloud solutions (on-premise + AWS/GCP) with security, cost optimization, and high availability. ✅ Full-Stack Development: Develop backend services using Java, Spring Boot, and PostgreSQL (performance tuning, indexing, replication). Build modern frontends with ReactJS (state management, performance optimization). Design REST/gRPC APIs and event-driven systems (Kafka, SQS). ✅ DevOps & Observability: Manage Kubernetes (EKS) clusters, Helm charts, and GitLab CI/CD pipelines. Implement Infrastructure as Code (IaC) using Terraform/CloudFormation. Set up monitoring (Grafana, Prometheus), logging (ELK), and alerting for production systems. ✅ Database & Performance Engineering: Optimize PostgreSQL for high throughput, replication, and low-latency queries. Troubleshoot database bottlenecks, caching (Redis), and connection pooling. Design data migration strategies (on-premise → cloud). ✅ Mentorship & Innovation: Mentor junior engineers and conduct architecture reviews. Drive POCs on emerging tech (Service Mesh, Serverless, AI/ML integrations). Collaborate with CTO/Architects on long-term technical roadmaps.
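Although this role is Java/Spring-centric, here is a compact Python sketch of the event-streaming pattern in the stack above, assuming the kafka-python client and a broker at localhost:9092; the topic and payload are hypothetical.

```python
# Minimal event-publishing sketch for an event-driven microservice, using the
# kafka-python client. Broker address, topic, and payload are hypothetical.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda payload: json.dumps(payload).encode("utf-8"),
    acks="all",          # wait for in-sync replicas before confirming delivery
)

event = {"order_id": "ORD-1001", "status": "CREATED", "amount": 249.99}
future = producer.send("order-events", value=event)
metadata = future.get(timeout=10)   # block until the broker acknowledges
print(f"delivered to {metadata.topic}[{metadata.partition}] @ offset {metadata.offset}")

producer.flush()
producer.close()
```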
Posted 1 week ago
5.0 years
5 - 7 Lacs
Thiruvananthapuram
On-site
5 - 7 Years | 1 Opening | Trivandrum

Role description

Role Proficiency:
Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions; account for others' developmental activities.

Outcomes:
Interpret the application/feature/component design and develop it in accordance with the specifications.
Code, debug, test, document, and communicate product/component/feature development stages.
Validate results with user representatives; integrate and commission the overall solution.
Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
Optimize efficiency, cost, and quality.
Influence and improve customer satisfaction.
Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
Adherence to engineering processes and standards (coding standards)
Adherence to project schedule/timelines
Number of technical issues uncovered during project execution
Number of defects in the code
Number of defects post delivery
Number of non-compliance issues
On-time completion of mandatory compliance trainings

Outputs Expected:
Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.
Configure: Define and govern the configuration management plan; ensure compliance from the team.
Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
Domain relevance: Advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions for customers; complete relevant domain certifications.
Manage Project: Manage delivery of modules and/or manage user stories.
Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort estimation for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
Manage Team: Set FAST goals and provide feedback; understand team members' aspirations and provide guidance and opportunities; ensure the team is engaged in the project.
Certifications: Obtain relevant domain/technology certifications.

Skill Examples:
Explain and communicate the design/development to the customer.
Perform and evaluate test results against product specifications.
Break down complex problems into logical components.
Develop user interfaces and business software components.
Use data models.
Estimate time and effort required for developing/debugging features/components.
Perform and evaluate tests in the customer or target environment.
Make quick decisions on technical/project-related challenges.
Manage a team, mentor, and handle people-related issues in the team.
Maintain high motivation levels and positive dynamics in the team.
Interface with other teams, designers, and other parallel practices.
Set goals for self and team; provide feedback to team members.
Create and articulate impactful technical presentations.
Follow a high level of business etiquette in emails and other business communication.
Drive conference calls with customers, addressing customer questions.
Proactively ask for and offer help.
Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
Build confidence with customers by meeting deliverables on time and with quality.
Estimate the time, effort, and resources required for developing/debugging features/components.
Make appropriate utilization of software/hardware.
Strong analytical and problem-solving abilities.

Knowledge Examples:
Appropriate software programs/modules
Functional and technical design
Programming languages – proficiency in multiple skill clusters
DBMS
Operating systems and software platforms
Software Development Life Cycle
Agile – Scrum or Kanban methods
Integrated development environments (IDE)
Rapid application development (RAD)
Modelling technologies and languages
Interface definition languages (IDL)
Knowledge of the customer domain and a deep understanding of the sub-domain where the problem is solved

Additional Comments:
Design, build, and maintain robust, reactive REST APIs using Spring WebFlux and Spring Boot (an illustrative sketch follows this listing).
Develop and optimize microservices that handle high throughput and low latency.
Write clean, testable, maintainable code in Java.
Integrate with MongoDB for CRUD operations, aggregation pipelines, and indexing strategies.
Apply best practices in API security, versioning, error handling, and documentation.
Collaborate with front-end developers, DevOps, QA, and product teams.
Troubleshoot and debug production issues, identify root causes, and deploy fixes quickly.

Required Skills & Experience:
Strong programming experience in Java 17+
Proficiency in Spring Boot, Spring WebFlux, and Spring MVC
Solid understanding of reactive programming principles
Proven experience designing and implementing microservices architecture
Hands-on expertise with MongoDB, including schema design and performance tuning
Experience with RESTful API design and HTTP fundamentals
Working knowledge of build tools such as Maven or Gradle
Good grasp of CI/CD pipelines and deployment strategies

Skills: Spring WebFlux, Spring Boot, Kafka

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
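To make the WebFlux/MongoDB stack described above concrete, here is a minimal, hedged sketch of a reactive CRUD endpoint. The Product entity, collection name, and routes are assumptions for illustration only; the actual domain model would come from the project.

```java
// Illustrative only: a minimal reactive REST endpoint backed by MongoDB,
// in the style described above. Entity, collection, and route names are assumptions.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;
import org.springframework.web.bind.annotation.*;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@SpringBootApplication
public class CatalogApplication {
    public static void main(String[] args) {
        SpringApplication.run(CatalogApplication.class, args);
    }
}

@Document("products") // hypothetical collection name
record Product(@Id String id, String name, double price) {}

interface ProductRepository extends ReactiveMongoRepository<Product, String> {}

@RestController
@RequestMapping("/products")
class ProductController {
    private final ProductRepository repository;

    ProductController(ProductRepository repository) {
        this.repository = repository;
    }

    // Non-blocking read of all products as a reactive stream.
    @GetMapping
    public Flux<Product> findAll() {
        return repository.findAll();
    }

    // Non-blocking create; the Mono completes when the insert is acknowledged.
    @PostMapping
    public Mono<Product> create(@RequestBody Product product) {
        return repository.save(product);
    }
}
```

Because the repository returns Mono/Flux rather than blocking results, the service can handle the high-throughput, low-latency workloads the listing mentions without tying up a thread per request.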
Posted 1 week ago
4.0 years
10 - 10 Lacs
Hyderābād
Remote
Job Title: Senior Cloud DBA - SQL Server (Azure Primary) - PowerShell Automation - Weekend
Location: Hyderabad (Hybrid)
Job Type: Full-time

Job Summary:
We are seeking an experienced and driven Senior Cloud DBA with 4–8 years of relevant experience managing large-scale, mission-critical SQL Server databases in AWS and Azure cloud environments, along with hands-on experience in other relational and NoSQL database technologies (e.g., PostgreSQL, MySQL, MongoDB). The ideal candidate will have hands-on experience with observability tools such as New Relic, RedGate, and SolarWinds DPA, and a solid understanding of FedRAMP High and IL5 security and compliance standards. This role demands a balance of operational expertise, automation mindset, security awareness, and cross-functional collaboration to ensure highly available, performant, and secure SQL Server instances across cloud platforms.

Key Responsibilities:
Administer, monitor, and support SQL Server databases deployed on AWS (RDS, EC2) and Azure (SQL DB, Managed Instance, SQL on VMs).
Manage backup/recovery strategies, patching, and high availability configurations (AlwaysOn, geo-replication, log shipping).
Tune SQL queries, manage indexing strategies, and analyze wait stats using tools such as SolarWinds DPA (an illustrative sketch follows this listing).
Integrate and optimize observability tooling (New Relic APM, RedGate Monitor, SQL Monitor) for real-time health, alerts, and proactive diagnostics.
Automate operational tasks using T-SQL, PowerShell, or scripting tools, and integrate with DevOps pipelines where applicable.
Maintain a working knowledge of PostgreSQL, MySQL, MongoDB, Elasticsearch, and Redis.
Collaborate with CloudOps, DevOps, and SRE teams on release management, deployment automation, and infrastructure as code (IaC).
Ensure cloud resource utilization is cost-effective, applying tagging, right-sizing, and scheduled scaling.
Maintain secure access control and auditing per FedRAMP High and IL5 security baselines; support internal/external audits and compliance reviews.
Participate in on-call rotations for 24x7 coverage (follow-the-sun model) for incident response, issue triage, and problem resolution.
Work 4 x 10-hour days (Thursday through Sunday, approximately 10 AM to 9 PM IST, including lunch).

Required Skills & Qualifications:
4–8 years of hands-on experience as a SQL Server DBA, with 2+ years in AWS or Azure cloud environments.
Proficiency with SQL Server 2016+, including installation, configuration, maintenance, performance tuning, and HA/DR.
Solid expertise in monitoring/troubleshooting tools: New Relic (DB Insights & APM), SolarWinds DPA, RedGate.
Automation scripting using PowerShell, T-SQL, and optionally Azure CLI / AWS CLI.
Experience working in secure and regulated environments (FedRAMP, IL5, HIPAA, etc.).
Understanding of cost optimization practices in cloud services.
Strong interpersonal, documentation, and communication skills.

Preferred Qualifications:
Certifications (nice to have): Microsoft Certified: Azure Database Administrator Associate; AWS Certified Database – Specialty or Solutions Architect – Associate.
Exposure to FedRAMP High and IL5 operational requirements.

Roadmap to Success

First 90 Days: Getting Oriented
Understand current systems: onboard with DBA, CloudOps, and Observability teams; review existing SQL Server deployments and architectures in AWS and Azure.
Review and assess: analyse current monitoring, alerting, automation scripts, and security configurations; understand cloud spend patterns, backup retention policies, and DR architecture.
Quick wins: resolve immediate performance bottlenecks; deploy minor automation and alerting enhancements; improve observability coverage using RedGate and SolarWinds DPA.

First 6 Months: Adding Value
Optimization & stability: tune queries and optimize configurations across production systems; strengthen backup, recovery, and HA strategies.
Automation & cost control: build and contribute scripts for regular maintenance, scaling, and reporting; implement cloud cost monitoring dashboards and provide actionable insights.
Cross-functional collaboration: work closely with DevOps and SRE teams to improve pipeline and deployment reliability.

First 12 Months: Driving Impact
Reliability & security enhancements: enforce security standards aligned to FedRAMP High and IL5, including encryption, RBAC, and audit logging; support compliance reviews and audit preparation.
Innovation & leadership: propose modernization plans (e.g., serverless databases, containerization, Azure Arc); mentor peers and contribute to internal technical forums or runbooks.
Operational excellence: help create a fully automated, secure, and observable SQL Server environment across cloud platforms; be recognized as the go-to SQL Server expert for cloud-native solutions and secure database operations.

What We Offer:
Competitive salary and performance incentives
Remote-friendly, flexible work culture
Cloud training programs and technical certifications
Exposure to large-scale cloud architecture and DevOps best practices
Collaborative, fast-paced engineering environment
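As one small example of the wait-stats analysis this role mentions, the sketch below runs a standard query over sys.dm_os_wait_stats via JDBC, keeping to the Java used elsewhere in this document. In practice a team like this would more likely script it in PowerShell or T-SQL directly; the host, credentials, and TOP 10 cutoff are assumptions.

```java
// Illustrative only: reading SQL Server wait statistics (the kind of analysis the role
// describes) via plain JDBC. Host, credentials, and the top-N cutoff are assumptions.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WaitStatsReport {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://db.example.internal:1433;databaseName=master;encrypt=true;trustServerCertificate=true";
        String sql = """
            SELECT TOP 10 wait_type,
                   wait_time_ms,
                   signal_wait_time_ms,
                   waiting_tasks_count
            FROM sys.dm_os_wait_stats
            WHERE wait_time_ms > 0
            ORDER BY wait_time_ms DESC
            """;
        try (Connection conn = DriverManager.getConnection(url, "monitor_user", System.getenv("DB_PASSWORD"));
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                // The dominant wait types point to the next tuning target (I/O, locking, CPU, etc.).
                System.out.printf("%-40s wait_ms=%d signal_ms=%d tasks=%d%n",
                        rs.getString("wait_type"),
                        rs.getLong("wait_time_ms"),
                        rs.getLong("signal_wait_time_ms"),
                        rs.getLong("waiting_tasks_count"));
            }
        }
    }
}
```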
Posted 1 week ago
3.0 years
10 - 10 Lacs
Hyderābād
Remote
Job Title: Cloud DBA (Azure Primary) - PowerShell Automation
Location: India (Hybrid)
Job Type: Full-time

Job Summary:
We are seeking an experienced and driven Junior Cloud DBA with 3–5 years of relevant experience managing large-scale, mission-critical SQL Server databases in AWS and Azure cloud environments, along with hands-on experience in other relational and NoSQL database technologies (e.g., PostgreSQL, MySQL, MongoDB). The ideal candidate will have hands-on experience with observability tools such as New Relic, RedGate, and SolarWinds DPA, and a solid understanding of FedRAMP High and IL5 security and compliance standards. This role demands a balance of operational expertise, automation mindset, security awareness, and cross-functional collaboration to ensure highly available, performant, and secure SQL Server instances across cloud platforms.

Key Responsibilities:
Administer, monitor, and support SQL Server databases deployed on AWS (RDS, EC2) and Azure (SQL DB, Managed Instance, SQL on VMs).
Manage backup/recovery strategies, patching, and high availability configurations (AlwaysOn, geo-replication, log shipping).
Tune SQL queries, manage indexing strategies, and analyze wait stats using tools such as SolarWinds DPA (an index-health sketch follows this listing).
Integrate and optimize observability tooling (New Relic APM, RedGate Monitor, SQL Monitor) for real-time health, alerts, and proactive diagnostics.
Automate operational tasks using T-SQL, PowerShell, or scripting tools, and integrate with DevOps pipelines where applicable.
Maintain a working knowledge of PostgreSQL, MySQL, MongoDB, Elasticsearch, and Redis.
Collaborate with CloudOps, DevOps, and SRE teams on release management, deployment automation, and infrastructure as code (IaC).
Ensure cloud resource utilization is cost-effective, applying tagging, right-sizing, and scheduled scaling.
Maintain secure access control and auditing per FedRAMP High and IL5 security baselines; support internal/external audits and compliance reviews.
Participate in on-call rotations for 24x7 coverage (follow-the-sun model) for incident response, issue triage, and problem resolution.

Required Skills & Qualifications:
3–5 years of hands-on experience as a SQL Server DBA, with 2+ years in AWS or Azure cloud environments.
Proficiency with SQL Server 2016+, including installation, configuration, maintenance, performance tuning, and HA/DR.
Solid expertise in monitoring/troubleshooting tools: New Relic (DB Insights & APM), SolarWinds DPA, RedGate.
Automation scripting using PowerShell, T-SQL, and optionally Azure CLI / AWS CLI.
Experience working in secure and regulated environments (FedRAMP, IL5, HIPAA, etc.).
Understanding of cost optimization practices in cloud services.
Strong interpersonal, documentation, and communication skills.

Preferred Qualifications:
Certifications (nice to have): Microsoft Certified: Azure Database Administrator Associate; AWS Certified Database – Specialty or Solutions Architect – Associate.
Exposure to FedRAMP High and IL5 operational requirements.

Roadmap to Success

First 90 Days: Getting Oriented
Understand current systems: onboard with DBA, CloudOps, and Observability teams; review existing SQL Server deployments and architectures in AWS and Azure.
Review and assess: analyse current monitoring, alerting, automation scripts, and security configurations; understand cloud spend patterns, backup retention policies, and DR architecture.
Quick wins: resolve immediate performance bottlenecks; deploy minor automation and alerting enhancements; improve observability coverage using RedGate and SolarWinds DPA.

First 6 Months: Adding Value
Optimization & stability: tune queries and optimize configurations across production systems; strengthen backup, recovery, and HA strategies.
Automation & cost control: build and contribute scripts for regular maintenance, scaling, and reporting; implement cloud cost monitoring dashboards and provide actionable insights.
Cross-functional collaboration: work closely with DevOps and SRE teams to improve pipeline and deployment reliability.

First 12 Months: Driving Impact
Reliability & security enhancements: enforce security standards aligned to FedRAMP High and IL5, including encryption, RBAC, and audit logging; support compliance reviews and audit preparation.
Innovation & leadership: propose modernization plans (e.g., serverless databases, containerization, Azure Arc); mentor peers and contribute to internal technical forums or runbooks.
Operational excellence: help create a fully automated, secure, and observable SQL Server environment across cloud platforms; be recognized as the go-to SQL Server expert for cloud-native solutions and secure database operations.

What We Offer:
Competitive salary and performance incentives
Remote-friendly, flexible work culture
Cloud training programs and technical certifications
Exposure to large-scale cloud architecture and DevOps best practices
Collaborative, fast-paced engineering environment
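Similarly, here is a minimal sketch of an index-health check against sys.dm_db_index_physical_stats, again in Java/JDBC for consistency with the other examples in this document. The fragmentation and page-count thresholds are common rules of thumb, not requirements from the posting, and the connection details are assumptions.

```java
// Illustrative only: flag indexes above a fragmentation threshold as one input to an
// index maintenance strategy. Connection details and thresholds (30%, 1000 pages) are assumptions.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class IndexFragmentationCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://db.example.internal:1433;databaseName=AppDb;encrypt=true;trustServerCertificate=true";
        String sql = """
            SELECT OBJECT_NAME(ips.object_id) AS table_name,
                   i.name AS index_name,
                   ips.avg_fragmentation_in_percent,
                   ips.page_count
            FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
            JOIN sys.indexes AS i
              ON i.object_id = ips.object_id AND i.index_id = ips.index_id
            WHERE ips.avg_fragmentation_in_percent > 30
              AND ips.page_count > 1000
            ORDER BY ips.avg_fragmentation_in_percent DESC
            """;
        try (Connection conn = DriverManager.getConnection(url, "monitor_user", System.getenv("DB_PASSWORD"));
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(sql)) {
            while (rs.next()) {
                // Indexes listed here are candidates for REORGANIZE (moderate) or REBUILD (heavy fragmentation).
                System.out.printf("%s.%s  fragmentation=%.1f%%  pages=%d%n",
                        rs.getString("table_name"),
                        rs.getString("index_name"),
                        rs.getDouble("avg_fragmentation_in_percent"),
                        rs.getLong("page_count"));
            }
        }
    }
}
```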
Posted 1 week ago