8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
We are looking for candidates with 8+ years of experience for this role.
Job Location: Technopark, Trivandrum
Experience: 8+ years of experience in Microsoft SQL Server administration
Primary skills: Strong experience in Microsoft SQL Server administration
Qualifications: Bachelor's degree in computer science, software engineering or a related field. Microsoft SQL certifications (MTA Database, MCSA: SQL Server, MCSE: Data Management and Analytics) will be an advantage.
Secondary Skills: Experience in MySQL, PostgreSQL, and Oracle database administration. Exposure to Data Lake, Hadoop, and Azure technologies. Exposure to DevOps or ITIL.
Main duties/responsibilities:
Query Optimization: Optimize database queries to ensure fast and efficient data retrieval, particularly for complex or high-volume operations. Design and implement effective indexing strategies to reduce query execution times and improve overall database performance. Monitor and profile slow or inefficient queries and recommend best practices for rewriting or re-architecting them. Continuously analyze execution plans for SQL queries to identify bottlenecks and optimize them.
Database Maintenance: Schedule and execute regular maintenance tasks, including backups, consistency checks, and index rebuilding.
Health Monitoring: Implement automated monitoring systems to track database performance, availability, and critical parameters such as CPU usage, memory, disk I/O, and replication status.
Proactive Issue Resolution: Diagnose and resolve database issues (e.g., locking, deadlocks, data corruption) proactively, before they impact users or operations.
High Availability: Implement and manage database clustering, replication, and failover strategies to ensure high availability and disaster recovery (e.g., using tools like SQL Server Always On, Oracle RAC, MySQL Group Replication).
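As an illustration of the query-profiling duties above, the sketch below pulls the most CPU-expensive cached statements from SQL Server's sys.dm_exec_query_stats DMV, one common starting point for finding queries worth tuning. It assumes the pyodbc driver; the connection string and the TOP 10 cutoff are placeholder choices, not part of the posting.

import pyodbc

# Hypothetical connection details; adjust driver, server, and auth for your environment.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server;DATABASE=master;Trusted_Connection=yes;"
)

# Rank cached plans by average CPU per execution (microseconds).
TOP_CPU_QUERIES = """
SELECT TOP 10
    qs.execution_count,
    qs.total_worker_time / qs.execution_count AS avg_cpu_us,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1, 200) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_us DESC;
"""

def report_expensive_queries() -> None:
    with pyodbc.connect(CONN_STR) as conn:
        for count, avg_cpu, text in conn.execute(TOP_CPU_QUERIES):
            print(f"{count:>8} execs  {avg_cpu:>12} us avg  {text.strip()!r}")

if __name__ == "__main__":
    report_expensive_queries()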
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
This is Adyen Adyen provides payments, data, and financial products in a single solution for customers like Meta, Uber, H&M, and Microsoft - making us the financial technology platform of choice. At Adyen, everything we do is engineered for ambition. For our teams, we create an environment with opportunities for our people to succeed, backed by the culture and support to ensure they are enabled to truly own their careers. We are motivated individuals who tackle unique technical challenges at scale and solve them as a team. Together, we deliver innovative and ethical solutions that help businesses achieve their ambitions faster. Data Engineer We are looking for a Data Engineer to join the Payment Engine Data team in Bengaluru, our newest Adyen office. The main goal of the Payment Engine Data (PED) team is to provide insightful data and solutions for processing payments using all of Adyen's payment options. These consist of various data pipelines between various systems, dashboards offering insights into payment processing, internal and external reporting, additional data products, and infrastructure. The ideal candidate is able to understand the business context and relate it to the underlying data requirements. You should also excel at building top-notch data pipelines on our big data platform. At Adyen, your work as a Data Engineer will be vital in forming our data infrastructure and guaranteeing the seamless flow of data across various systems. What You’ll Do Develop High-Quality Data Pipelines- Design, develop, deploy, and operate ETL/ELT pipelines in PySpark. Your work will directly contribute to the creation of reports, tools, analytics, and datasets for both internal and external use. Collaborative Solution Development- Partner with various teams, engineers, and data analysts to understand data requirements and transform these insights into effective data pipelines. Orchestrate Data Flow- Utilise orchestration tools to manage data pipelines efficiently; experience in Airflow is a significant advantage. Champion Data Best Practices- Advocate for performance, testing, code quality, data validation, data governance, and discoverability. Ensure that the data provided is accurate, performant, and reliable. Performance Optimisation- Identify and resolve performance bottlenecks in data pipelines and systems. Optimise query performance and resource utilisation to meet SLAs and performance requirements, using technologies such as caching, indexing, partitioning, and other Spark optimisations. Knowledge Sharing and Training- Scale your knowledge throughout the organisation, enhancing the overall data literacy. Who You Are Experienced in Big Data- At least 5 years of experience working as a Data Engineer or in a similar role. Data & Engineering practices- You possess an expert-level understanding of both Software and Data Engineering practices. Technical Superstar- Highly proficient in tools and languages such as Python, PySpark, Airflow, Hadoop, Spark, Kafka, SQL, Git, and S3. Looker is a plus. Clear Communicator- Skilled at articulating complex data-related concepts and outcomes to a diverse range of stakeholders. Self-starter- Capable of independently recognizing opportunities, devising solutions, leading, prioritizing and owning projects. Innovator- You have an experimental mindset with a ‘launch fast and iterate’ mentality.
Data Culture Champion- Experienced in fostering a data-centric culture within large, technical organizations and setting standards for excellence and continuous improvement. Data Positions At Adyen We know that companies define their data-related positions differently, depending, for instance, on the size of the company. We categorized and defined all our positions. Have a look at this blogpost to find out! Our Diversity, Equity and Inclusion commitments Our unique approach is a product of our diverse perspectives. This diversity of backgrounds and cultures is essential in helping us maintain our momentum. Our business and technical challenges are unique, and we need as many different voices as possible to join us in solving them - voices like yours. No matter who you are or where you’re from, we welcome you to be your true self at Adyen. Studies show that women and members of underrepresented communities apply for jobs only if they meet 100% of the qualifications. Does this sound like you? If so, Adyen encourages you to reconsider and apply. We look forward to your application! What’s next? Ensuring a smooth and enjoyable candidate experience is critical for us. We aim to get back to you regarding your application within 5 business days. Our interview process tends to take about 4 weeks to complete, but may fluctuate depending on the role. Learn more about our hiring process here. Don’t be afraid to let us know if you need more flexibility. This role is based out of our Bengaluru office. We are an office-first company and value in-person collaboration; we do not offer remote-only roles.
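To make the PySpark pipeline work described in this posting concrete, here is a minimal ETL sketch under stated assumptions: read raw payment events from a hypothetical bucket, apply a cleansing transform, and write a partitioned Parquet dataset. The paths and column names are invented for the example, not Adyen's.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("payments-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/payments/")  # hypothetical source

cleaned = (
    raw.filter(F.col("amount") > 0)                        # drop zero/negative amounts
       .withColumn("event_date", F.to_date("created_at"))  # derive the partition column
       .dropDuplicates(["payment_id"])                     # keep re-runs idempotent
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/payments/"))  # hypothetical sink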
Posted 2 weeks ago
6.0 years
0 Lacs
Karnataka, India
Remote
About Us MyRemoteTeam, Inc is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and better. Job Title: Senior Java Spring Boot Developer Experience: 6+ Years Location: Mysore and Pune Job Description: We are seeking an experienced Senior Java Spring Boot Developer with 6+ years of hands-on experience in building scalable, high-performance microservices using Java, Spring Boot, and Spring JPA. The ideal candidate will have strong expertise in designing and developing RESTful APIs, microservices architecture, and cloud-native applications. As part of our team, you will work on enterprise-grade applications, collaborate with cross-functional teams, and contribute to the full software development lifecycle. Mandatory Skills: ✔ 6+ years of Java development (Java 8/11/17). ✔ Strong Spring Boot & Spring JPA experience. ✔ Microservices architecture (design, development, deployment). ✔ RESTful API development & integration. ✔ Database expertise (SQL/NoSQL – PostgreSQL, MySQL, MongoDB). ✔ Testing frameworks (JUnit, Mockito). ✔ Agile methodologies & CI/CD pipelines. Key Responsibilities: Design & Development: Develop high-performance, scalable microservices using Spring Boot. Design and implement RESTful APIs following best practices. Use Spring JPA/Hibernate for database interactions (SQL/NoSQL). Implement caching mechanisms (Redis, Ehcache) for performance optimization. Microservices Architecture: Build and maintain cloud-native microservices (Docker, Kubernetes). Integrate with message brokers (Kafka, RabbitMQ) for event-driven systems. Ensure fault tolerance, resilience, and scalability (Circuit Breaker, Retry Mechanisms). Database & Performance: Optimize database queries (PostgreSQL, MySQL, MongoDB). Implement connection pooling, indexing, and caching strategies. Monitor and improve application performance (JVM tuning, profiling). Testing & Quality Assurance: Write unit & integration tests (JUnit, Mockito, Test Containers). Follow TDD/BDD practices for robust code quality. Perform code reviews and ensure adherence to best practices. DevOps & CI/CD: Work with Docker, Kubernetes, and cloud platforms (AWS/Azure). Set up and maintain CI/CD pipelines (Jenkins, GitHub Actions). Automate deployments and monitoring (Prometheus, Grafana). Collaboration & Agile: Work in Agile/Scrum teams with daily standups, sprint planning, and retrospectives. Collaborate with frontend, QA, and DevOps teams for seamless delivery.
Posted 2 weeks ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are looking for a seasoned AWS DevOps Engineer with robust experience in AWS middleware services and MongoDB Cloud Infrastructure Management. The role involves designing, deploying, and maintaining secure, scalable, and high-availability infrastructure, along with developing efficient CI/CD pipelines and automating operational processes. Key Deliverables (Essential functions & Responsibilities of the Job): Design, deploy, and manage AWS infrastructure, with a focus on middleware services such as API Gateway, Lambda, SQS, SNS, ECS, and EKS. Administer and optimize MongoDB Atlas or equivalent cloud-based MongoDB solutions for high availability, security, and performance. Develop, manage, and enhance CI/CD pipelines using tools like AWS CodePipeline, Jenkins, GitHub Actions, GitLab CI/CD, or Bitbucket Pipelines. Automate infrastructure provisioning using Terraform, AWS CloudFormation, or AWS CDK. Implement monitoring and logging solutions using CloudWatch, Prometheus, Grafana, or the ELK Stack. Enforce cloud security best practices — IAM, VPC setups, encryption, certificate management, and compliance controls. Work closely with development teams to improve application reliability, scalability, and performance. Manage containerized environments using Docker, Kubernetes (EKS), or AWS ECS. Perform MongoDB administration tasks such as backups, performance tuning, indexing, and sharding. Participate in on-call rotations to ensure 24/7 infrastructure availability and quick incident resolution. Knowledge, Skills and Abilities: 7+ years of hands-on AWS DevOps experience, especially with middleware services. Strong expertise in MongoDB Atlas or other cloud MongoDB services. Proficiency in Infrastructure as Code (IaC) tools like Terraform, CloudFormation, or AWS CDK. Solid experience with CI/CD tools: Jenkins, CodePipeline, GitHub Actions, GitLab, Bitbucket, etc. Excellent scripting skills in Python, Bash, or PowerShell. Experience in containerization and orchestration: Docker, EKS, ECS. Familiarity with monitoring tools like CloudWatch, ELK, Prometheus, Grafana. Strong understanding of AWS networking and security: IAM, VPC, KMS, Security Groups. Ability to solve complex problems and thrive in a fast-paced environment. Preferred Qualifications AWS Certified DevOps Engineer – Professional or AWS Solutions Architect – Associate/Professional. MongoDB Certified DBA or Developer. Experience with serverless services like AWS Lambda, Step Functions. Exposure to multi-cloud or hybrid cloud environments.
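As a flavour of the middleware automation this role covers, the sketch below uses boto3 to publish an event to an SNS topic and emit a custom CloudWatch metric. The topic ARN, namespace, and metric name are hypothetical placeholders; credentials are assumed to come from the standard AWS credential chain.

import json
import boto3

sns = boto3.client("sns")
cloudwatch = boto3.client("cloudwatch")

def publish_event(payload: dict) -> None:
    # Hypothetical topic ARN; SNS fans the message out to subscribers (SQS, Lambda, etc.).
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:example-topic",
        Message=json.dumps(payload),
    )

def record_latency(ms: float) -> None:
    # Custom metric under an invented namespace, chartable in CloudWatch dashboards.
    cloudwatch.put_metric_data(
        Namespace="Example/Middleware",
        MetricData=[{"MetricName": "HandlerLatency", "Value": ms, "Unit": "Milliseconds"}],
    )

if __name__ == "__main__":
    publish_event({"order_id": 42, "status": "created"})
    record_latency(12.5)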
Posted 2 weeks ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Position Overview: We seek an experienced Staff Software Engineer to lead the design and development of our data warehouse and analytics platform, in addition to helping raise the engineering bar for the entire technology stack at Level AI, including applications, platform, and infrastructure. They will actively collaborate with team members and the wider Level AI engineering community to develop highly scalable and performant systems. They will be a technical thought leader who will help drive solving complex problems of today and the future by designing and building simple and elegant technical solutions. They will coach and mentor junior engineers and drive engineering best practices. They will actively collaborate with product managers and other stakeholders both inside and outside the team. What you’ll get to do at Level AI (and more as we grow together): Design, develop, and evolve data pipelines that ingest and process high-volume data from multiple external and internal sources. Build scalable, fault-tolerant architectures for both batch and real-time data workflows using tools like GCP Pub/Sub, Kafka and Celery. Define and maintain robust data models with a focus on domain-oriented design, supporting both operational and analytical workloads. Architect and implement data lake/warehouse solutions using Postgres and Snowflake. Lead the design and deployment of workflow orchestration using Apache Airflow for end-to-end pipeline automation. Ensure platform reliability with strong monitoring, alerting, and observability for all data services and pipelines. Collaborate closely with other internal product & engineering teams to align data platform capabilities with product and business needs. Own and enforce data quality, schema evolution, data contract practices, and governance standards. Provide technical leadership, mentor junior engineers, and contribute to cross-functional architectural decisions. We'd love to explore more about you if you have 8+ years of experience building large-scale data systems, preferably in high-ingestion, multi-source environments. Strong system design, debugging, and performance tuning skills. Strong programming skills in Python and Java. Deep understanding of SQL (Postgres, MySQL) and data modeling (star/snowflake schema, normalization/denormalization). Hands-on experience with streaming platforms like Kafka and GCP Pub/Sub. Expertise with Airflow or similar orchestration frameworks. Solid experience with Snowflake, Postgres, and distributed storage design. Familiarity with Celery for asynchronous task processing. Comfortable working with ElasticSearch for data indexing and querying. Exposure to Redash, Metabase, or similar BI/analytics tools. Proven experience deploying solutions on cloud platforms like GCP or AWS. Compensation: We offer market-leading compensation, based on the skills and aptitude of the candidate.
Preferred Attributes- Experience with data governance and lineage tools. Demonstrated ability to handle scale, reliability, and incident response in data systems. Excellent communication and stakeholder management skills. Passion for mentoring and growing engineering talent. To learn more visit: https://thelevel.ai/ Funding: https://www.crunchbase.com/organization/level-ai LinkedIn: https://www.linkedin.com/company/level-ai/ Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s
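Since Apache Airflow orchestration is central to this role, here is a minimal DAG sketch under stated assumptions: a recent Airflow 2.x install, with a hypothetical dag_id, schedule, and task bodies standing in for real ingestion and warehouse loads.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest() -> None:
    print("pull new events from Pub/Sub or Kafka")  # placeholder for real ingestion

def load_warehouse() -> None:
    print("upsert curated rows into Snowflake")  # placeholder for a real load step

with DAG(
    dag_id="example_events_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older 2.x uses schedule_interval
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    load_task = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    ingest_task >> load_task           # enforce ingest-before-load ordering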
Posted 2 weeks ago
10.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Senior PostgreSQL Database Administrator Location: Noida, India Experience Required: 10 to 12 years Education Qualification: B.Tech/B.E. in Computer Science, IT or MCA Job Summary: We are seeking an experienced and highly skilled PostgreSQL Database Administrator to join our team in Noida. The ideal candidate will be responsible for the performance, integrity, and security of our PostgreSQL database systems. This role requires deep expertise in database architecture, performance tuning, indexing, and backup/recovery strategies. Key Responsibilities: Install, configure, and maintain PostgreSQL databases across multiple environments (Dev/Test/Prod). Manage database security, integrity, and backup procedures. Design and implement robust backup and recovery solutions. Monitor database performance and proactively tune SQL queries and server configurations. Create and maintain database objects including tables, indexes, views, stored procedures, and triggers. Implement effective indexing and partitioning strategies to improve performance and scalability. Collaborate with development teams for schema design, query optimization, and application support. Plan and execute database migrations, upgrades, and patch management. Maintain high availability and disaster recovery setups. Automate routine DBA tasks using shell scripts or Python. Prepare and maintain documentation including SOPs, architecture diagrams, and incident reports. Provide on-call support and resolve critical database issues as needed. Required Skills and Experience: 9–12 years of proven experience as a PostgreSQL DBA in large-scale enterprise environments. Strong knowledge of PostgreSQL architecture and internals. Deep understanding of database design, performance tuning, indexing, and query optimization. Hands-on experience in implementing backup and disaster recovery solutions. Expertise in performance monitoring tools (e.g., pg_stat_statements, pgBadger). Proficiency with scripting languages (Bash, Python, or similar) for automation. Experience with replication (logical/streaming), partitioning, and connection pooling. Familiarity with Linux/Unix systems and cloud environments (AWS/GCP/Azure). Strong problem-solving skills and the ability to handle high-pressure situations. Preferred Qualifications: Certification in PostgreSQL administration or cloud-based database services. Experience with tools like Ansible, Terraform, or other infrastructure-as-code technologies. Exposure to other database technologies like MySQL or Oracle is a plus.
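Since the posting names pg_stat_statements as a monitoring tool, here is a minimal sketch of that workflow: query the extension for the slowest statements by mean execution time. It assumes the extension is installed and psycopg2 is available; the DSN is a placeholder, and the mean_exec_time column name applies to PostgreSQL 13+.

import psycopg2

DSN = "host=localhost dbname=appdb user=dba"  # hypothetical DSN

QUERY = """
SELECT calls,
       round(mean_exec_time::numeric, 2) AS mean_ms,
       left(query, 120) AS query
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;
"""

with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for calls, mean_ms, query in cur.fetchall():
        print(f"{calls:>8} calls  {mean_ms:>10} ms avg  {query!r}")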
Posted 2 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
As a Fullstack SDE - II at NxtWave, you Build applications at scale and see them released quickly to the NxtWave learners (within weeks) Get to take ownership of the features you build and work closely with the product team Work in a great culture that continuously empowers you to grow in your career Enjoy freedom to experiment & learn from mistakes (Fail Fast, Learn Faster) NxtWave is one of the fastest growing edtech startups. Get first-hand experience in scaling the features you build as the company grows rapidly Build in a world-class developer environment by applying clean coding principles, code architecture, etc. Responsibilities Lead design and delivery of complex end-to-end features across frontend, backend, and data layers. Make strategic architectural decisions on frameworks, datastores, and performance patterns. Review and approve pull requests, enforcing clean-code guidelines, SOLID principles, and design patterns. Build and maintain shared UI component libraries and backend service frameworks for team reuse. Identify and eliminate performance bottlenecks in both browser rendering and server throughput. Instrument services with metrics and logging, driving SLIs, SLAs, and observability. Define and enforce comprehensive testing strategies: unit, integration, and end-to-end. Own CI/CD pipelines, automating builds, deployments, and rollback procedures. Ensure OWASP Top-10 mitigations, WCAG accessibility, and SEO best practices. Partner with Product, UX, and Ops to translate business objectives into technical roadmaps. Facilitate sprint planning, estimation, and retrospectives for predictable deliveries. Mentor and guide SDE-1s and interns; participate in hiring. Qualifications & Skills 3–5 years building production Full stack applications end-to-end with measurable impact. Proven leadership in Agile/Scrum environments with a passion for continuous learning. Deep expertise in React (or Angular/Vue) with TypeScript and modern CSS methodologies. Proficient in Node.js (Express/NestJS) or Python (Django/Flask/FastAPI) or Java (Spring Boot). Expert in designing RESTful and GraphQL APIs and scalable database schemas. Knowledge of MySQL/PostgreSQL indexing, NoSQL (ElasticSearch/DynamoDB), and caching (Redis). Knowledge of Containerization (Docker) and commonly used AWS services such as Lambda, EC2, S3, API Gateway, etc. Skilled in unit/integration (Jest, pytest) and E2E testing (Cypress, Playwright). Frontend profiling (Lighthouse) and backend tracing for performance tuning. Secure coding: OAuth2/JWT, XSS/CSRF protection, and familiarity with compliance regimes. Strong communicator able to convey technical trade-offs to non-technical stakeholders. Experience in reviewing pull requests and providing constructive feedback to the team. Qualities we'd love to find in you: The attitude to always strive for the best outcomes and an enthusiasm to deliver high-quality software Strong collaboration abilities and a flexible & friendly approach to working with teams Strong determination with a constant eye on solutions Creative ideas with a problem-solving mind-set Be open to receiving objective criticism and improving upon it Eagerness to learn and zeal to grow Strong communication skills are a huge plus Work Location: Hyderabad About NxtWave NxtWave is one of India’s fastest-growing ed-tech startups, revolutionizing the 21st-century job market. NxtWave is transforming youth into highly skilled tech professionals through its CCBP 4.0 programs, regardless of their educational background.
NxtWave is founded by Rahul Attuluri (Ex Amazon, IIIT Hyderabad), Sashank Reddy (IIT Bombay) and Anupam Pedarla (IIT Kharagpur). Supported by Orios Ventures, Better Capital, and Marquee Angels, NxtWave raised $33 million in 2023 from Greater Pacific Capital. As an official partner for NSDC (under the Ministry of Skill Development & Entrepreneurship, Govt. of India) and recognized by NASSCOM, NxtWave has earned a reputation for excellence. Some of its prestigious recognitions include: Technology Pioneer 2024 by the World Economic Forum, one of only 100 startups chosen globally ‘Startup Spotlight Award of the Year’ by T-Hub in 2023 ‘Best Tech Skilling EdTech Startup of the Year 2022’ by Times Business Awards ‘The Greatest Brand in Education’ in a research-based listing by URS Media NxtWave Founders Anupam Pedarla and Sashank Gujjula were honoured in the 2024 Forbes India 30 Under 30 for their contributions to tech education NxtWave breaks learning barriers by offering vernacular content for better comprehension and retention. NxtWave now has paid subscribers from 650+ districts across India. Its learners are hired by 2000+ companies including Amazon, Accenture, IBM, Bank of America, TCS, Deloitte and more. Know more about NxtWave: https://www.ccbp.in Read more about us in the news – Economic Times | CNBC | YourStory | VCCircle
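The qualifications above mention caching with Redis; here is a small cache-aside sketch of that pattern, assuming the redis-py client. fetch_user_from_db is a hypothetical stand-in for a real SQL lookup.

import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_user_from_db(user_id: int) -> dict:
    # Hypothetical placeholder for a database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)                 # cache hit: skip the database
    user = fetch_user_from_db(user_id)         # cache miss: read from the source
    cache.setex(key, 300, json.dumps(user))    # 5-minute TTL bounds staleness
    return user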
Posted 2 weeks ago
0 years
0 Lacs
India
Remote
Job Title: SQL Developer Intern Company: Enerzcloud Solutions Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data Engineering / Development About the Company: Enerzcloud Solutions is a forward-thinking technology company focused on delivering smart, scalable software and data solutions. We help businesses make better decisions through automation, data analysis, and cutting-edge development practices. Job Summary: We are seeking a dedicated and detail-oriented SQL Developer Intern to join our remote development team. This internship offers real-world exposure to writing SQL queries, managing databases, and supporting business intelligence and analytics processes. Key Responsibilities: Write and optimize SQL queries for data extraction and reporting Assist in designing, creating, and maintaining relational databases Perform data validation, transformation, and troubleshooting tasks Work on ETL processes and support data pipeline development Collaborate with data analysts and developers to fulfill data needs Document queries, schemas, and workflow processes Requirements: Pursuing or recently completed a degree in Computer Science, IT, or related field Strong foundational knowledge of SQL and relational databases Familiarity with MySQL, PostgreSQL, SQL Server, or similar platforms Understanding of normalization, joins, indexing, and query optimization Basic knowledge of Excel or BI tools is a plus Eager to learn and adapt in a remote work environment Perks & Benefits: ₹25,000/month stipend Real-world data and development project exposure Internship certificate upon successful completion Mentorship and learning support Flexible remote working
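For the SQL fundamentals this internship lists (joins, indexing, aggregation), here is a self-contained illustration using Python's standard-library sqlite3 module; the tables and data are invented for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
CREATE INDEX idx_orders_customer ON orders(customer_id);  -- speeds up the join
INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO orders VALUES (101, 1, 450.0), (102, 1, 120.0), (103, 2, 99.0);
""")

rows = conn.execute("""
SELECT c.name, COUNT(o.id) AS order_count, SUM(o.total) AS revenue
FROM customers AS c
JOIN orders AS o ON o.customer_id = c.id
GROUP BY c.name
ORDER BY revenue DESC;
""").fetchall()

for name, order_count, revenue in rows:
    print(f"{name}: {order_count} orders, {revenue:.2f} total")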
Posted 2 weeks ago
2.0 - 3.0 years
3 - 6 Lacs
New Delhi, Gurugram
Work from Office
Hiring for a US/UK Travel BPO with Meta/PPC call experience. Cruise/flight sales experience is a must, as is fluent English communication. Must be open to immediate joining and rotational shifts. Only candidates with this process experience can apply. Call Shristi: 7838882457 Required Candidate profile Current profile and salary brackets below: Customer Support - 30 to 45k; Sales (PPC/Meta/Cruise) - 40 to 65k; SEO - 30k; QA - up to 35k Perks and benefits Both-side transport, meal incentive
Posted 2 weeks ago
1.0 - 3.0 years
2 - 3 Lacs
Gurugram
Work from Office
Hiring SEO professionals with a minimum of 6 months' experience in a US Travel BPO. Only candidates with relevant experience can apply. Rotational shift. Salary up to 35k. Cab + meal + PF. Only immediate joiners; call or WhatsApp your updated resume to Shristi at 7838882457.
Posted 2 weeks ago
10.0 - 31.0 years
15 - 17 Lacs
Gurgaon/Gurugram
On-site
Job Title: Database Architect Role Purpose The Database Architect is responsible for defining and implementing high-level database strategies aligned with enterprise business objectives. This includes architecting scalable, secure, and sustainable database systems, ensuring efficient access and performance while driving innovation in database technologies. Key Responsibilities Strategic Planning & Architecture Define strategic database requirements and develop architectural strategies at the modeling, design, and implementation stages to meet enterprise-wide needs. Design scalable database systems capable of handling high transaction loads and supporting data growth beyond 60 TB. Database Design & Development Create robust database solutions that ensure system reliability, including physical structure, functional capabilities, performance, security, backup, and recovery protocols. Design efficient database applications, including data transfer mechanisms, temporary tables, partitions, and indexing strategies to optimize performance. Installation & Maintenance Install and configure database systems using optimal access techniques, while maintaining detailed documentation of installation actions and configurations. Monitor and maintain system performance, troubleshoot production and development issues, and perform necessary maintenance activities. Performance Tuning & Monitoring Analyze system resource utilization and optimize parameters to enhance performance. Develop and implement processes for database load balancing, system upgrades, and migrations with minimal downtime. Collaboration & Governance Collaborate with system architects, software engineers, and stakeholders to translate business needs into technical database requirements. Ensure compliance with database development standards and enforce regular process documentation aligned with internal policies. Data Recovery & Security Establish and maintain high-availability clusters, disaster recovery procedures, and secure access controls. Monitor system consumption trends to ensure uptime and scalability, recommending hardware enhancements when needed. Innovation & Best Practices Research and introduce innovative technologies to future-proof database systems. Develop and enforce best practices for data migrations, upgrades, and integration with enterprise architecture. Key Skills & Competencies Proven expertise in database architecture, optimization, and performance tuning for large-scale systems. Deep understanding of relational and non-relational databases (e.g., Oracle, SQL Server, PostgreSQL, MongoDB, etc.). Strong knowledge of data modeling, indexing strategies, backup/recovery methods, and high availability architectures. Excellent problem-solving and troubleshooting skills. Strong communication and stakeholder management capabilities. Experience with database migration, version upgrades, and system integration in a complex enterprise environment. Qualifications & Experience Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. 8–12 years of experience in database design and administration, with at least 3–5 years in a lead or architect role. Experience with enterprise-scale systems exceeding 60 TB in size. Hands-on experience with clustering, backup/restore strategies, and automation tools.
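As one concrete shape the partitioning and indexing strategies above can take, here is a hedged sketch of PostgreSQL declarative range partitioning (PostgreSQL being one of the engines the posting names), issued through psycopg2. The table, DSN, and monthly boundaries are illustrative assumptions; in practice new partitions would be created by automation ahead of the data arriving.

import psycopg2

DSN = "host=localhost dbname=warehouse user=dba"  # hypothetical

DDL = """
CREATE TABLE events (
    id         BIGINT GENERATED ALWAYS AS IDENTITY,
    created_at TIMESTAMPTZ NOT NULL,
    payload    JSONB
) PARTITION BY RANGE (created_at);

CREATE TABLE events_2025_01 PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');
CREATE TABLE events_2025_02 PARTITION OF events
    FOR VALUES FROM ('2025-02-01') TO ('2025-03-01');

-- a local index per partition keeps lookups fast as total volume grows
CREATE INDEX ON events_2025_01 (created_at);
CREATE INDEX ON events_2025_02 (created_at);
"""

with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
    cur.execute(DDL)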
Posted 2 weeks ago
2.0 - 31.0 years
2 - 3 Lacs
Meerut
On-site
Dear candidate, We are looking for a Bath Sales Associate for one of our renowned clients, for the Meerut location. Below are the details: Job Title: Bath Associate Job Purpose: Sales of various products of the Bath division, such as faucets, sanitaryware, accessories, etc., in a specified geography. Main Responsibilities: Achieving a target of Rs. 500,000 of secondary sales per month. Meeting retail consumers and customers such as plumbers, contractors and architects in the field, as well as walk-in customers at the store. Explaining product Features, Advantages & Benefits, with demonstrations wherever necessary, to sell the products. Creating demand for the product at the consumer level and directing consumers and influencers like plumbers, contractors & architects to the AP Home store. Completing the sales process by ensuring billing to the end consumers. Attending to consumers' complaints about use of the products and suggesting remedial measures. Collecting information regarding opportunities for sale, such as construction activity. 70% of time in the field catering to customers, APH Store walk-ins, architects and contractors, and 30% of time in the store to attend to customers. Reporting – For generating leads, maintaining a pipeline and a daily work plan, reporting will be to the APH SSO. For business generation through the leads, reporting will be to the Bath SH. Updating consumer and site details regularly in the LEAD App for visibility on indexing and business potential generated. Skills Required: Excellent communication and people skills Qualifications: Essential - MBA in Sales & Marketing Previous Experience: Essential - Sales experience of minimum 2–3 years Preferred - Having worked in a market development/sales role in a similar industry like building materials, plumbing or bath fittings
Posted 2 weeks ago
7.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Project Role: Data & Document Mgmt Processor Project Role Description: Perform end-to-end document management services according to service level agreements. This includes data digitization, data indexing, document scanning and maintenance, etc. Support initiatives with a focus on continuous improvement. Must have skills: Business Requirements Analysis Good to have skills: AWS Architecture Minimum 7.5 Year(s) Of Experience Is Required Educational Qualification: 15 years full time education Summary: As a Data and Document Management Processor, you will engage in a variety of tasks that ensure the effective management of documents and data. Your typical day will involve performing end-to-end document management services, which include data digitization, indexing, scanning, and maintenance of documents. You will also support initiatives aimed at continuous improvement, ensuring that all processes align with established service level agreements. Collaboration with various teams will be essential to enhance operational efficiency and drive improvements in document management practices. • 7+ years of experience • Essential skills: process modelling, excellent stakeholder management (across business and technical) and solution thought leadership, with the ability to translate the technical into business and vice versa • Experience in capital markets; desirable experience in Agile ways of working • Core BA skills – requirement elicitation, impact analysis, requirement documentation, user story creation, DoD, working with the PO on finalizing the PB, test support, and business readiness • JIRA + Confluence know-how • Agile methodology experience • Soft skills – business and stakeholder management • Process flow – conversant with Visio or draw.io • MS Office – proficient with Excel, PowerPoint and Word Additional Information: - The candidate should have a minimum of 7.5 years of experience in Business Requirements Analysis. - This position is based at our Bengaluru office. - 15 years of full-time education is required.
Posted 2 weeks ago
5.0 years
0 Lacs
India
On-site
Staff Software Engineer, Data Ingestion The Staff Software Engineer, Data Ingestion will be a critical individual contributor responsible for designing collection strategies and developing and maintaining robust, scalable data pipelines. This role is at the heart of our data ecosystem, delivering new analytical software solutions that provide access to timely, accurate, and complete data for insights, products, and operational efficiency. Key Responsibilities: Design, develop, and maintain high-performance, fault-tolerant data ingestion pipelines using Python. Integrate with diverse data sources (databases, APIs, streaming platforms, cloud storage, etc.). Implement data transformation and cleansing logic during ingestion to ensure data quality. Monitor and troubleshoot data ingestion pipelines, identifying and resolving issues promptly. Collaborate with database engineers to optimize data models for fast consumption. Evaluate and propose new technologies or frameworks to improve ingestion efficiency and reliability. Develop and implement self-healing mechanisms for data pipelines to ensure continuity. Define and uphold SLAs and SLOs for data freshness, completeness, and availability. Participate in on-call rotation as needed for critical data pipeline issues. Key Skills: 5+ years of experience, ideally with a background in computer science, working in software product companies. Extensive Python Expertise: Extensive experience in developing robust, production-grade applications with Python. Data Collection & Integration: Proven experience collecting data from various sources (REST APIs, OAuth, GraphQL, Kafka, S3, SFTP, etc.). Distributed Systems & Scalability: Strong understanding of distributed systems concepts, designing for scale, performance optimization, and fault tolerance. Cloud Platforms: Experience with major cloud providers (AWS or GCP) and their data-related services (e.g., S3, EC2, Lambda, SQS, Kafka, Cloud Storage, GKE). Database Fundamentals: Solid understanding of relational databases (SQL, schema design, indexing, query optimization). OLAP database experience is a plus (Hadoop). Monitoring & Alerting: Experience with monitoring tools (e.g., Prometheus, Grafana) and setting up effective alerts. Version Control: Proficiency with Git. Containerization (Plus): Experience with Docker and Kubernetes. Streaming Technologies (Plus): Experience with real-time data processing using Kafka, Flink, Spark Streaming.
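A minimal sketch of the fault-tolerant REST collection this role describes, assuming the requests library: a session with exponential-backoff retries on transient failures, paging through a hypothetical /events endpoint. The URL and paging scheme are invented for the example, and the allowed_methods parameter requires urllib3 1.26+.

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_session() -> requests.Session:
    retry = Retry(
        total=5,
        backoff_factor=1.0,               # 1s, 2s, 4s, ... between attempts
        status_forcelist=(429, 500, 502, 503, 504),
        allowed_methods=("GET",),
    )
    session = requests.Session()
    session.mount("https://", HTTPAdapter(max_retries=retry))
    return session

def ingest_events(base_url: str) -> None:
    session = make_session()
    page = 1
    while True:
        resp = session.get(f"{base_url}/events", params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break                          # no more pages to pull
        print(f"page {page}: {len(batch)} events")  # placeholder for a real sink
        page += 1

if __name__ == "__main__":
    ingest_events("https://api.example.com")  # hypothetical source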
Posted 2 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description What We Do At Goldman Sachs, our Engineers don’t just make things – we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets. Engineering, which comprises our Technology Division and global strategist groups, is at the critical centre of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here. Who We Look For Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile and more. We look for creative collaborators who evolve, adapt to change and thrive in a fast-paced global environment. About Data Engineering SRE Data plays a critical role in every facet of the Goldman Sachs business. The Data Engineering group is at the core of that offering, focusing on providing the platform, processes, and governance for enabling the availability of clean, organized, and impactful data to scale, streamline, and empower our core businesses. Within Data Engineering, we run and operate some of Goldman Sachs's largest platforms; our clients are engineers and analysts across all business units that depend on our platforms for daily business deliverables. As a Site Reliability Engineer (SRE) on the Data Engineering team, you will be responsible for observability, cost, and capacity, with operational accountability for some of Goldman Sachs's largest data platforms. We are engaged in the full lifecycle of platforms, from design to demise, with an SRE strategy adapted to that lifecycle. Who We Are Looking For You have a background as a developer and can express yourself in code. You have a focus on Reliability, Observability, Capacity Management, DevOps and SDLC (Software Development Lifecycle). You are a self-leader who is comfortable taking on problem statements with n-degrees of freedom and structuring them into data-driven deliverables. You drive strategy with “skin in the game”: you are on the rota with the team, you drive postmortems, and you have an attitude that the problem stops with you.
How You Will Fulfil Your Potential Drive adoption of cloud technology for data processing and warehousing You will drive SRE strategy for some of GS's largest platforms, including Lakehouse and Data Lake Engage with data consumers and producers to match reliability and cost requirements You will drive strategy with data Relevant Technologies: Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, GitLab Basic Qualifications A Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline) 1-4+ years of relevant work experience in a team-focused environment 1-2 years of hands-on developer experience at some point in your career Understanding and experience of DevOps and SRE principles and automation, managing technical and operational risk Experience with cloud infrastructure (AWS, Azure, or GCP) Proven experience in driving strategy with data Deep understanding of the multi-dimensionality of data, data curation and data quality, such as traceability, security, performance latency and correctness across supply and demand processes In-depth knowledge of relational and columnar SQL databases, including database design Expertise in data warehousing concepts (e.g. star schema, entitlement implementations, SQL vs. NoSQL modelling, milestoning, indexing, partitioning) Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts Independent thinker, willing to engage, challenge or learn Ability to stay commercially focused and to always push for quantifiable commercial impact Strong work ethic, a sense of ownership and urgency Strong analytical and problem-solving skills Ability to build trusted partnerships with key contacts and users across business and engineering teams Preferred Qualifications Understanding of Data Lake / Lakehouse technologies incl. Apache Iceberg Experience with cloud databases (e.g. Snowflake, BigQuery) Understanding of concepts of data modelling Working knowledge of open-source tools such as AWS Lambda, Prometheus Experience coding in Java or Python
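In the observability spirit of this role, here is a small instrumentation sketch with prometheus_client: expose a request counter and a latency histogram that Grafana can chart via PromQL. The metric names and simulated workload are illustrative only.

import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("pipeline_requests_total", "Pipeline requests", ["status"])
LATENCY = Histogram("pipeline_latency_seconds", "End-to-end pipeline latency")

@LATENCY.time()                               # records duration into the histogram
def handle_request() -> None:
    time.sleep(random.uniform(0.01, 0.1))     # stand-in for real work
    REQUESTS.labels(status="ok").inc()

if __name__ == "__main__":
    start_http_server(8000)                   # metrics served at :8000/metrics
    while True:
        handle_request()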
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. Oracle Data Integrator (ODI) Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solutions on both on-premises and Oracle Cloud. The key responsibilities may involve some or all of the areas listed below: Engage with clients to conduct workshops, understand business requirements and identify business problems to solve with integrations. Lead and build proofs-of-concept to showcase the value of ODI vs. other platforms. Socialize solution design and enable knowledge transfer. Drive train-the-trainer sessions to drive adoption of ODI. Partner with clients to drive outcomes and deliver value. Collaborate with cross-functional teams. Understand source applications and how they can be integrated. Analyze data sets to understand functional and business context. Create Data Warehousing data models and integration designs. Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC). Communicate development status and risks to key stakeholders. Lead the team to design, build, test and deploy. Support client needs by delivering ODI jobs and frameworks. Merge, customize and deploy the ODI data model as per client business requirements. Deliver large/medium DWH programs, demonstrating expert core consulting skills and an advanced level of ODI, SQL and PL/SQL knowledge and industry expertise to support delivery to clients. Focus on designing, building, and documenting re-usable code artifacts. Track, report and optimize ODI job performance to meet client SLAs. Design and architect ODI projects, including upgrades/migrations to cloud. Design and implement security in ODI. Identify risks and suggest mitigation plans. Lead the team and mentor junior practitioners. Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts. Perform system analysis, follow technical designs and work on development activities. Participate in design meetings, daily standups and backlog grooming. Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
Reviews and evaluates designs and project activities for compliance with systems design and development guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk. Develop the environment strategy, build the environment & execute migration plans. Validate that the environment meets all security and compliance controls. Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders. Contribute to sales pursuits by helping the pursuit team to understand the client request and propose robust solutions. Skills: Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions. Working knowledge of GIT or a similar source code control system. Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plans and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and high- and low-level design documents. Analytics & Cognitive Our Analytics & Cognitive team focuses on enabling our client’s end-to-end journey from On-Premises to Cloud, with opportunities in the areas of: Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of Cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling Cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators. Technical Requirements Education: B.E./B.Tech/M.C.A./M.Sc (CS) 6+ years of ETL lead/developer experience and a minimum of 3-4 years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL Minimum 2-3 end-to-end DWH implementation experiences Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Able to implement reusability, parameterization, workflow design, etc. Knowledge of ODI Master and Work repositories Knowledge of data modelling and ETL design Design and develop complex mappings, Process Flows and ETL scripts Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, Load plans, Migration of Objects Design and develop complex mappings, Process Flows and ETL scripts Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Experience of performance tuning of mappings Ability to design ETL unit test cases and debug ETL Mappings Expertise in developing Load Plans, Scheduling Jobs Integrate ODI with multiple Source / Target Experience in Data Migration using SQL loader, import/export Consulting Requirements 6-10 years of relevant consulting, industry or technology experience Proven experience assessing client’s workloads and technology landscape for Cloud suitability. Experience in defining new architectures and ability to drive project from architecture standpoint. Ability to quickly establish credibility and trustworthiness with key stakeholders in client organization. Strong problem solving and troubleshooting skills. Strong communicator Willingness to travel in case of project requirement. Preferred Experience in Oracle BI Apps Exposure to one or more of the following: Python, R or UNIX shell scripting. Expertise in database development (SQL/ PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions Working knowledge of GIT or similar source code control system Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in oracle Experience in SQL tuning and optimization using explain plan and SQL trace files Partitioning and Indexing strategy for optimal performance Good verbal and written communication in English, Strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and High- and Low-level design documents Systematic problem-solving approach, coupled with strong communication skills Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX Scripting and/or similar. Experience working with technical customers. How You’ll Grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning Center in the Hyderabad offices is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. 
We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our community. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302894
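Since SQL tuning with explain plans is one of the skills this posting names, here is a hedged sketch of that workflow using the python-oracledb driver: run EXPLAIN PLAN for a statement, then read it back through DBMS_XPLAN.DISPLAY. The connection details and the sales table are hypothetical placeholders.

import oracledb

# Hypothetical credentials and service name; adjust for your environment.
conn = oracledb.connect(user="etl", password="secret", dsn="localhost/XEPDB1")
cur = conn.cursor()

# Populate the plan table for a sample statement (it is not executed).
cur.execute("EXPLAIN PLAN FOR SELECT * FROM sales WHERE region_id = 10")

# Render the plan that the optimizer chose.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)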
Posted 2 weeks ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
Oracle Data Integrator (ODI) Specialist
As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in a hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed.
Work You'll Do
As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another might involve building an ETL solution both on-premises and on Oracle Cloud. The key responsibilities may involve some or all of the areas listed below:
Engage with clients to conduct workshops, understand business requirements, and identify business problems to solve with integrations.
Lead and build proofs-of-concept to showcase the value of ODI versus other platforms.
Socialize solution design and enable knowledge transfer.
Run train-the-trainer sessions to drive adoption of ODI.
Partner with clients to drive outcomes and deliver value.
Collaborate with cross-functional teams.
Understand source applications and how they can be integrated.
Analyze data sets to understand their functional and business context.
Create data warehousing data models and integration designs.
Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), and Project to Complete (PTC).
Communicate development status and risks to key stakeholders.
Lead the team to design, build, test, and deploy.
Support client needs by delivering ODI jobs and frameworks.
Merge, customize, and deploy ODI data models as per client business requirements.
Deliver large/medium DWH programs; demonstrate expert core consulting skills, an advanced level of ODI, SQL, and PL/SQL knowledge, and industry expertise to support delivery to clients.
Focus on designing, building, and documenting reusable code artifacts.
Track, report, and optimize ODI job performance to meet client SLAs.
Design and architect ODI projects, including upgrades/migrations to the cloud.
Design and implement security in ODI.
Identify risks and suggest mitigation plans.
Lead the team and mentor junior practitioners.
Produce high-quality code resulting from knowledge of the tool, peer code review, and automated unit test scripts.
Perform system analysis, follow technical designs, and work on development activities.
Participate in design meetings, daily standups, and backlog grooming.
Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
Review and evaluate designs and project activities for compliance with systems design and development guidelines and standards; provide tangible feedback to improve product quality and mitigate failure risk.
Develop environment strategy, build the environment, and execute migration plans.
Validate that the environment meets all security and compliance controls.
Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders.
Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions.
Skills:
Expertise in database development (SQL/PLSQL) for PL/SQL-based applications.
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions.
Working knowledge of Git or a similar source code control system.
Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle.
Experience in SQL tuning and optimization using explain plans and SQL trace files (a short sketch appears at the end of this posting).
Partitioning and indexing strategies for optimal performance.
Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities.
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents.
Analytics & Cognitive
Our Analytics & Cognitive team focuses on enabling our clients' end-to-end journey from on-premises to cloud, with opportunities in the areas of cloud strategy, op model transformation, cloud development, cloud integration & APIs, cloud migration, cloud infrastructure & engineering, and cloud managed services. We help our clients see the transformational capabilities of cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling cloud. We accelerate our clients toward a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators.
Technical Requirements
Education: B.E./B.Tech/M.C.A./M.Sc (CS)
6+ years of ETL lead/developer experience and a minimum of 3-4 years' experience in Oracle Data Integrator (ODI)
Expertise in the Oracle ODI toolset and Oracle PL/SQL
Minimum 2-3 end-to-end DWH implementations
Experience in developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.
Ability to implement reusability, parameterization, workflow design, etc.
Knowledge of the ODI master and work repositories
Knowledge of data modelling and ETL design
Design and develop complex mappings, process flows, and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, agents, etc.
Packaging components, database operations like aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects
Design and develop complex mappings, process flows, and ETL scripts
Must be well versed and hands-on in using and customizing Knowledge Modules (KMs)
Experience in performance tuning of mappings
Ability to design ETL unit test cases and debug ETL mappings
Expertise in developing load plans and scheduling jobs
Experience integrating ODI with multiple source/target systems
Experience in data migration using SQL*Loader and import/export
Consulting Requirements
6-10 years of relevant consulting, industry, or technology experience
Proven experience assessing clients' workloads and technology landscape for cloud suitability
Experience in defining new architectures and the ability to drive a project from an architecture standpoint
Ability to quickly establish credibility and trustworthiness with key stakeholders in the client organization
Strong problem-solving and troubleshooting skills
Strong communicator
Willingness to travel when a project requires it
Preferred
Experience in Oracle BI Apps
Exposure to one or more of the following: Python, R, or UNIX shell scripting
Expertise in database development (SQL/PLSQL) for PL/SQL-based applications
Experience in designing and developing Oracle objects such as tables, views, indexes, partitions, stored procedures and functions in PL/SQL, packages, materialized views, and analytical functions
Working knowledge of Git or a similar source code control system
Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle
Experience in SQL tuning and optimization using explain plans and SQL trace files
Partitioning and indexing strategies for optimal performance
Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities
Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
A systematic problem-solving approach, coupled with strong communication skills
Ability to debug and optimize code and automate routine tasks
Experience writing scripts in one or more languages such as Python, UNIX shell scripting, or similar
Experience working with technical customers
How You'll Grow
At Deloitte, we've invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in our Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits.
Deloitte's culture
Our positive and supportive culture encourages our people to do their best work every day.
We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives.
Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our community.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 302894
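The "SQL tuning and optimization using explain plans" requirement above lends itself to a short illustration. Below is a minimal sketch of pulling an Oracle execution plan from Python with the python-oracledb driver; the credentials, DSN, table names, and query are illustrative assumptions, not details from the posting.

```python
# Minimal sketch: inspecting an Oracle execution plan from Python.
# Assumes python-oracledb is installed; credentials/DSN are placeholders.
import oracledb

conn = oracledb.connect(user="etl_user", password="***", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Ask the optimizer for a plan without executing the statement.
cur.execute("""
    EXPLAIN PLAN FOR
    SELECT o.order_id, c.customer_name
    FROM   orders o JOIN customers c ON c.customer_id = o.customer_id
    WHERE  o.order_date >= DATE '2024-01-01'
""")

# DBMS_XPLAN.DISPLAY renders the most recently explained plan in this session.
for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
    print(line)
```

Reading the plan for full table scans, join order, and estimated cardinalities is typically the first step before deciding on an index or a query rewrite.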
Posted 2 weeks ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Job Description: As an ELK (Elasticsearch, Logstash & Kibana) Data Engineer, you will be responsible for developing, implementing, and maintaining ELK stack-based solutions for Kyndryl's clients. The role covers efficient and effective data and log ingestion, processing, indexing, and visualization for monitoring, troubleshooting, and analysis purposes.
Key Responsibilities:
Configure Logstash to receive, filter, and transform logs from diverse sources (e.g., servers, applications, APM tools such as AppDynamics, storage, and databases) before sending them to Elasticsearch.
Configure ILM policies, index templates, etc. (a minimal sketch follows below).
Develop Logstash configuration files to parse, enrich, and filter log data from various input sources (e.g., APM tools, databases, storage).
Implement techniques like grok patterns, regular expressions, and plugins to handle complex log formats and structures.
Ensure efficient and reliable data ingestion by optimizing Logstash performance, handling high data volumes, and managing throughput.
Utilize Kibana to create visually appealing dashboards, reports, and custom visualizations.
Collaborate with business users to understand their data integration and visualization needs and translate them into technical solutions.
Establish correlations within the data and develop visualizations to detect the root cause of issues.
Integrate with ticketing tools such as ServiceNow.
Work hands-on with ML and Watcher functionalities.
Monitor Elasticsearch clusters for health, performance, and resource utilization.
Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides.
Who You Are
Education, Experience, and Certification Requirements:
BS or MS degree in Computer Science or a related technical field
5+ years of overall IT industry experience
3+ years of development experience with Elasticsearch, Logstash, and Kibana in designing, building, and maintaining log and data processing systems
3+ years of Python or Java development experience
4+ years of SQL experience (NoSQL experience is a plus)
4+ years of experience with schema design and dimensional data modelling
Experience working with Machine Learning models is a plus
Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes) is a plus
"Elastic Certified Engineer" certification is preferable
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
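As a companion to the ILM-policy and index-template responsibilities above, here is a minimal sketch using the official elasticsearch Python client (8.x API); the endpoint, API key, policy name, and rollover thresholds are assumptions for illustration only.

```python
# Sketch: register a simple ILM policy and attach it to new indices
# via an index template, using the elasticsearch 8.x Python client.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # hypothetical endpoint

# Roll hot indices over at 50 GB or 7 days; delete indices after 30 days.
es.ilm.put_lifecycle(
    name="app-logs-policy",
    policy={
        "phases": {
            "hot": {"actions": {"rollover": {"max_primary_shard_size": "50gb",
                                             "max_age": "7d"}}},
            "delete": {"min_age": "30d", "actions": {"delete": {}}},
        }
    },
)

# New indices matching app-logs-* pick up the policy automatically.
es.indices.put_index_template(
    name="app-logs-template",
    index_patterns=["app-logs-*"],
    template={"settings": {"index.lifecycle.name": "app-logs-policy",
                           "index.lifecycle.rollover_alias": "app-logs"}},
)
```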
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate and build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Posted 2 weeks ago
0.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Bengaluru, Karnataka
Job ID 30185740
Job Category Digital Technology
Role: SQL Developer with data modeling and AWS/Azure
Location: Bangalore
Full/Part-time: Full-time
Build a career with confidence: Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.
About the Role: We are looking for an SQL Developer with an ETL background and experience on the AWS or Azure cloud platform.
Job Description:
Design, develop, and implement scalable and efficient data warehouse solutions on cloud platforms using Azure Fabric, AWS Redshift, etc.
Create and optimize data models to support business reporting and analytical needs.
Integrate data using ETL tools like Azure Data Factory.
Write complex SQL queries, stored procedures, and functions for data manipulation and analysis.
Implement data quality checks and validation processes to ensure data accuracy and integrity (a small sketch follows at the end of this posting).
Monitor and optimize data warehouse performance, including query tuning, indexing, and data partitioning strategies.
Identify and troubleshoot data-related issues, ensuring data availability and reliability.
Collaborate with data architects, data engineers, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
Analytical Skills: Strong problem-solving, analytical, and critical thinking skills.
Preferred Skills & Tools for this role (7 to 10 years of experience in the skill sets below):
Cloud Platforms: Azure (Data Factory, Azure Fabric, SQL DB, Data Lake) or AWS (Redshift); any Azure tools or AWS
Databases: PostgreSQL or MSSQL
ETL Tools: Azure Data Factory or any ETL tool experience
Languages: Expert-level proficiency in T-SQL and Python
BI Tools: Power BI or similar (Tableau or Spotfire)
Version Control & DevOps: Azure DevOps, Git (any of these is preferred)
Benefits: We are committed to offering competitive benefits programs for all our employees and enhancing our programs when necessary. Make yourself a priority with flexible schedules and parental leave. Drive forward your career through professional development opportunities. Achieve your personal goals with our Employee Assistance Programme.
Our commitment to you: Our greatest assets are the expertise, creativity, and passion of our employees. We strive to provide a great place to work that attracts, develops, and retains the best talent, promotes employee engagement, fosters teamwork, and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback, and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now!
Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
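To make the "data quality checks and validation" duty above concrete, here is a small sketch, assuming a SQL Server warehouse reachable over pyodbc; the DSN, schema, and table names are invented for illustration.

```python
# Sketch: post-load data quality checks against a warehouse fact table.
import pyodbc

conn = pyodbc.connect("DSN=warehouse;UID=etl;PWD=***")  # hypothetical DSN
cur = conn.cursor()

checks = {
    # Fact rows must always carry a customer key.
    "null_keys": "SELECT COUNT(*) FROM dw.fact_sales WHERE customer_key IS NULL",
    # Every fact row must join to the customer dimension.
    "orphan_rows": """
        SELECT COUNT(*) FROM dw.fact_sales f
        LEFT JOIN dw.dim_customer c ON c.customer_key = f.customer_key
        WHERE c.customer_key IS NULL""",
    # The business key must be unique.
    "duplicate_orders": """
        SELECT COUNT(*) FROM (SELECT order_id FROM dw.fact_sales
                              GROUP BY order_id HAVING COUNT(*) > 1) d""",
}

violations = {name: cur.execute(sql).fetchval() for name, sql in checks.items()}
failed = {k: v for k, v in violations.items() if v}
if failed:
    raise RuntimeError(f"Data quality checks failed: {failed}")
```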
Posted 2 weeks ago
1.0 - 2.0 years
3 - 6 Lacs
New Delhi, Gurugram
Work from Office
Hiring for US/UK travel BPO with Meta/PPC call experience. Cruise/flight sales experience is a must. Fluent English communication. Open to immediate joining; rotational shifts are a must. No other process experience can apply. Call Shristi: 7838882457
Required Candidate profile
Current profile and salary brackets as below:
Customer Support: 30 to 45k
Sales: 40 to 65k (PPC/Meta/Cruise)
SEO: 30k
QA: up to 35k
Perks and benefits
Both-side transport, meal incentive
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
Must have:
Strong in programming languages like Python and Java
Hands-on experience with at least one cloud (GCP preferred)
Experience working with Docker
Environment management (e.g., venv, pip, poetry)
Experience with orchestrators like Vertex AI Pipelines, Airflow, etc.
Understanding of the full ML lifecycle end-to-end
Data engineering and feature engineering techniques
Experience with ML modelling and evaluation metrics
Experience with TensorFlow, PyTorch, or another framework
Experience with model monitoring
Advanced SQL knowledge
Awareness of streaming concepts like windowing, late arrival, and triggers (a small sketch follows at the end of this posting)
Good To Have
Hyperparameter tuning experience
Proficiency in Apache Spark, Apache Beam, or Apache Flink
Hands-on experience with distributed computing
Working experience in data architecture design
Awareness of storage and compute options and when to choose what
Good understanding of cluster optimisation/pipeline optimisation strategies
Exposure to GCP tools for developing end-to-end data pipelines for various scenarios (including ingesting data from traditional databases as well as integrating API-based data sources)
A business mindset to understand data and how it will be used for BI and analytics purposes
Working experience with CI/CD pipelines, deployment methodologies, and infrastructure as code (e.g., Terraform)
Hands-on experience with Kubernetes
Vector databases like Qdrant
LLM experience (embeddings generation, embeddings indexing, RAG, agents, etc.)
Key Responsibilities
Design, develop, and implement AI models and algorithms using Python and Large Language Models (LLMs).
Collaborate with data scientists, engineers, and business stakeholders to define project requirements and deliver impactful AI-driven solutions.
Optimize and manage data pipelines, ensuring efficient data storage and retrieval with PostgreSQL.
Continuously research emerging AI trends and best practices to enhance model performance and capabilities.
Deploy, monitor, and maintain AI applications in production environments, adhering to industry best practices.
Document technical designs, workflows, and processes to facilitate clear knowledge transfer and project continuity.
Communicate technical concepts effectively to both technical and non-technical team members.
Skills and Qualifications: Proven expertise in Python programming for AI/ML applications.
(ref:hirist.tech)
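For readers unfamiliar with the windowing, late-arrival, and trigger concepts listed above, here is a small sketch in the Apache Beam Python SDK (one of the frameworks the posting names); the sample events, window size, and lateness threshold are assumptions for illustration.

```python
# Sketch: fixed event-time windows with a watermark trigger that also
# fires once per late element, accumulating results across firings.
import apache_beam as beam
from apache_beam.transforms import trigger, window

# (key, event-time in seconds) pairs; a real pipeline would read a stream.
events = [("user1", 10.0), ("user2", 35.0), ("user1", 70.0)]

with beam.Pipeline() as p:
    (
        p
        | beam.Create(events)
        | beam.Map(lambda kv: window.TimestampedValue((kv[0], 1), kv[1]))
        | beam.WindowInto(
            window.FixedWindows(60),                 # 1-minute event-time windows
            trigger=trigger.AfterWatermark(
                late=trigger.AfterCount(1)),         # re-fire once per late element
            allowed_lateness=600,                    # accept data up to 10 min late
            accumulation_mode=trigger.AccumulationMode.ACCUMULATING,
        )
        | beam.CombinePerKey(sum)                    # per-key counts per window
        | beam.Map(print)
    )
```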
Posted 2 weeks ago
7.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title: Senior Python Developer
Key Responsibilities
Software Development: Design, develop, test, and deploy high-quality Python applications and services.
API Development: Build and maintain robust and scalable APIs using frameworks like FastAPI or Flask (a minimal sketch follows at the end of this posting).
Database Management: Design database schemas, write complex SQL queries, and optimize database performance for PostgreSQL.
System Design: Contribute to the architectural design of new features and systems, ensuring scalability, reliability, and maintainability.
Containerization & Orchestration: Implement and manage applications within containerized environments using Docker and orchestrate deployments with Kubernetes.
CI/CD Implementation: Work with CI/CD pipelines to ensure automated testing, deployment, and continuous integration.
Troubleshooting & Debugging: Identify, diagnose, and resolve complex technical issues in production and development environments.
Code Quality: Ensure code quality through rigorous testing, code reviews, and adherence to best practices.
Project Ownership: Take ownership of projects, driving them independently from conception to successful deployment and maintenance.
Collaboration: Collaborate effectively with cross-functional teams, including product managers, other engineers, and QA.
Required Skills & Experience
Python Expertise: 7+ years of professional experience in Python development, with a strong understanding of Pythonic principles and best practices.
Web Frameworks: Strong experience with FastAPI (or Flask, with a willingness to quickly adapt and switch to FastAPI).
Database Proficiency: Proficiency in PostgreSQL, including advanced SQL querying, database design, indexing strategies, and performance tuning.
Containerization & Orchestration: Solid understanding and hands-on experience with Kubernetes for container orchestration and microservices deployment.
Development Tools: Experience with Docker for containerization, Git for version control, and implementing/managing CI/CD pipelines (e.g., Jenkins, GitLab CI/CD, GitHub Actions).
Data Structures & Algorithms: Strong background in data structures, algorithms, and their practical application in solving complex problems.
System Design: Proven ability in designing scalable, resilient, and performant software systems.
Independent Work: Demonstrated ability to work independently, take initiative, and drive projects end-to-end with minimal supervision.
Communication: Good communication skills, both written and verbal, with the ability to articulate technical concepts clearly and concisely.
Education & Certifications
Bachelor's degree in Computer Science, Software Engineering, or a related technical field. A Master's degree is a plus.
Relevant certifications in Python, cloud platforms, or container technologies are a plus.
(ref:hirist.tech)
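Since the posting centers on FastAPI backed by PostgreSQL, a minimal endpoint sketch may help frame the stack; the table, DSN, and model below are illustrative assumptions, not details from the role.

```python
# Sketch: a FastAPI endpoint reading from PostgreSQL via psycopg2.
import psycopg2
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str

def get_conn():
    # Hypothetical DSN; production code would use a pool and env vars.
    return psycopg2.connect("dbname=appdb user=app password=***")

@app.get("/items/{item_id}", response_model=Item)
def read_item(item_id: int):
    with get_conn() as conn, conn.cursor() as cur:
        cur.execute("SELECT id, name FROM items WHERE id = %s", (item_id,))
        row = cur.fetchone()
    if row is None:
        raise HTTPException(status_code=404, detail="Item not found")
    return Item(id=row[0], name=row[1])
```

Assuming the file is named main.py, it can be served locally with, e.g., `uvicorn main:app --reload`.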
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have strong algorithm and logic-building capability, along with the ability to prototype rapidly. You must be fluent in MSSQL and have a deep understanding of entity-relationship concepts, normalization/denormalization, indexing, and performance monitoring and tuning (a small sketch follows at the end of this posting). Your role will involve writing optimized, effective, reusable, and scalable code; analyzing existing SQL queries for performance improvements; and testing and debugging, with refactoring capability. Implementing security and data protection solutions, knowledge of RESTful APIs and microservice environments, understanding of Agile, and creating technical documentation are also key responsibilities. Additionally, you should possess the soft skills to work in a team environment and excel in a startup environment with a high level of ownership and commitment. Writing unit and integration test cases is also expected.
As for the qualifications, you should hold a Bachelor's degree in EE, CS, ME, or equivalent, with a minimum of 2+ years of experience. Strong written and verbal communication skills are necessary. Hands-on experience with MSSQL is a must, along with experience with one of the AWS, GCP, or Azure clouds. Some understanding of building scalable and reliable products in the cloud, the ability to prioritize end-to-end, debug, and develop modular code, and thinking outside the box to discover innovative solutions for complex data management issues are also required.
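To ground the indexing and query-tuning points above, here is a small sketch against SQL Server via pyodbc; the DSN, table, and index names are invented for illustration.

```python
# Sketch: a covering index plus a parameterized lookup in MSSQL.
import pyodbc

conn = pyodbc.connect("DSN=appdb;UID=dev;PWD=***")  # hypothetical DSN
cur = conn.cursor()

# Covering index: the INCLUDE column lets the query below be answered
# from the index alone, avoiding key lookups into the clustered index.
cur.execute("""
    CREATE NONCLUSTERED INDEX IX_orders_customer_date
    ON dbo.orders (customer_id, order_date)
    INCLUDE (total_amount)
""")
conn.commit()

# Parameterized query: encourages plan reuse and avoids SQL injection.
rows = cur.execute(
    "SELECT order_date, total_amount FROM dbo.orders "
    "WHERE customer_id = ? AND order_date >= ?",
    (42, "2024-01-01"),
).fetchall()
```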
Posted 2 weeks ago
7.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Project Role: Data & Document Mgmt Processor
Project Role Description: Perform end-to-end document management services according to service level agreements. This includes data digitization, data indexing, document scanning and maintenance, etc. Support initiatives with a focus on continuous improvement.
Must have skills: Business Requirements Analysis
Good to have skills: AWS Architecture
Minimum 7.5 years of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data and Document Management Processor, you will engage in a variety of tasks that ensure the effective management of documents and data. Your typical day will involve performing end-to-end document management services, adhering to service level agreements. This includes activities such as data digitization, data indexing, document scanning, and maintenance. You will also support initiatives aimed at continuous improvement, collaborating with various teams to enhance processes and outcomes.
JD:
7+ years of experience
Essential skills: process modelling, excellent stakeholder management (across business and technical), and solution thought leadership, with the ability to translate the technical into business terms and vice versa
Experience in capital markets
Desirable: experience in Agile ways of working
Core BA skills: requirement elicitation, impact analysis, requirement documentation, user story creation, DoD, working with the PO to finalize the product backlog, test support, and business readiness
JIRA + Confluence know-how
Agile methodology experience
Soft skills: business and stakeholder management
Process flow: conversant with Visio or draw.io
MS Office: proficient with Excel, PowerPoint, and Word
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Business Requirements Analysis.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago