
8 SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

As part of the offshore development team, AWS Developers will implement ingestion and transformation pipelines using PySpark, orchestrate jobs via MWAA, and convert legacy Cloudera jobs to AWS-native services.
Key Responsibilities: • Write ingestion scripts (batch and streaming) to migrate data from on-prem to S3. • Translate existing HiveQL into Spark SQL/PySpark jobs. • Configure MWAA DAGs to orchestrate job dependencies. • Build Iceberg tables with appropriate partitioning and metadata handling. • Validate job outputs and write unit tests.
Required Skills: • 3-5 years in data engineering, with strong exposure to AWS. • Experience with EMR (Spark), S3, PySpark, and SQL. • Working knowledge of Cloudera/HDFS and legacy Hadoop pipelines. • Prior experience with data lake/lakehouse implementations is a plus.
Mandatory Skills: AWS Developer
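The "validate job outputs" step after a migration typically reconciles the source against the migrated target, for example by comparing row counts and aggregates. A minimal sketch of that check, using SQLite in place of Hive/S3 tables (table and column names are hypothetical):

```python
import sqlite3

# Stand-ins for the on-prem source and the migrated target; in a real
# pipeline these would be Spark DataFrames or Athena/Iceberg queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 7.25);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5), (3, 7.25);
""")

def reconcile(conn, src, tgt):
    """Compare row counts and a simple aggregate between two tables."""
    checks = {}
    for name, table in (("src", src), ("tgt", tgt)):
        checks[name] = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
        ).fetchone()
    return checks["src"] == checks["tgt"]

print(reconcile(conn, "src", "tgt"))  # True when counts and sums match
```

A real validation suite would add per-partition checksums, but the shape of the check is the same.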

Posted 9 hours ago


4.0 - 8.0 years

6 - 11 Lacs

Pune, Maharashtra, India

On-site

As part of the offshore development team, AWS Developers will implement ingestion and transformation pipelines using PySpark, orchestrate jobs via MWAA, and convert legacy Cloudera jobs to AWS-native services.
Key Responsibilities: • Write ingestion scripts (batch and streaming) to migrate data from on-prem to S3. • Translate existing HiveQL into Spark SQL/PySpark jobs. • Configure MWAA DAGs to orchestrate job dependencies. • Build Iceberg tables with appropriate partitioning and metadata handling. • Validate job outputs and write unit tests.
Required Skills: • 3-5 years in data engineering, with strong exposure to AWS. • Experience with EMR (Spark), S3, PySpark, and SQL. • Working knowledge of Cloudera/HDFS and legacy Hadoop pipelines. • Prior experience with data lake/lakehouse implementations is a plus.
Mandatory Skills: AWS Developer
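The MWAA orchestration in this role comes down to declaring tasks and their upstream dependencies as a DAG; the scheduler derives the execution order. The ordering such a DAG encodes can be sketched with Python's stdlib graphlib (task names are illustrative, not from the posting):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks and their upstream dependencies, mirroring
# what `ingest >> transform >> build >> validate` would declare in Airflow.
deps = {
    "transform_hive_to_spark": {"ingest_to_s3"},
    "build_iceberg_tables": {"transform_hive_to_spark"},
    "validate_outputs": {"build_iceberg_tables"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

In an actual MWAA DAG the same dependencies would be expressed with operators and `>>`, and the scheduler, not your code, performs this sort.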

Posted 9 hours ago


4.0 - 8.0 years

6 - 11 Lacs

Hyderabad, Telangana, India

On-site

As part of the offshore development team, AWS Developers will implement ingestion and transformation pipelines using PySpark, orchestrate jobs via MWAA, and convert legacy Cloudera jobs to AWS-native services.
Key Responsibilities: • Write ingestion scripts (batch and streaming) to migrate data from on-prem to S3. • Translate existing HiveQL into Spark SQL/PySpark jobs. • Configure MWAA DAGs to orchestrate job dependencies. • Build Iceberg tables with appropriate partitioning and metadata handling. • Validate job outputs and write unit tests.
Required Skills: • 3-5 years in data engineering, with strong exposure to AWS. • Experience with EMR (Spark), S3, PySpark, and SQL. • Working knowledge of Cloudera/HDFS and legacy Hadoop pipelines. • Prior experience with data lake/lakehouse implementations is a plus.
Mandatory Skills: AWS Developer
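On the Iceberg side, partitioning is "hidden": a transform such as days(ts) derives partition values from a column, and the writer routes rows to per-partition data files. A plain-Python sketch of that bucketing (schema and records are invented for illustration):

```python
from collections import defaultdict
from datetime import datetime, timezone

def days_transform(ts: datetime) -> str:
    """Iceberg-style days() transform: map a timestamp to its UTC day."""
    return ts.astimezone(timezone.utc).strftime("%Y-%m-%d")

records = [
    {"id": 1, "ts": datetime(2024, 5, 1, 23, 30, tzinfo=timezone.utc)},
    {"id": 2, "ts": datetime(2024, 5, 2, 0, 15, tzinfo=timezone.utc)},
    {"id": 3, "ts": datetime(2024, 5, 2, 9, 0, tzinfo=timezone.utc)},
]

# Group rows by their derived partition value, as a writer would when
# routing rows into per-partition files.
partitions = defaultdict(list)
for rec in records:
    partitions[days_transform(rec["ts"])].append(rec["id"])

print(dict(partitions))  # {'2024-05-01': [1], '2024-05-02': [2, 3]}
```

In Spark the equivalent is a `PARTITIONED BY (days(ts))` clause on the Iceberg table; queries then prune partitions without the reader ever naming the partition column.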

Posted 9 hours ago


4.0 - 8.0 years

4 - 9 Lacs

Mumbai

Hybrid

Role: AWS DevOps Engineer
Location: Mumbai, India
Experience: 4-8 Years
Role Overview: We are seeking a skilled AWS DevOps Engineer with 4-8 years of experience to join our team in Mumbai. The ideal candidate will have a strong background in DevOps practices, extensive experience with Terraform, and proficiency in managing AWS infrastructure. You will play a pivotal role in designing, implementing, and managing robust, scalable cloud solutions while ensuring seamless CI/CD processes and infrastructure automation.
Key Responsibilities: • Design, deploy, and manage scalable, secure, and highly available AWS environments. • Automate infrastructure provisioning, configuration, and deployment using Terraform. • Implement and optimize CI/CD pipelines to support efficient code deployment. • Collaborate with development teams to ensure seamless integration of DevOps practices. • Monitor system performance, troubleshoot issues, and ensure high uptime. • Manage and optimize AWS services, including EC2, S3, RDS, Lambda, and CloudFormation. • Implement security best practices in cloud infrastructure and applications. • Develop scripts and tools to enhance automation (e.g., Bash, Python). • Apply Infrastructure as Code (IaC) principles to manage cloud resources. • Work closely with cross-functional teams to align infrastructure with application requirements. • Maintain clear documentation of system architecture, workflows, and procedures.
Key Skills and Qualifications: • Strong hands-on experience with AWS cloud services. • Proficiency in Terraform for Infrastructure as Code (IaC). • Sound knowledge of DevOps tools such as Jenkins, Git, Docker, and Kubernetes. • Experience with CI/CD pipeline creation and management. • Scripting experience (e.g., Python, Shell, or Bash).
Experience: • 4-8 years of relevant experience in DevOps, AWS cloud management, and infrastructure automation. • Demonstrated ability to troubleshoot and resolve complex system issues. • Excellent communication and teamwork skills. • Ability to handle multiple tasks and prioritize effectively in a fast-paced environment. • A proactive attitude toward learning and problem-solving.
Preferred Qualifications: • AWS certifications (e.g., AWS Certified DevOps Engineer, Solutions Architect). • Experience with monitoring tools like Prometheus, Grafana, or CloudWatch. • Exposure to container orchestration tools like Kubernetes or ECS.
Why Join Us? • Be part of a dynamic team working on cutting-edge technologies. • Opportunity to enhance your technical skills in a supportive environment. • Competitive salary and benefits. • Hybrid working model based in Mumbai.
Application Process: If you are passionate about cloud technologies and DevOps practices and thrive in challenging environments, we would love to hear from you. Apply now!
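At its core, the Terraform work described above is reconciliation: diff the desired state against the actual state to produce a plan of creates, updates, and destroys. A rough Python sketch of that planning step (the resource maps and names are invented, not Terraform internals):

```python
def plan(desired: dict, actual: dict) -> dict:
    """Compute a Terraform-style plan: what to create, update, destroy."""
    return {
        "create": sorted(set(desired) - set(actual)),
        "destroy": sorted(set(actual) - set(desired)),
        "update": sorted(
            k for k in set(desired) & set(actual) if desired[k] != actual[k]
        ),
    }

# Desired state as declared in .tf files vs. actual state from the provider.
desired = {
    "aws_instance.web": {"type": "t3.micro", "count": 2},
    "aws_s3_bucket.logs": {"versioning": True},
}
actual = {
    "aws_instance.web": {"type": "t3.micro", "count": 1},
    "aws_sqs_queue.old": {"fifo": False},
}

print(plan(desired, actual))
```

`terraform plan` does far more (dependency graphs, providers, in-place vs. replace decisions), but this diff is the mental model behind it.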

Posted 3 days ago


2.0 - 4.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Key Skills: AWS, GCP, ETL, Azure, SQL.
Roles and Responsibilities: • Diagnose and resolve intermediate to complex technical issues related to SQL query performance, data access, and system configuration. • Communicate technical solutions clearly and effectively to customers while collaborating with product and engineering teams. • Participate in an on-call rotation to provide incident response and remote issue resolution for critical cases. • Create and maintain comprehensive technical documentation and knowledge base articles based on resolved support cases. • Proactively analyze recurring customer issues to recommend improvements to product features, engineering practices, and support processes. • Optimize customer environments by understanding and anticipating technical needs to drive platform adoption and customer satisfaction.
Experience Requirement: • 2-4 years of experience in a technical support or engineering role, preferably supporting database systems, query engines, or data processing platforms. • Strong expertise in SQL with a demonstrated ability to troubleshoot and optimize queries. • Hands-on experience with cloud platforms (AWS, Azure, or GCP), especially infrastructure, containers, networking, and security. • Familiarity with data lake architecture and technologies. • Understanding of data pipeline concepts and common ETL workflows is preferred.
Education: Any Post Graduation, Any Graduation.
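Diagnosing SQL query performance, the first responsibility listed, usually starts with the engine's query plan: a full scan on a filtered column points at a missing index. A small demonstration of that workflow using SQLite's EXPLAIN QUERY PLAN (the table and index names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")

def plan_for(conn, sql):
    """Return the engine's textual query plan for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the detail text

query = "SELECT * FROM orders WHERE customer_id = 42"
before = plan_for(conn, query)   # reports a full table scan

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan_for(conn, query)    # now reports an index search

print(before)  # e.g. 'SCAN orders'
print(after)   # e.g. 'SEARCH orders USING INDEX idx_orders_customer ...'
```

Every major engine has an equivalent (EXPLAIN/EXPLAIN ANALYZE); the support workflow is the same: read the plan, spot the scan, fix the access path.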

Posted 1 week ago


4.0 - 8.0 years

20 - 30 Lacs

Chennai, Bengaluru

Work from Office

Key Skills: Oracle, ODI, FDIP, SQL.
Roles and Responsibilities: • Design and Development: Design, develop, and implement complex analytics solutions using Oracle Analytics Cloud (OAC) and Fusion Data Intelligence Platform (FDIP). • Data Preparation and Transformation: Prepare and transform data from various sources for analysis. • Visualization and Reporting: Create dashboards, reports, and visualizations to present data insights to stakeholders. • AI/ML Integration: Collaborate with AI and machine learning models for advanced analytical capabilities where applicable. • Semantic Model Configuration: Configure and deploy semantic models within FDIP to support business requirements. • Data Validation and Reconciliation: Leverage FDIP features to validate and reconcile data for accuracy and consistency. • Performance Optimization: Optimize data pipelines and analytical models for improved performance and scalability.
Experience Requirements: • Proven expertise in Oracle Analytics Cloud (OAC) and Fusion Data Intelligence Platform (FDIP), with a deep understanding of their features and functionality. • Strong experience in data modeling and semantic model development. • Proficiency in SQL and PL/SQL for efficient data querying and transformation. • Hands-on experience with Oracle Fusion Cloud Applications is highly desirable. • Skilled in building dashboards and reports using OAC visualization tools. • Solid understanding of data governance principles and practices. • Excellent communication and collaboration skills to engage effectively with stakeholders. • Strong problem-solving abilities to troubleshoot and debug issues within OAC and FDIP environments.
Education: Any Graduation.
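The data validation and reconciliation duty here often reduces to diffing a source extract against the model's output, which SQL's EXCEPT expresses directly. A sketch with SQLite standing in for the source system and the analytics dataset (table names and values are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_gl (account TEXT, balance REAL);
    CREATE TABLE model_out (account TEXT, balance REAL);
    INSERT INTO source_gl VALUES ('1000', 250.0), ('2000', -75.5);
    INSERT INTO model_out VALUES ('1000', 250.0), ('2000', -80.0);
""")

# Rows present in the source but missing or different in the model output:
mismatches = conn.execute("""
    SELECT account, balance FROM source_gl
    EXCEPT
    SELECT account, balance FROM model_out
""").fetchall()

print(mismatches)  # [('2000', -75.5)] -> account 2000 disagrees
```

Running the EXCEPT in both directions distinguishes missing rows from value drift; in Oracle the equivalent set operator is MINUS.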

Posted 1 week ago


1.0 - 6.0 years

1 - 6 Lacs

Pune, Maharashtra, India

On-site

We are looking for a Software Engineer II who is passionate about quality and has a strong background in functional testing, with an eye toward automation. In this role, you'll be a key member of our scrum team, working closely with developers and domain experts to ensure the delivery of high-quality products. You'll be responsible for translating complex requirements into comprehensive test scenarios, identifying and developing automation scripts, and proactively monitoring product quality throughout the development lifecycle.
Role: As a Software Engineer II, you will: • Primarily work as a manual functional tester, while proactively identifying opportunities for automation and contributing to their implementation. • Translate complex system requirements and specifications into clear test requirements and testing methods. • Participate in requirements review and testing activities, monitor resolutions, and maintain thorough documentation. • Adhere to QA standards, processes, tools, and methodologies, partnering with other functions to gather testing requirements. • Be a vital part of the scrum team, actively participating in requirements reviews/story elaborations and testing activities to deliver high-quality products. • Translate high-level business requirements into comprehensive test scenarios covering integration flows and customer journeys. • Collaborate with software developers and domain experts in designing, performing, and improving verification tests. • Identify automation needs and develop test scripts. • Work collaboratively and effectively in a fast-paced environment. • Identify defects early to improve product quality. • Follow SDLC and STLC processes with quality management and Agile tools like ALM. • Proactively monitor customer insights and production issues to gather quality feedback and improve processes that enhance the quality of the product/capability. • Play a crucial part in driving quality to help build and ship better products.
Qualifications: • Working knowledge of card payment systems. • Working knowledge of payment simulation tools such as T3 or similar, and of the ISO payment protocols (8583 and 20022). • Understanding of Unix commands and SQL. • Understanding of any programming language. • Knowledge of the software testing life cycle (test planning, test design and execution, defect management, test reporting). • Hands-on experience with functional, regression, system, and UAT testing. • Experience on HP NonStop (Tandem) or mainframe is an added advantage. • Experience with TDD and/or BDD is an added advantage. • Experience in automation testing using tools like JBehave, TestNG, SOAP UI, Appium, Selenium, or mobile automation tools is an added advantage. • Excellent defect-finding, debugging, and root cause analysis capabilities. • Excellent communication skills. • Experience testing solutions for large-scale deployments, including large enterprises, service providers, banking, or payment solutions. • Experience in the payments application domain. • Hands-on experience with tools like Confluence, JIRA, and Rally. • Systematic, delivery-focused approach to test strategy and analysis. • Strong organizational and problem-solving skills with great attention to detail, critical thinking, and solid communication. • Ability to be flexible, accountable, reliable, and industrious. • Ability to manage multiple priorities in parallel while ensuring Quality Assurance standards are followed. • High-energy, detail-oriented, and proactive, with the ability to work under pressure independently. • Strong oral and written communication skills. • Bachelor's degree in Computer Science or a related field.
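In card-payment testing, turning a spec into test cases often starts with well-defined checks such as the Luhn validation applied to PANs. A sketch of how a tester might encode such cases (the PANs below are standard test numbers, not real cards):

```python
def luhn_valid(pan: str) -> bool:
    """Luhn check-digit validation, as applied to card PANs."""
    total = 0
    for i, ch in enumerate(reversed(pan)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9          # same as summing the two digits
        total += d
    return total % 10 == 0

# Boundary-style functional test cases a QA engineer would derive:
cases = {
    "4111111111111111": True,   # classic Visa test PAN
    "4111111111111112": False,  # single-digit corruption is caught
}
for pan, expected in cases.items():
    assert luhn_valid(pan) == expected
print("all cases pass")
```

The same case-table pattern scales to message-level checks, e.g. asserting mandatory fields in an ISO 8583 response for each simulated transaction.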

Posted 2 weeks ago


8.0 - 10.0 years

20 - 27 Lacs

Pune

Hybrid

Key Skills: Scripting Language, Terraform, Jenkins, Cloud, Scripting, DevOps, Linux, UNIX, SQL.
Roles and Responsibilities: • Support the Product Owner in managing the workstream backlog and prioritizing stories and subtasks. • Groom backlog stories to meet the 'definition of ready'. • Contribute to the project RAAID (Risks, Assumptions, Actions, Issues, Dependencies) by raising items when necessary. • Provide updates to the Engagement Lead and Project Management on progress against delivery milestones. • Manage multiple environments effectively and ensure their stability. • Collaborate with project teams to resolve technical and application issues related to project delivery. • Work with Engineering and Operations teams to ensure proper environment monitoring. • Ensure adherence to SLA commitments and escalate issues as needed. • Conduct deployment activities and manage development, test, UAT, and production environments during the project deployment phase. • Establish, document, and implement best practices for end-to-end application initiation and deployment processes. • Strive for continuous improvement and enhanced customer satisfaction. • Demonstrate flexibility to meet evolving project needs. • Attend relevant project meetings and provide updates and insights. • Coordinate technical specialists to automate setup and configuration of environments. • Monitor and ensure uptime of test environments, providing ongoing support and communication of environment availability. • Create short-term plans to support sprint-based development and forecast environment requirements based on anticipated future demand. • Develop KPIs to measure the effectiveness of environment delivery, moving toward maximum automation and self-service models. • Provide environment onboarding estimates for new projects and support continuous delivery efforts. • Take ownership of assigned tasks and drive them to resolution with minimal supervision. • Communicate effectively across global teams and collaborate with cross-functional stakeholders.
Experience Requirements: • 6 to 12 years of relevant experience. • Strong experience working in Agile and DevOps environments using collaboration tools like Confluence and JIRA. • Proficient in Unix and SQL administration. • Strong skills in Terraform and Shell or Python scripting. • Prior hands-on experience with DevOps tools such as Git, Jenkins, Ansible/Puppet, and Kubernetes. • Demonstrated experience in implementing solutions on Google Cloud Platform (GCP). • Understanding of Big Data concepts, DevOps principles, and container technologies. • Proven multitasking ability with strong time management and prioritization skills. • Experience in environment automation, continuous delivery support, and managing environment KPIs. • Strong interpersonal and communication skills with a global and culturally diverse team mindset. • Experience working in regulatory-driven project environments is an advantage.
Education: B.Tech/M.Tech (Dual), B.Tech, M.Tech.
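One KPI named above, test-environment uptime, is just scheduled time minus outage windows over the reporting period. A sketch of the computation (the outage records are invented):

```python
from datetime import datetime

def uptime_pct(period_start, period_end, outages):
    """Uptime KPI: share of the period not covered by outage windows."""
    total = (period_end - period_start).total_seconds()
    # Clip each outage to the reporting window before summing downtime.
    down = sum(
        (min(end, period_end) - max(start, period_start)).total_seconds()
        for start, end in outages
        if end > period_start and start < period_end
    )
    return round(100 * (total - down) / total, 2)

start = datetime(2024, 6, 1)
end = datetime(2024, 6, 2)   # one 24-hour reporting window
outages = [
    (datetime(2024, 6, 1, 3, 0), datetime(2024, 6, 1, 3, 36)),    # 36 min
    (datetime(2024, 6, 1, 12, 0), datetime(2024, 6, 1, 12, 36)),  # 36 min
]

print(uptime_pct(start, end, outages))  # 95.0
```

In practice the outage windows would come from monitoring alerts or incident tickets, and the result would feed an SLA dashboard.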

Posted 1 month ago
