
1113 AWS Cloud Jobs - Page 25

JobPe aggregates results for easy access, but you apply directly on the job portal.

12.0 - 20.0 years

35 - 85 Lacs

Bengaluru

Hybrid

We are seeking a highly skilled and experienced Senior Engineering Manager, Development to join our team. The Senior Engineering Manager, Development will take charge of leading the development of cloud-based RingCentral Phone capabilities. The ideal candidate will have a strong background in server-side software development and a proven track record of successful project leadership. This role demands a collaborative leader with a strong sense of ownership, accountability, and priority-management skills, capable of building and nurturing a high-performing team.

Primary Responsibilities: Lead the design, development, and automation of server-side (Telco) components of RingCentral's Phone solution to meet business requirements and satisfy top-grade stability and quality standards. Manage a team of developers and automation engineers, providing technical guidance and mentorship. Implement best practices for software development, data management, and system integration. This is a hands-on role with individual contributions apart from leading the team, actively participating in daily coding and code reviews. Provide technical expertise and support for Telco-related projects and initiatives. Partner with the Product Team and business stakeholders in solution design and troubleshooting of problem escalations. Manage the execution of implementation plans from inception to successful completion, ensuring transparent status reporting. Collaborate closely with cross-functional Project and Program Managers throughout all phases of the initiatives. Maintain high standards of quality in product documentation, ensuring it remains clear, comprehensive, and up-to-date. Actively participate in applicable compliance, governance, and risk management frameworks. Collaborate with peers in design and architectural validations. Adhere to established security, privacy, and data management protocols in accordance with company standards.
Take a proactive approach to risk identification and adeptly mitigate potential challenges.

Minimum Qualifications, Education and Experience: BS/MS degree or equivalent experience in the field. 15+ years of experience implementing, supporting, and enhancing software products. A minimum of 3 years in an engineering management position. Strong hands-on experience in developing cloud-based backend services using modern C++ (11, 14, 17 or later), Linux, open-source libraries, etc. Hands-on experience deploying code on public cloud services (AWS, Azure, GCP) using container technologies (Docker, Kubernetes). Good understanding of networking and troubleshooting TCP/UDP protocols. Comprehensive knowledge and understanding of SDLC best practices and techniques. Understanding of Agile development methodologies (Scrum, Kanban). Excellent communication, presentation, people management, project management (daily scrum, planning, etc.), and negotiation skills.

Will be a plus: Experience in developing high-load, fault-tolerant services. Telecom domain experience. Knowledge of VoIP telephony systems and protocols (e.g., SIP, RTP/RTCP). Experience with modern VoIP phones (Avaya, Polycom, Yealink, Cisco, Unify, Alcatel). Familiarity with Python scripting.

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai, Maharashtra

Work from Office

Grade Level (for internal use): 10

The Team: You will be an expert contributor and part of the Ratings Organization's Data Services Product Engineering Team. This team, which has broad and expert knowledge of the Ratings organization's critical data domains, technology stacks, and architectural patterns, fosters knowledge sharing and collaboration that results in a unified strategy. All Data Services team members provide leadership, innovation, timely delivery, and the ability to articulate business value. Be a part of a unique opportunity to build and evolve S&P Ratings' next-gen analytics platform.

Responsibilities: Design and implement innovative software solutions to enhance S&P Ratings' cloud-based data platforms. Mentor a team of engineers, fostering a culture of trust, continuous growth, and collaborative problem-solving. Collaborate with business partners to understand requirements, ensuring technical solutions align with business goals. Manage and improve existing software solutions, ensuring high performance and scalability. Participate actively in all Agile scrum ceremonies, contributing to the continuous improvement of team processes. Produce comprehensive technical design documents and conduct technical walkthroughs.

Experience & Qualifications: Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent is required. Proficient with software development lifecycle (SDLC) methodologies like Agile and test-driven development. 7+ years of development experience in enterprise products and modern web development technologies: Java/J2EE, UI frameworks like Angular and React, SQL, Oracle, and NoSQL databases like MongoDB. Experience designing transactional/data warehouse/data lake solutions and data integrations with the big data ecosystem, leveraging AWS cloud technologies. Experience with Delta Lake systems like Databricks using AWS cloud technologies and PySpark is a plus. Thorough understanding of distributed computing. Passionate, smart, and articulate developer. Quality-first mindset with a strong background and experience developing products for a global audience at scale. Excellent analytical thinking, interpersonal, oral, and written communication skills, with a strong ability to influence both IT and business partners. Superior knowledge of system architecture, object-oriented design, and design patterns. Good work ethic, self-starter, and results-oriented.

Additional Preferred Qualifications: Experience working with AWS. Experience with the SAFe Agile Framework. Bachelor's/PG degree in Computer Science, Information Systems, or equivalent. Hands-on experience contributing to application architecture and designs, with proven software/enterprise integration design principles. Ability to prioritize and manage work to critical project timelines in a fast-paced environment. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Ability to train and mentor.

Benefits: Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

Posted 1 month ago

Apply

3.0 - 8.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Job Title: Database Engineer Analytics – L

Responsibilities: As a Database Engineer supporting the bank's Analytics platforms, you will be part of a centralized team of database engineers responsible for the maintenance and support of Citizens' most critical databases. A Database Engineer will be responsible for: • Conceptual knowledge of database practices and procedures such as DDL, DML, and DCL. • Basic SQL skills, including SELECT, FROM, WHERE, and ORDER BY. • Ability to code SQL joins, subqueries, aggregate functions (AVG, SUM, COUNT), and use data manipulation techniques (UPDATE, DELETE). • Understanding of basic data relationships and schemas. • Develop basic entity-relationship diagrams. • Conceptual understanding of cloud computing. • Solves routine problems using existing procedures and standard practices. • Can look up error codes and open tickets with vendors. • Ability to execute explain plans and identify poorly written queries. • Review data structures to ensure they adhere to database design best practices. • Develop a comprehensive backup plan. • Understanding of the different cloud models (IaaS, PaaS, SaaS), service models, and deployment options (public, private, hybrid). • Solves standard problems by analyzing possible solutions using experience, judgment, and precedents. • Troubleshoot database issues, such as integrity issues, blocking/deadlocking issues, log shipping issues, connectivity issues, security issues, memory issues, disk space, etc. • Understanding of cloud security concepts, including data protection, access control, and compliance. • Manages risks associated with the use of information technology. • Identifies, assesses, and treats risks that might affect the confidentiality, integrity, and availability of the organization's assets. • Ability to design and implement highly performing databases using partitioning and indexing that meet or exceed business requirements. • Documents a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow. • Ability to code complex SQL. • Performs effective backup management and periodic database restoration testing. • General DB cloud networking skills: VPCs, SGs, KMS keys, private links. • Ability to develop stored procedures and use at least one scripting language for reusable code and improved performance. • Know how to import and export data into and out of databases using ETL tools, code, or migration tools like DMS, or scripts. • Knowledge of DevOps principles and tools, such as CI/CD. • Attention to detail and a customer-centric approach. • Solves complex problems by taking a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information. • Ability to optimize queries for performance and resource efficiency. • Review database metrics to identify performance issues.

Required Qualifications: • 2-10+ years of experience with database management/administration (Redshift, Snowflake, or Neo4j). • 2-10+ years of experience working with incident, change, and problem management processes and procedures. • Experience maintaining and supporting large-scale critical database systems in the cloud. • 2+ years of experience working with AWS cloud-hosted databases. • An understanding of at least one programming language, such as Python 3, Java, JavaScript, Ruby, Golang, C, or C++, including at least one front-end framework (Angular/React/Vue). • Experience with cloud computing, ETL, and streaming technologies (OpenShift, DataStage, Kafka). • Experience with agile development methodology. • Strong SQL performance and tuning skills. • Excellent communication and client-interfacing skills. • Strong team collaboration skills and capacity to prioritize tasks efficiently.
Desired Qualifications • Experience working in an agile development environment • Experience working in the banking industry • Experience working in cloud environments such as AWS, Azure or Google • Experience with CI/CD pipeline (Jenkins, Liquibase or equivalent) Education and Certifications • Bachelor’s degree in computer science or related discipline
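The SQL fundamentals this posting lists (SELECT/WHERE/ORDER BY, joins, and aggregate functions such as SUM and COUNT) can be illustrated with a minimal, self-contained sketch. The table and column names below are invented for illustration only; an in-memory SQLite database stands in for the production systems (Redshift, Snowflake) the role actually involves.

```python
import sqlite3

# Hypothetical schema and data, invented purely to demonstrate the SQL
# constructs named in the posting above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT);
    CREATE TABLE transactions (
        id INTEGER PRIMARY KEY,
        account_id INTEGER REFERENCES accounts(id),
        amount REAL
    );
    INSERT INTO accounts VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO transactions VALUES (1, 1, 50.0), (2, 1, 25.0), (3, 2, 10.0);
""")

# JOIN + aggregate functions (COUNT, SUM) + GROUP BY + ORDER BY.
rows = conn.execute("""
    SELECT a.owner, COUNT(t.id) AS n_txn, SUM(t.amount) AS total
    FROM accounts a
    JOIN transactions t ON t.account_id = a.id
    GROUP BY a.owner
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('alice', 2, 75.0), ('bob', 1, 10.0)]
```

The same join/aggregate pattern applies unchanged on the cloud warehouses named in the posting; only the connection layer differs.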

Posted 1 month ago

Apply

11.0 - 21.0 years

0 Lacs

Hyderabad

Work from Office

10 years of IT engineering experience with a focus on cloud technologies. Extensive experience with AWS Cloud. Develop and manage IaC using Terraform. Must-have skills: strong AWS cloud experience (5+ years), Chef, Terraform, Kafka, on-prem solutions.

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Designing, developing, and delivering scalable web applications using ASP.NET Core and Angular. Monitor cloud infrastructure (AWS) and implement CI/CD pipelines. Strong hands-on experience with ASP.NET Core (MVC & Web API). Required candidate profile: ASP.NET Core, Angular (latest versions preferred), AWS (Amazon Web Services), MongoDB, Azure.

Posted 1 month ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Position Summary: We are seeking a highly skilled ETL QA Engineer with at least 6 years of experience in ETL/data pipeline testing on the AWS cloud stack, specifically with Redshift, AWS Glue, S3, and related data integration tools. The ideal candidate should be proficient in SQL, capable of reviewing and validating stored procedures, and able to automate ETL test cases using Python or suitable automation frameworks. Strong communication skills are essential, and web application testing exposure is a plus.

Technical Skills Required: SQL Expertise: Ability to write, debug, and optimize complex SQL queries. Validate data across source systems, staging areas, and reporting layers. Experience with stored procedure review and validation. ETL Testing Experience: Hands-on experience with AWS Glue, Redshift, S3, and data pipelines. Validate transformations, data flow accuracy, and pipeline integrity. ETL Automation: Ability to automate ETL tests using Python, PyTest, or other scripting frameworks. Nice-to-have exposure to TestNG, Selenium, or similar automation tools for testing UIs or APIs related to data validation. Cloud Technologies: Deep understanding of the AWS ecosystem, especially around ETL and data services. Familiarity with orchestration (e.g., Step Functions, Lambda), security, and logging. Health Check Automation: Build SQL- and Python-based health check scripts to monitor pipeline sanity and data integrity. Reporting Tools (nice to have): Exposure to tools like Jaspersoft, Tableau, Power BI, etc., for report layout and aggregation validation. Root Cause Analysis: Strong debugging skills to trace data discrepancies and report logical/data errors to development teams. Communication: Must be able to communicate clearly with both technical and non-technical stakeholders.

Roles and Responsibilities: Design and execute test plans and test cases for validating ETL pipelines and data transformations. Ensure accuracy and integrity of data in transactional databases, staging zones, and data warehouses (Redshift). Review stored procedures and SQL scripts to validate transformation logic. Automate ETL test scenarios using Python or other test automation tools as applicable. Implement health check mechanisms for automated validation of daily pipeline jobs. Investigate data issues and perform root cause analysis. Validate reports and dashboards, ensuring correct filters, aggregations, and visualizations. Collaborate with developers, analysts, and business teams to understand requirements and ensure complete test coverage. Report testing progress and results clearly and in a timely manner.

Nice to Have: Web testing experience using Selenium or Appium. Experience in API testing and validation of data exposed via APIs.
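The "health check" and source-vs-target validation work this posting describes can be sketched minimally as below. The sample records and check logic are invented for illustration; a real pipeline would read both sides from Redshift/S3 rather than in-memory lists, and the checks would typically run under PyTest.

```python
# Minimal hypothetical ETL health check: compare row counts and a per-column
# checksum between a "source" extract and the loaded "target".
source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]

def health_check(src, tgt):
    """Return a list of human-readable failures (empty list means healthy)."""
    failures = []
    if len(src) != len(tgt):
        failures.append(f"row count mismatch: {len(src)} vs {len(tgt)}")
    src_sum = sum(r["amount"] for r in src)
    tgt_sum = sum(r["amount"] for r in tgt)
    if abs(src_sum - tgt_sum) > 1e-9:
        failures.append(f"amount checksum mismatch: {src_sum} vs {tgt_sum}")
    return failures

assert health_check(source, target) == []   # clean load passes
assert health_check(source, target[:1])     # a dropped row is flagged
```

Returning a list of failure messages (rather than raising on the first one) lets a daily sanity job report every discrepancy in one run.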

Posted 1 month ago

Apply

8.0 - 12.0 years

30 - 40 Lacs

Pune

Work from Office

Assessment & Analysis: Review CAST software intelligence reports to identify technical debt, architectural flaws, and cloud readiness. Conduct manual assessments of applications to validate findings and prioritize migration efforts. Identify refactoring needs (e.g., monolithic to microservices, serverless adoption). Evaluate legacy systems (e.g., .NET Framework, Java EE) for compatibility with AWS services.

Solution Design: Develop migration strategies (rehost, replatform, refactor, retire) for each application. Architect AWS-native solutions using services like EC2, Lambda, RDS, S3, and EKS. Design modernization plans for legacy systems (e.g., .NET Framework to .NET Core, Java EE to Spring Boot). Ensure compliance with the AWS Well-Architected Framework (security, reliability, performance, cost optimization).

Collaboration & Leadership: Work with cross-functional teams (developers, DevOps, security) to validate designs. Partner with clients to align technical solutions with business objectives. Mentor junior architects and engineers on AWS best practices.

Roles and Responsibilities: Job Title: Senior Solution Architect - Cloud Migration & Modernization (AWS). Location: [Insert Location]. Department: Digital Services. Reports To: Cloud SL

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Ahmedabad

Work from Office

About the Role: Grade Level (for internal use): 09

S&P Global Market Intelligence. The Role: Software Developer II (.NET Backend Developer). Grade (relevant for internal applicants only): 9. The Location: Ahmedabad, Gurgaon, Hyderabad.

The Team: S&P Global Market Intelligence, a best-in-class sector-focused news and financial information provider, is looking for a Software Developer to join our Software Development team in our India offices. This is an opportunity to work on a self-managed team to maintain, update, and implement processes utilized by other teams. Coordinate with stakeholders to design innovative functionality in existing and future applications. Work across teams to enhance the flow of our data.

What's in it for you: This is the place to hone your existing skills while having the chance to be exposed to fresh and divergent technologies. Exposure to the latest, cutting-edge technologies in the full-stack ecosystem. Opportunity to grow personally and professionally. Exposure to working on AWS Cloud solutions will be an added advantage.

Responsibilities: Identify, prioritize, and execute tasks in an Agile software development environment. Develop solutions to support key business needs. Engineer components and common services based on standard development models, languages, and tools. Produce system design documents and participate actively in technical walkthroughs. Demonstrate a strong sense of ownership and responsibility with release goals, including understanding requirements, technical specifications, design, architecture, implementation, unit testing, builds/deployments, and code management. Build and maintain the environment for speed, accuracy, consistency, and uptime. Collaborate with team members across the globe. Interface with users, business analysts, quality assurance testers, and other teams as needed.

What We're Looking For: Basic Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent. 3-5 years of experience. Solid experience with building processes; debugging, refactoring, and enhancing existing code, with an understanding of performance and scalability. Competency in C#, .NET, .NET Core. Experience with DevOps practices and modern CI/CD deployment models using Jenkins. Experience supporting production environments. Knowledge of T-SQL and MS SQL Server. Exposure to Python/Scala/AWS technologies is a plus. Exposure to React/Angular is a plus.

Preferred Qualifications: Exposure to DevOps practices and CI/CD pipelines such as Azure DevOps or GitHub Actions. Familiarity with automated unit testing is advantageous. Exposure to working on AWS Cloud solutions will be an added advantage.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

What you will do: In this vital role you will be responsible for designing, building, and maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security. The ideal candidate should have strong AWS knowledge and be capable of writing and maintaining Terraform, CloudFormation, and CI/CD pipelines to streamline cloud deployments. Please note, this is an onsite role based in Hyderabad.

Roles & Responsibilities:

AWS Infrastructure Design & Implementation: Architect, implement, and manage highly available AWS cloud environments. Design VPCs, subnets, security groups, and IAM policies to enforce security best practices. Optimize AWS costs using reserved instances, savings plans, and auto-scaling.

Infrastructure as Code (IaC) & Automation: Develop, maintain, and enhance Terraform and CloudFormation templates for cloud provisioning. Automate deployment, scaling, and monitoring using AWS-native tools and scripting. Implement and manage CI/CD pipelines for infrastructure and application deployments.

Cloud Security & Compliance: Enforce best practices in IAM, encryption, and network security. Ensure compliance with SOC 2, ISO 27001, and NIST standards. Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response.

Monitoring & Performance Optimization: Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring. Implement auto-scaling, load balancing, and caching strategies for performance optimization. Troubleshoot cloud infrastructure issues and conduct root cause analysis.

Collaboration & DevOps Practices: Work closely with software engineers, SREs, and DevOps teams to support deployments. Maintain GitOps best practices for cloud infrastructure versioning. Support on-call rotation for high-priority cloud incidents.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree and 4 to 6 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR Bachelor's degree and 6 to 8 years of experience in computer science, IT, or a related field with hands-on cloud experience; OR Diploma and 10 to 12 years of experience in computer science, IT, or a related field with hands-on cloud experience.

Must-Have Skills: Deep hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.). Expertise in Terraform and CloudFormation for AWS infrastructure automation. Strong knowledge of AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53). Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.). Strong troubleshooting and debugging skills in cloud networking, storage, and security.

Preferred / Good-to-Have Skills: Experience with Kubernetes (EKS) and service mesh architectures. Knowledge of AWS Lambda and event-driven architectures. Familiarity with AWS CDK, Ansible, or Packer for cloud automation. Exposure to multi-cloud environments (Azure, GCP). Familiarity with HPC, DGX Cloud.

Professional Certifications (preferred): AWS Certified Solutions Architect (Associate or Professional). AWS Certified DevOps Engineer (Professional). Terraform Associate Certification.

Soft Skills: Strong analytical and problem-solving skills. Ability to work effectively with global, virtual teams. Effective communication and collaboration with cross-functional teams. Ability to work in a fast-paced, cloud-first environment.
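The VPC/subnet design work this posting describes usually starts with carving a VPC CIDR into per-availability-zone subnets. Below is a small sketch using only Python's stdlib `ipaddress` module; the CIDR block and AZ names are invented examples, not part of the posting.

```python
import ipaddress

# Hypothetical plan: split a /16 VPC CIDR into /20 subnets, assigning one
# public and one private subnet per availability zone (AZ names invented).
vpc_cidr = ipaddress.ip_network("10.0.0.0/16")
subnets = list(vpc_cidr.subnets(new_prefix=20))  # 16 non-overlapping /20 blocks

azs = ["ap-south-1a", "ap-south-1b", "ap-south-1c"]
plan = {}
for i, az in enumerate(azs):
    plan[az] = {
        "public": str(subnets[i]),               # first blocks: public subnets
        "private": str(subnets[i + len(azs)]),   # later blocks: private subnets
    }

print(plan["ap-south-1a"])  # {'public': '10.0.0.0/20', 'private': '10.0.48.0/20'}
```

In practice the same allocation would be expressed declaratively in Terraform (e.g. via its `cidrsubnet` function), but the arithmetic is identical.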

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office

We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills, and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong MDM (Master Data Management) experience in configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT, or a related field; OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT, or a related field; OR Diploma with 6-8 years of experience in Business, Engineering, IT, or a related field.

Functional Skills: Must-Have Skills: Strong experience with Informatica or Reltio MDM platforms in building configurations from scratch (e.g., L3 configuration, data modeling, asset creation, setting up API integrations, orchestration). Strong experience in building data mappings, data profiling, and creating and implementing business rules for data quality and data transformation. Strong experience in implementing match and merge rules and survivorship of golden records. Expertise in integrating master data records with downstream systems. Very good understanding of DWH basics and good knowledge of data modeling. Experience with IDQ, data modeling, and approval workflow/DCR. Advanced SQL expertise and data wrangling. Exposure to Python and PySpark for data transformation workflows. Knowledge of MDM, data governance, stewardship, and profiling practices.

Good-to-Have Skills: Familiarity with Databricks and AWS architecture. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Basics of data engineering concepts.

Professional Certifications: Any ETL certification (e.g., Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
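The match/merge and survivorship concepts this posting refers to can be illustrated with a small invented sketch: records that matched on a key are merged into one "golden record", and the survivorship rule here is "most recently updated non-empty value wins" per attribute. Real MDM tools (Informatica, Reltio) express such rules declaratively; this only shows the idea, and all names and values are hypothetical.

```python
from datetime import date

# Two candidate records that a matching step has already grouped together.
matched = [
    {"name": "Acme Pharma",     "phone": "",         "updated": date(2023, 1, 5)},
    {"name": "ACME Pharma Ltd", "phone": "555-0100", "updated": date(2024, 6, 1)},
]

def golden_record(records, fields=("name", "phone")):
    """Merge matched records; each field survives from the newest non-empty value."""
    newest_first = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for field in fields:
        golden[field] = next((r[field] for r in newest_first if r[field]), "")
    return golden

print(golden_record(matched))
# {'name': 'ACME Pharma Ltd', 'phone': '555-0100'}
```

Note that the empty phone on the older record does not overwrite the populated one: survivorship skips empty values rather than blindly taking the newest record wholesale.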

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Title: AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend on-call). Req ID: 325686. We are currently seeking an AWS, SQL, Snowflake, ControlM, ServiceNow - Operational Engineer (Weekend on-call) to join our team in Bangalore, Karnataka (IN-KA), India (IN). Minimum experience on key skills: 5 to 10 years. We are looking for an operational engineer who, as the primary criterion, is ready to work on-call on weekends. Skills we look for: AWS cloud (SQS, SNS, DynamoDB, EKS), SQL (PostgreSQL, Cassandra), Snowflake, ControlM/Autosys/Airflow, ServiceNow, Datadog, Splunk, Grafana, Python/shell scripting.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Req ID: 306668. We are currently seeking a Cloud Solution Delivery Sr Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, leading teams and directing engineering workloads. This role requires a deep understanding of data engineering and cloud services, and the ability to implement high-quality solutions.

Key Responsibilities: Lead and direct a small team of engineers engaged in: - Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow - Cataloguing data - Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions - Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools - Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products - Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams - Using DBT projects to define re-usable pipelines

Required Skills and Qualifications: - Bachelor's degree in Computer Science, Engineering, or a related field - 5+ years of experience in data engineering - 2+ years of experience in leading a team of data engineers - Experience in AWS cloud services - Expertise with Python and SQL - Experience using Git/GitHub for source control management - Experience with Snowflake - Strong understanding of lakehouse architectures and best practices - Strong problem-solving skills and ability to think critically - Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders - Strong use of version control and proven ability to govern a team in the best-practice use of version control - Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications: - An understanding of lakehouses - An understanding of Apache Iceberg tables - An understanding of data cataloguing - Knowledge of Apache Airflow for data orchestration - An understanding of DBT - SnowPro Core certification

Posted 1 month ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Req ID: 324959 We are currently seeking a L1 Cloud Engineer to join our team in Bangalore, Karntaka (IN-KA), India (IN). Cloud Platform / Infrastructure Engineer - Grade 6 - At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company"™s growth, market presence and our ability to help our clients stay a step ahead of the competition. By hiring, the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. Preferred Experience As an L1 cloud engineer, should have good understanding of cloud platform, networking, and storage principles with focus on Azure. Cloud administration, maintenance, and troubleshooting experience. Monitor cloud and infrastructure services to ensure uninterrupted operations. Monitor and manage support tickets during assigned shifts, ensuring timely and accurate resolution of issues. Respond to alerts and incidents, escalating to higher-level support as necessary. Able to provide shift hours support at L1 level Experience in updating KB articles and SOPs. Request additional information from clients, when necessary, to accurately diagnose and resolve issues. Acknowledge and analyse client emails to identify and understand issues. Provide clear guidance and relevant information to resolve first-level issues. Escalate complex issues to the internal L2 team and track the progress of these escalations to ensure prompt resolution. Well experienced in handling incident, service requests, change requests. 
Passion for delivering timely and outstanding customer service. Great written and oral communication skills with internal and external customers. Basic Qualifications:
- 2+ years of overall operational experience
- 2+ years of Azure/AWS experience
- 2+ years of experience working in a diverse cloud support environment in a 24x7 production support model
Preferred Certifications: Azure Fundamentals, AWS Cloud Practitioner. Four-year BS/BA in Information Technology degree or equivalent experience.
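The L1 triage-and-escalation workflow this role describes can be sketched in a few lines. The issue categories and severity threshold below are illustrative assumptions, not any client's actual runbook.

```python
# Illustrative L1 support triage: resolve known first-level issues at L1,
# escalate everything else to L2. Categories and severities are assumed examples.
def triage(ticket: dict) -> str:
    """Return the action an L1 engineer would take for a support ticket."""
    known_first_level = {"password_reset", "disk_space", "service_restart"}
    if ticket["category"] in known_first_level and ticket["severity"] != "critical":
        return "resolve_at_L1"
    return "escalate_to_L2"  # track the escalation until L2 resolves it

print(triage({"category": "disk_space", "severity": "low"}))       # resolve_at_L1
print(triage({"category": "network_outage", "severity": "high"}))  # escalate_to_L2
```

In practice the category set would come from the team's KB articles and SOPs rather than being hard-coded.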

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 18 Lacs

Gurugram

Work from Office

Key Responsibilities and Requirements:
- More than 6 years of experience in backend/API testing for a payment system
- Develop comprehensive test strategies, plans, and test cases for complex backend payment systems using Python, PyTest, Java, or similar languages
- Automate API and unit tests and integrate them with CI/CD pipelines for seamless deployments
- Deep knowledge of API and HTTP protocols, message queues, Lambda functions, and AWS services
- Understanding of Linux, cloud platforms (AWS), and CDNs is preferred, as well as familiarity with AWS log analysis
- Work experience with payment system testing is preferred
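As a rough illustration of the PyTest-style API testing this role calls for, the sketch below validates a payment response payload. The endpoint shape and field names are hypothetical, not a real payment API.

```python
# Hedged sketch of a backend payment-API check; in a real suite this would
# live in test_payments.py and the JSON would come from an HTTP client.
import json

REQUIRED_FIELDS = {"transaction_id", "status", "amount"}  # assumed schema

def validate_payment_response(raw: str) -> bool:
    """Return True when a (hypothetical) payment response is well-formed."""
    body = json.loads(raw)
    return REQUIRED_FIELDS.issubset(body) and body["status"] in {"AUTHORIZED", "DECLINED"}

sample = '{"transaction_id": "txn-123", "status": "AUTHORIZED", "amount": 49.99}'
print(validate_payment_response(sample))  # True
```

Wrapping checks like this in PyTest functions lets a CI/CD pipeline fail the build on any malformed response.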

Posted 1 month ago

Apply

7.0 - 12.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Req ID: 325298 We are currently seeking an AWS Redshift Administrator Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties:
- Administer and maintain scalable cloud environments and applications for the data organization.
- Understand the business objectives of the company and create cloud-based solutions to facilitate those objectives.
- Implement Infrastructure as Code and deploy code using Terraform and GitLab.
- Install and maintain software, services, and applications by identifying system requirements.
- Hands-on AWS services and DB and server troubleshooting experience.
- Extensive database experience with RDS, AWS Redshift, and MySQL.
- Maintain the environment by identifying system requirements, installing upgrades, and monitoring system performance.
- Knowledge of day-to-day database operations, deployments, and development.
- Experienced in Snowflake.
- Knowledge of SQL and performance tuning.
- Knowledge of Linux shell scripting or Python.
- Migrate systems from one AWS account to another.
- Maintain system performance through system monitoring, analysis, and performance tuning.
- Troubleshoot system hardware, software, and operating and system management systems.
- Secure web systems by developing system access, monitoring, control, and evaluation.
- Test disaster recovery policies and procedures; complete back-ups; maintain documentation.
- Upgrade systems and services; develop, test, evaluate, and install enhancements and new software.
- Communicate with internal teams, such as EIMO, Operations, and Cloud Architects.
- Communicate with stakeholders and build applications to meet project needs.
Minimum Skills Required:
- Bachelor's degree in computer science or engineering
- Minimum of 7 years of experience in system, platform, and AWS cloud administration
- Minimum of 5 to 7 years of database administration and AWS experience using the latest AWS technologies: AWS EC2, Redshift, VPC, S3, AWS RDS
- Experience with Java, Python, Redshift, MySQL, or equivalent database tools
- Experience with Agile software development using JIRA
- Experience in multiple OS platforms with a strong emphasis on Linux and Windows systems
- Experience with OS-level scripting environments such as KSH shell and PowerShell
- Experience with version management tools and CI/CD pipelines
- In-depth knowledge of the TCP/IP protocol suite, security architecture, and securing and hardening operating systems, networks, databases, and applications
- Advanced SQL knowledge and experience working with relational databases, query authoring (SQL), and query performance tuning
- Experience supporting and optimizing data pipelines and data sets
- Knowledge of the Incident Response life cycle
- AWS Solutions Architect certification
- Strong written and verbal communication skills

Posted 1 month ago

Apply

1.0 - 6.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Req ID: 328302 We are currently seeking an AWS Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title: Digital Engineering Sr Associate. NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. Basic Qualifications: 1 year's experience in AWS Infra. Preferred Experience:
- Excellent communication and collaboration skills; AWS certifications are preferred.
- Expertise in AWS EC2: creating, managing, patching, and troubleshooting instances.
- Good knowledge of Identity and Access Management.
- Monitoring tools: CloudWatch (New Relic or other monitoring) and logging.
- AWS storage: EBS, EFS, S3, Glacier; adding and extending disks.
- AWS backup and restoration.
- Strong understanding of networking concepts (VPCs, subnets, ACLs, Security Groups) and security best practices in cloud environments.
- Knowledge of PaaS-to-IaaS migration strategies.
- Scripting experience (must be fluent in a scripting language such as Python).
- Detail-oriented self-starter capable of working independently.
- Knowledge of IaC (Terraform) and best practices.
- Experience with container orchestration utilizing ECS, EKS, Kubernetes, or Docker Swarm.
- Experience with one or more of the following configuration management tools: Ansible, Chef, Salt, Puppet.
- Experience with infrastructure, networking, and AWS databases.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
Bachelor's degree in computer science or a related field. Any of the AWS Associate certifications. GCP Knowledge: Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS, Cloud Billing, Cloud Console, Stackdriver, Cloud SQL, Cloud Spanner, Cloud Bigtable, Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP. Ideal Mindset:
- Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
- Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Listener. You listen to the needs of the customer and make those the priority throughout development.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Noida, Bengaluru

Work from Office

Req ID: 304647 We are currently seeking an AWS Lead Engineer to join our team in Remote, Karnataka (IN-KA), India (IN). Basic Qualifications: 3 years' experience in AWS Infra. Preferred Experience:
- Excellent communication and collaboration skills; AWS certifications are preferred.
- Expertise in AWS EC2: creating, managing, patching, and troubleshooting instances.
- Good knowledge of Identity and Access Management.
- Monitoring tools: CloudWatch (New Relic or other monitoring) and logging.
- AWS storage: EBS, EFS, S3, Glacier; adding and extending disks.
- AWS backup and restoration.
- Strong understanding of networking concepts (VPCs, subnets, ACLs, Security Groups) and security best practices in cloud environments.
- Strong knowledge of PaaS-to-IaaS migration strategies.
- Scripting experience (must be fluent in a scripting language such as Python).
- Detail-oriented self-starter capable of working independently.
- Knowledge of IaC (Terraform) and best practices.
- Experience with container orchestration utilizing ECS, EKS, Kubernetes, or Docker Swarm.
- Experience with one or more of the following configuration management tools: Ansible, Chef, Salt, Puppet.
- Experience with infrastructure, networking, and AWS databases.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
Bachelor's degree in computer science or a related field. Any of the AWS Associate certifications. Ideal Mindset:
- Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
- Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Listener. You listen to the needs of the customer and make those the priority throughout development.

Posted 1 month ago

Apply

1.0 - 6.0 years

1 - 5 Lacs

Noida, Chennai, Bengaluru

Work from Office

Req ID: 328301 We are currently seeking an AWS Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title: Digital Engineering Sr Associate. NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. Basic Qualifications: 1 year's experience in AWS Infra. Preferred Experience:
- Excellent communication and collaboration skills; AWS certifications are preferred.
- Expertise in AWS EC2: creating, managing, patching, and troubleshooting instances.
- Good knowledge of Identity and Access Management.
- Monitoring tools: CloudWatch (New Relic or other monitoring) and logging.
- AWS storage: EBS, EFS, S3, Glacier; adding and extending disks.
- AWS backup and restoration.
- Strong understanding of networking concepts (VPCs, subnets, ACLs, Security Groups) and security best practices in cloud environments.
- Knowledge of PaaS-to-IaaS migration strategies.
- Scripting experience (must be fluent in a scripting language such as Python).
- Detail-oriented self-starter capable of working independently.
- Knowledge of IaC (Terraform) and best practices.
- Experience with container orchestration utilizing ECS, EKS, Kubernetes, or Docker Swarm.
- Experience with one or more of the following configuration management tools: Ansible, Chef, Salt, Puppet.
- Experience with infrastructure, networking, and AWS databases.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
Bachelor's degree in computer science or a related field. Any of the AWS Associate certifications. GCP Knowledge: Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS, Cloud Billing, Cloud Console, Stackdriver, Cloud SQL, Cloud Spanner, Cloud Bigtable, Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP. Ideal Mindset:
- Lifelong Learner. You are always seeking to improve your technical and nontechnical skills.
- Team Player. You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need.
- Listener. You listen to the needs of the customer and make those the priority throughout development.

Posted 1 month ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office

Req ID: 301930 We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Pune, Maharashtra (IN-MH), India (IN). Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Key Responsibilities
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Posted 1 month ago

Apply

5.0 - 10.0 years

13 - 17 Lacs

Pune

Work from Office

Req ID: 301172 We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Pune, Maharashtra (IN-MH), India (IN). Location: Remote. The AWS Lead Engineer will be required to design and build the cloud foundations platform: translating project-specific needs into a cloud structure; designing, when required, a cloud environment that covers all requirements with appropriate weight given to security; carrying out deployment and integration of applications in the designed cloud environment; and understanding the needs of the business/client and implementing cloud strategies that meet those needs. The candidate will also need good experience with software development principles, IaC, and GitHub as DevOps tooling. Provide the necessary design to the team for building cloud infrastructure solutions, and train and guide the team in provisioning, using, and integrating the cloud services proposed in the design. Skills (must haves):
- 5+ years of proficient experience with AWS Cloud (AWS Core)
- 3+ years' relevant experience designing cloud infrastructure solutions and migrating cloud accounts
- Proficient in cloud networking and network configuration
- Proficient in Terraform for managing Infrastructure as Code (module-based provisioning of infra, connectivity, provisioning of data services, monitoring services)
- Proficient in GitHub and implementing CI/CD for infrastructure using IaC with GitHub Actions
- AWS CLI
- Experience working with these AWS services: IAM Accounts, IAM Users & Groups, IAM Roles, access control (RBAC, ABAC), Compute (EC2 instance types and costing), Storage (EBS, EFS, S3, etc.), VPC, VPC Peering, Security Groups, notification and queue services, NACLs, Auto Scaling Groups, CloudWatch, DNS, Application Load Balancer, Directory Services and Identity Federation, AWS Organizations and Control Tower, AWS tagging configuration, Certificate Management
- Monitoring tools such as Amazon CloudWatch, with hands-on CloudWatch Logs experience
Examples of daily activities:
- Account provisioning support
- Policy provisioning
- Network support
- Resource deployment support
- Incident support on daily work
- Security incident support
DevOps experience:
- GitHub and GitHub Actions
- Terraform
- Python
- Go
- Grafana
- ArgoCD
Nice to haves:
- Docker
- Kubernetes
- Able to work in imperative and declarative ways to set up Kubernetes resources/services

Posted 1 month ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are currently seeking a Lead Data Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.
Key Responsibilities
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), Kafka, and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud, and AWS
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD, and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent
- Strong experience in Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience with working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise in technical interface design
- Use of Docker
Responsibilities
- Design and implement scalable data architectures using AWS services, Confluent, and Kafka
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent, and Kafka
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
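The Lambda + SNS pattern named in the responsibilities can be sketched as a plain handler. The event follows the standard SNS-to-Lambda envelope; the message fields themselves are invented for illustration.

```python
# Minimal sketch of an AWS Lambda handler consuming SNS notifications.
# Record shape follows the standard SNS envelope; payload fields are assumed.
import json

def handler(event, context=None):
    """Decode each SNS record's Message and report how many were processed."""
    messages = [json.loads(r["Sns"]["Message"]) for r in event.get("Records", [])]
    return {"processed": len(messages), "messages": messages}

sample_event = {"Records": [{"Sns": {"Message": json.dumps({"order_id": 42, "status": "created"})}}]}
print(handler(sample_event)["processed"])  # 1
```

In a real deployment the handler would hand each decoded message to a downstream pipeline stage rather than returning it.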

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Noida

Work from Office

Req ID: 313916 We are currently seeking an Alation Admin or MSTR Cloud Admin to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
Alation Admin (also known as an Alation Data Catalog Administrator): responsible for managing and maintaining the Alation Data Catalog, a platform that helps organizations discover, understand, and govern their data assets.
1. Platform Administration: Installing, configuring, and maintaining the Alation Data Catalog platform to ensure its optimal performance and reliability.
2. User Management: Managing user access, permissions, and roles within Alation, ensuring proper authentication and authorization for data access.
3. Data Governance: Implementing and enforcing data governance policies, including data classification, data lineage, and data stewardship, to maintain data quality and compliance.
4. Data Catalog Management: Curating and organizing metadata and data assets within the catalog, ensuring accurate and up-to-date information is available to users.
5. Integration: Collaborating with other IT teams to integrate Alation with data sources, databases, data lakes, and other data management systems.
6. Metadata Management: Overseeing the extraction and ingestion of metadata from various data sources into Alation, including data dictionaries, business glossaries, and technical metadata.
7. Security: Implementing and maintaining security measures, such as encryption, access controls, and auditing, to protect sensitive data and catalog information.
8. Training and Support: Providing training to users on how to effectively use the Alation Data Catalog and offering support for catalog-related inquiries and issues.
9. Data Discovery: Assisting users in discovering and accessing data assets within the catalog, promoting self-service data discovery.
10. Collaboration: Collaborating with data owners, data stewards, and data users to understand their data needs and ensure the catalog meets those requirements.
11. Performance Monitoring: Monitoring the performance of the Alation Data Catalog platform, identifying and resolving issues to ensure optimal functionality.
12. Upgrades and Maintenance: Planning and executing platform upgrades and applying patches to stay up to date with Alation releases.
13. Documentation: Maintaining documentation for catalog configurations, processes, and best practices.
14. Reporting and Analytics: Generating reports and insights from Alation to track data usage, data lineage, and user activity.
15. Data Quality: Monitoring and improving data quality within the catalog and assisting in data quality initiatives.
16. Stay Current: Staying informed about Alation updates, new features, and industry best practices in data catalog administration.
An Alation Admin plays a critical role in enabling organizations to effectively manage their data assets, foster data collaboration, and ensure data governance and compliance across the enterprise.
MicroStrategy Cloud Admin: Minimum 4+ years of MSTR administration with the following core attributes:
- Hands-on maintenance and administration experience in the MicroStrategy 10.x Business Intelligence product suite and the AWS Cloud platform
- Experience with enterprise portal integration, mobile integration, write-back to source data based on analysis by business users, and alerts via mail or mobile based on pre-defined events
- Ability to define and review complex metrics
- Ability to architect MSTR cubes for solving complex business problems
- Good conceptual knowledge and working experience of metadata creation (framework models, universes, etc.), creating report specifications, integration test planning and testing, unit test planning and testing, and UAT and implementation support
- Strong knowledge of quality processes: SDLC, review, test, configuration management, release management, defect prevention
- Knowledge of databases is essential, with the ability to review SQL passes and make decisions based on query timings
- Good experience with MicroStrategy upgrades and configurations on Linux
- Hands-on experience creating MSTR deployment packages and Command Manager scripts; setting up and maintaining proper object and data security
- Working experience configuring, maintaining, and administering multiple environments
- Open to working in different shifts as per project need
- Excellent communication skills (written, verbal, teamwork, and issue resolution)
Activities:
- Provide support for MicroStrategy's Business Intelligence product suite, the AWS Cloud platform, and their underlying technologies
- Use your strong communication skills to resolve application and infrastructure situations as they arise
- Perform day-to-day management of the MicroStrategy Cloud infrastructure (on AWS), including alert monitoring/remediation, change management, and incident troubleshooting and resolution
- Participate in scheduled and emergency infrastructure maintenance activities
- Collaborate and communicate effectively with peers, internal application teams, and software development teams
- Maintain high-quality documentation for all related tasks
- Work in a strong team environment
- Independently manage the production MSTR environment (and associated lower environments such as Dev and UAT)
- Manage upgrades and vulnerabilities

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Req ID: 306669 We are currently seeking a Lead Data Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Position Overview: We are seeking a highly skilled and experienced Lead Data/Product Engineer to join our dynamic team. The ideal candidate will have a strong background in streaming services and AWS cloud technology, leading teams and directing engineering workloads. This is an opportunity to work on the core systems supporting multiple secondary teams, so a history in software engineering and interface design would be an advantage.
Key Responsibilities: Lead and direct a small team of engineers engaged in
- Engineering reusable assets for the later build of data products
- Building foundational integrations with Kafka, Confluent Cloud, and AWS
- Integrating with a large number of upstream and downstream technologies
- Providing best-in-class documentation for downstream teams to develop, test, and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD, and IaC for both our own tooling and as templates for downstream teams
Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering
- 3+ years of experience with real-time (or near-real-time) streaming systems
- 2+ years of experience leading a team of data engineers
- A willingness to independently learn a high number of new technologies and to lead a team in learning new technologies
- Experience in AWS cloud services, particularly Lambda, SNS, S3, EKS, and API Gateway
- Strong experience with Python
- Strong experience in Kafka
- Excellent understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts both directly and through documentation
- Strong use of version control and proven ability to govern a team in the best practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best practice use of Agile methodologies
Preferred Skills and Qualifications
- An understanding of cloud networking patterns and practices
- Experience with working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Experience with CI pipelines
- Ability to code in a JVM language
- Understanding of GDPR and the correct handling of PII
- Knowledge of technical interface design
- Basic use of Docker
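The streaming work described above centres on consume-process-commit loops. The sketch below uses a stdlib deque to stand in for a Kafka partition and a stubbed commit, so it illustrates the ordering that gives at-least-once delivery rather than real Confluent client calls.

```python
# Hedged sketch of an at-least-once consume loop; the deque stands in for a
# Kafka partition and commit() is a stub -- not the Confluent client API.
from collections import deque

def consume(partition: deque, process, commit) -> int:
    """Process each record, then commit the offset. Committing only after
    processing makes redelivery (not data loss) the failure mode."""
    offset = 0
    while partition:
        record = partition.popleft()
        process(record)   # side effect first...
        offset += 1
        commit(offset)    # ...then advance the committed offset
    return offset

events = deque(["payment.created", "payment.settled"])
seen = []
print(consume(events, seen.append, lambda o: None), seen)
# prints: 2 ['payment.created', 'payment.settled']
```

Committing before processing would flip the guarantee to at-most-once; which side of that trade-off a data product wants is exactly the kind of decision this role documents for downstream teams.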

Posted 1 month ago

Apply

5.0 - 8.0 years

0 - 0 Lacs

Hyderabad

Work from Office

Business Analyst: Must have strong communication skills and a good, quick understanding of the product vision and domain. AWS Cloud knowledge is good to have. Location: Hyderabad. Experience: 5 to 8 years. Notice period: immediate joiners.

Posted 1 month ago

Apply

6.0 - 10.0 years

13 - 20 Lacs

Bengaluru

Hybrid

Job Description
- 6+ years of experience in backend development using Java.
- Strong expertise in Spring Boot, Spring Cloud, and building microservices.
- Experience with REST APIs, JSON, and API integration.
- Good knowledge of AWS services for deployment, storage, and compute.
- Familiarity with CI/CD pipelines and tools like Jenkins, Git, and Maven/Gradle.
- Understanding of containerization using Docker and orchestration with Kubernetes (nice to have).
- Experience with relational and NoSQL databases (e.g., MySQL, PostgreSQL, DynamoDB, MongoDB).
- Solid understanding of application performance monitoring and logging tools.

Posted 1 month ago

Apply