
2 AWS Snowflake Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

Join a dynamic team shaping the tech backbone of our operations, where your expertise fuels seamless system functionality and innovation.

Responsibilities
Analyze and troubleshoot production application flows to ensure end-to-end application or infrastructure service delivery supporting the firm's business operations. Improve operational stability and availability through participation in problem management. Monitor production environments for anomalies and address issues using standard observability tools. Assist in escalating and communicating issues and solutions to business and technology stakeholders. Identify trends and help manage incidents, problems, and changes in support of full-stack technology systems, applications, and infrastructure.

Required Qualifications
A minimum of 2 years of experience, or equivalent expertise, in troubleshooting, resolving, and maintaining information technology services. Proficiency with AWS Snowflake, AWS Splunk, and Oracle Database, including experience writing and modifying complex SQL queries. Strong communication, organizational, and time management skills. Knowledge of applications or infrastructure in a large-scale technology environment, on premises or in the public cloud. Exposure to observability and monitoring tools and techniques, and familiarity with processes within the Information Technology Infrastructure Library (ITIL) framework. Prior experience in a customer- or client-facing role is advantageous.

Preferred Qualifications
Knowledge of one or more general-purpose programming languages or automation scripting. Experience with help desk ticketing systems, and the ability to influence and lead technical conversations with other resolver groups as directed. Experience with Large Language Models (LLMs) and agentic AI would be considered advantageous.

Posted 4 days ago


7.0 - 12.0 years

0 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role: Data Engineer
Experience: 8+ years
Location: Hyderabad, Bengaluru, Chennai, Pune, Ahmedabad and Noida
Notice Period: 30 days

What You'll Do
Provide the organization's data consumers with high-quality data sets through curation, consolidation, and manipulation of a wide variety of large-scale (terabyte and growing) sources. Build first-class data products and ETL processes that interact with terabytes of data on leading platforms such as Snowflake and BigQuery. Partner with our Analytics, Product, CRM, and Marketing teams. Own the data pipelines' SLAs and dependency management. Write technical documentation for data solutions and present at design reviews. Resolve data pipeline failures and implement anomaly detection. Work with teams across Data Science, Product, Marketing, and software engineering on data solutions and technical challenges. Mentor junior members of the team.

What We Seek

Education and Work Experience
Bachelor's degree in Computer Science or a related field. 8+ years of experience in commercial data engineering or software development.

Tech Experience
Experience with big data technologies such as Snowflake, Databricks, and PySpark. Expert-level skills in writing and optimizing complex SQL; advanced data exploration skills with a proven record of querying and analyzing large datasets. Solid experience developing complex ETL processes from concept through implementation, deployment, and operations, including SLA definition, performance measurement, and monitoring. Hands-on knowledge of the modern AWS data ecosystem, including AWS S3. Experience with relational databases such as Postgres, and with programming languages such as Python and/or Java. Knowledge of cloud data warehouse concepts. Experience building and operating data pipelines and products in line with the data mesh philosophy would be beneficial. Demonstrated proficiency in data management practices, including data lineage, data quality, data observability, and data discoverability.

Communication/People Skills
Excellent verbal and written communication skills. Ability to convey key insights from complex analyses in summarized business terms to non-technical stakeholders, and to communicate effectively with other technical teams. Strong interpersonal skills and the ability to work in a fast-paced, dynamic environment. Ability to make progress on projects independently and enthusiasm for solving difficult problems.

Posted 2 months ago



Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
