Job Opening for Data Engineer (BigQuery/Snowflake) @ GlobalData - Hyd

Experience: 3 - 6 years

Salary: 15 - 20 Lacs

Posted: 2 weeks ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Hello,

Urgent job openings for the Data Engineer role @ GlobalData (Hyd).

If the requirement matches your profile and you are interested in applying, please share your updated resume at m.salim@globaldata.com.

Mention the subject line:

Applying for Data Engineer @ GlobalData (Hyd)

Share the following details in the mail:

Full Name :

Mobile # :

Qualification :

Company Name :

Designation :

Total Work Experience Years :

How many years of experience working on Snowflake/Google BigQuery :

Current CTC :

Expected CTC :

Notice Period :

Current Location/willing to relocate to Hyd? :


We are looking for a skilled and experienced Data Delivery Specification (DDS) Engineer to join our data team. The DDS Engineer will be responsible for designing, developing, and maintaining robust data pipelines and delivery mechanisms, ensuring timely and accurate data delivery to various stakeholders. This role requires strong expertise in cloud data platforms such as AWS, Snowflake, and Google BigQuery, along with a deep understanding of data warehousing concepts.

Key Responsibilities

  • Design, develop, and optimize data pipelines for efficient data ingestion, transformation, and delivery from various sources to target systems.
  • Implement and manage data delivery solutions using cloud platforms like AWS (S3, Glue, Lambda, Redshift), Snowflake, and Google BigQuery (see the illustrative sketch after this list).
  • Collaborate with data architects, data scientists, and business analysts to understand data requirements and translate them into technical specifications.
  • Develop and maintain DDS documents, outlining data sources, transformations, quality checks, and delivery schedules.
  • Ensure data quality, integrity, and security throughout the data lifecycle.
  • Monitor data pipelines, troubleshoot issues, and implement solutions to ensure continuous data flow.
  • Optimize data storage and query performance on cloud data warehouses.
  • Implement automation for data delivery processes and monitoring.
  • Stay current with new data technologies and best practices in data engineering and cloud platforms.
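
As a purely illustrative example of the kind of pipeline work described above (not part of the job description), the sketch below shows a minimal Python job that ingests a CSV from S3, applies basic quality checks, and appends the result to a BigQuery table. The bucket, key, column, and table names are hypothetical placeholders.

    import io

    import boto3
    import pandas as pd
    from google.cloud import bigquery

    def run_pipeline(bucket: str, key: str, table_id: str) -> None:
        """Ingest a CSV from S3, clean it, and append it to a BigQuery table."""
        # Extract: read the raw CSV object from S3.
        s3 = boto3.client("s3")
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))

        # Transform: basic quality checks before delivery.
        df = df.drop_duplicates()
        df = df.dropna(subset=["id"])  # assumes a placeholder "id" column

        # Load: append the cleaned frame to the target table.
        client = bigquery.Client()
        job = client.load_table_from_dataframe(
            df,
            table_id,  # e.g. "project.dataset.table" (placeholder)
            job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
        )
        job.result()  # block until the load job finishes

    # Hypothetical usage:
    # run_pipeline("raw-exports", "daily/customers.csv", "my_project.analytics.customers")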

Required Skills & Qualifications

  • Bachelor's or Master's degree in Computer Science, Data Engineering, or a related quantitative field.
  • 4+ years of experience in data engineering, with a focus on data delivery and warehousing.
  • Proven experience with cloud data platforms, specifically:
    • AWS: S3, Glue, Lambda, Redshift, or other relevant data services.
    • Snowflake: Strong experience with data warehousing, SQL, and performance optimization.
    • Google BigQuery: Experience with data warehousing, SQL, and data manipulation.
  • Proficient in SQL for complex data querying, manipulation, and optimization.
  • Experience with scripting languages (e.g., Python) for data pipeline automation (see the sketch after this list).
  • Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
  • Experience with version control systems (e.g., Git).
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
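
Likewise, a minimal sketch of the Python-based pipeline automation mentioned above, assuming the snowflake-connector-python package and placeholder account, stage, and table names: it copies staged files into a Snowflake table and runs a simple post-load quality check.

    import snowflake.connector

    # Placeholder credentials; in practice these would come from a secrets
    # manager or environment variables, never hard-coded values.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    try:
        cur = conn.cursor()
        # ELT load: copy staged files into a target table (placeholder names).
        cur.execute(
            "COPY INTO STAGING.CUSTOMERS "
            "FROM @RAW_STAGE/customers/ "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Simple post-load quality check before downstream delivery.
        cur.execute(
            "SELECT COUNT(*) FROM STAGING.CUSTOMERS WHERE CUSTOMER_ID IS NULL"
        )
        null_ids = cur.fetchone()[0]
        if null_ids:
            raise ValueError(f"{null_ids} rows loaded without a CUSTOMER_ID")
    finally:
        conn.close()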

Thanks & Regards,

Salim

(Human Resources)
