GCP Data Engineer - Remote (Contract)

Experience: 0 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode: Remote

Job Type: Part Time

Job Description

Company Description

ThreatXIntel is a cybersecurity startup specializing in protecting businesses and organizations from cyber threats. Our services include cloud security, web and mobile security testing, cloud security assessments, and DevSecOps. We prioritize affordable solutions tailored to the specific needs of each client, regardless of size. Our proactive approach to security involves continuous monitoring and testing to identify vulnerabilities before they can be exploited.


Role Description

GCP Data Engineer with a primary focus on BigQuery.

Key Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using GCP Composer (Apache Airflow); an illustrative sketch follows this list
  • Work with BigQuery, Cloud SQL, and PostgreSQL to manage and optimize data storage and retrieval
  • Build automation scripts and data transformations using Python (PySpark knowledge is a strong plus)
  • Optimize queries for large-scale, distributed data processing systems
  • Collaborate with cross-functional teams to translate business and analytics requirements into scalable technical solutions
  • Support data ingestion from multiple structured and semi-structured sources including Hive, MySQL, and NoSQL databases
  • Apply HDFS and distributed file system experience where necessary
  • Ensure data quality, reliability, and consistency across platforms
  • Provide ongoing maintenance and support for deployed pipelines and services
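
For illustration only: a minimal Cloud Composer (Apache Airflow) DAG sketch for the kind of pipeline described above, loading daily files from Cloud Storage into BigQuery. All project, bucket, dataset, and table names are hypothetical placeholders, not part of this role's actual stack.

    # Minimal Cloud Composer (Airflow 2.x) DAG: load daily CSVs from GCS
    # into BigQuery. All names below are invented for illustration.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_sales_load",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_to_bq = GCSToBigQueryOperator(
            task_id="gcs_to_bigquery",
            bucket="example-landing-bucket",          # hypothetical bucket
            source_objects=["sales/{{ ds }}/*.csv"],  # one folder per run date
            destination_project_dataset_table="example-project.analytics.sales",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_APPEND",
        )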

Required Qualifications

  • Strong hands-on experience with GCP services, particularly:
      • BigQuery
      • Cloud Composer (Apache Airflow)
      • Cloud SQL / PostgreSQL
  • Proficiency in Python for scripting and data pipeline development (see the query sketch after this list)
  • Experience in designing and optimizing high-volume data processing workflows
  • Good understanding of distributed systems, HDFS, and parallel processing frameworks
  • Strong analytical and problem-solving skills
  • Ability to work independently and collaborate across remote teams
  • Excellent communication skills for technical and non-technical audiences
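
As a sketch of the Python-plus-BigQuery proficiency listed above, the snippet below runs a parameterized query against a date-partitioned table so that only the needed partitions are scanned, one common query-optimization technique. Project, dataset, table, and column names are hypothetical.

    # Query a date-partitioned BigQuery table from Python, filtering on the
    # partition column to limit the bytes scanned. Names are invented.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    query = """
        SELECT customer_id, SUM(amount) AS total
        FROM `example-project.analytics.sales`
        WHERE sale_date BETWEEN @start AND @end  -- prunes partitions
        GROUP BY customer_id
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
            bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
        ]
    )
    for row in client.query(query, job_config=job_config).result():
        print(row.customer_id, row.total)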

Preferred Skills

  • Knowledge of PySpark for big data processing (a brief sketch follows this list)
  • Familiarity with Hive, MySQL, and NoSQL databases
  • Experience with Java in a data engineering context
  • Exposure to data governance, access control, and cost optimization on GCP
  • Prior experience in a contract or freelance capacity with enterprise clients
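
A brief PySpark sketch of the kind of big-data transformation mentioned above, assuming a Dataproc-style environment where the spark-bigquery connector is available. Bucket and table names are hypothetical.

    # Aggregate raw order files and write the result to BigQuery via the
    # spark-bigquery connector (assumed available, e.g. on Dataproc).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

    orders = spark.read.option("header", True).csv("gs://example-landing-bucket/orders/")
    daily = (
        orders.withColumn("amount", F.col("amount").cast("double"))
        .groupBy("order_date")
        .agg(F.sum("amount").alias("revenue"))
    )
    (
        daily.write.format("bigquery")
        .option("table", "example-project.analytics.daily_revenue")
        .option("temporaryGcsBucket", "example-temp-bucket")  # staging bucket
        .mode("overwrite")
        .save()
    )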

