GCP - Data Solution Architect

8 - 13 years

30 - 45 Lacs

Posted: 3 days ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Data Solutions Architect - Google Cloud Platform (GCP)

Role & responsibilities

  • Lead Data Architecture on GCP: Architect and design end-to-end data solutions leveraging a wide array of GCP services including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Composer, and Data Catalog.
  • Data Platform Development & Optimization: Design, build, and maintain scalable, robust, and secure data platforms for ingestion, processing, storage, and analytics of all data types. Continuously optimize data warehouse and lakehouse performance through advanced schema design, partitioning, clustering, and query optimization techniques in BigQuery.
  • Solution Design & Implementation: Translate complex business requirements into clear technical architectures, detailed data models, and implementation plans, ensuring alignment with organizational standards and best practices.
  • Integration & ETL/ELT Pipelines: Develop, optimize, and orchestrate robust data pipelines using GCP-native tools like Dataflow, Dataproc, Dataprep, and Cloud Composer (Apache Airflow), along with integrating third-party ETL/ELT solutions.
  • Data Governance & Security: Define, implement, and enforce comprehensive data governance, security, and compliance frameworks across all GCP data assets, utilizing services such as IAM, encryption, Cloud DLP, VPC Service Controls, and audit logging.
  • Cloud Migration Leadership: Lead and support the migration of legacy data platforms to GCP, ensuring seamless transition, minimal disruption, and adherence to enterprise standards.
  • Technical Leadership & Mentorship: Provide strong technical leadership, guidance, and mentorship to data engineering teams, fostering adoption of best practices for data solution design, development, and deployment on GCP.
  • Innovation & Best Practices: Stay at the forefront of the latest GCP data technologies and industry trends. Champion and implement innovative solutions, architectural patterns, and best practices for cloud data architecture and engineering.
  • Collaboration & Stakeholder Management: Collaborate effectively with data engineers, analysts, data scientists, and business stakeholders at all levels to deliver high-quality, impactful data solutions.
  • Quality Assurance & Observability: Establish and enforce robust data validation, monitoring, and quality assurance processes to maintain high data integrity. Implement comprehensive monitoring, logging, and alerting solutions using tools like Cloud Monitoring and Cloud Logging.
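
The quality-assurance responsibility above centers on automated data validation. As an illustrative, simplified sketch of the kind of batch-level check such a process might run (the field names, rule set, and threshold below are hypothetical, not from this posting):

```python
from typing import Any

# Hypothetical rule set; a real pipeline would load these from configuration.
REQUIRED_FIELDS = ["event_id", "event_ts", "user_id"]

def validate_batch(rows: list[dict[str, Any]], max_null_rate: float = 0.01) -> dict:
    """Check a batch of records: required fields present, null rate bounded.

    Returns a small report dict that a monitoring/alerting layer could consume.
    """
    # Count rows missing (or holding None in) any required field.
    missing = sum(
        1 for row in rows
        if any(row.get(field) is None for field in REQUIRED_FIELDS)
    )
    null_rate = missing / len(rows) if rows else 0.0
    return {
        "rows": len(rows),
        "rows_with_missing_fields": missing,
        "null_rate": null_rate,
        "passed": null_rate <= max_null_rate,
    }
```

In practice a report like this would feed Cloud Monitoring metrics or alerting, but the validation logic itself is plain Python and independent of any GCP service.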

Required Technical and Professional Expertise

  • Architectural Acumen:

    Deep understanding of cloud-native data architectures, distributed systems, streaming and batch data processing, and event-driven design patterns.
  • Problem-Solving & Analytical:

    Strong analytical mindset with exceptional problem-solving skills, meticulous attention to detail, and a bias towards performance, scalability, and cost optimization.
  • Communication & Influence:

    Excellent communication skills with the proven ability to present complex technical concepts clearly and concisely to diverse audiences, including technical teams, business stakeholders, and executive leadership, while effectively influencing technology direction.
  • Leadership & Collaboration:

    Strong leadership capabilities, demonstrated experience leading cross-functional technical teams, fostering collaboration, and driving consensus on architectural decisions.
  • Adaptability & Drive:

    Proven ability to manage multiple, concurrent priorities effectively in a fast-paced, dynamic environment with minimal supervision. A self-starter with a passion for continuous learning and innovation in the data and cloud space.
  • Security & Compliance:

    Expertise in data security, encryption, and compliance frameworks (e.g., GDPR, HIPAA, SOX) within a cloud environment.

Preferred Technical and Professional Experience

  • Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field (or equivalent practical experience).
  • Experience:

    • 8+ years of extensive experience in data engineering, data architecture, or analytics.
    • At least 3 years in a dedicated data architect or solutions architect role.
    • Minimum of 5 years of hands-on experience designing and implementing enterprise-scale data solutions specifically on Google Cloud Platform.
  • GCP Expertise:

    Proven expert-level proficiency with core and advanced GCP data services, including but not limited to:
    • BigQuery (advanced SQL, optimization, data modeling, partitioning, clustering)
    • Cloud Storage (data lake design, lifecycle management)
    • Dataflow (Apache Beam for batch and streaming processing)
    • Dataproc (managed Apache Spark/Hadoop)
    • Pub/Sub (real-time messaging)
    • Cloud Composer (Apache Airflow for workflow orchestration)
    • Data Catalog (metadata management, data discovery)
    • Cloud Functions / Cloud Run (serverless compute for data processing)
    • Looker (BI integration)
  • Technical Proficiency:
    • Exceptional proficiency in SQL (with a focus on BigQuery optimization) and Python (including libraries like Pandas, NumPy, etc.).
    • Extensive experience with data modeling techniques (dimensional, Kimball, Inmon) and designing data warehousing/data lakehouse architectures.
    • Hands-on experience with ETL/ELT tools, orchestration frameworks, and API-driven data integration.
    • Proficiency with Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager for provisioning GCP resources.
    • Familiarity with event-driven architectures and messaging systems (e.g., Kafka).
    • Understanding of containerization technologies (Docker, Kubernetes, GKE) and CI/CD pipelines (e.g., Cloud Build, Cloud Deploy) for data workloads.
    • Exposure to NoSQL databases (e.g., Firestore, Bigtable, MongoDB) and various file formats (JSON, Avro, Parquet).
    • Knowledge of machine learning workflows and MLOps practices on GCP (e.g., Vertex AI) is a plus.
  • Certifications (Highly Preferred): Google Cloud Professional Data Engineer and/or Google Cloud Professional Cloud Architect.
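
To illustrate the BigQuery partitioning concept the expertise list refers to: daily partitioning keys a table on the UTC date of a timestamp column. A minimal, stdlib-only sketch of deriving that partition id from event timestamps (the function name and return format are assumptions for illustration, not a GCP API):

```python
from datetime import datetime, timezone

def partition_key(event_ts_iso: str) -> str:
    """Derive a daily partition id (YYYYMMDD, UTC) from an ISO-8601 timestamp,
    mirroring how a BigQuery table partitioned on DATE(event_ts) buckets rows."""
    ts = datetime.fromisoformat(event_ts_iso)
    if ts.tzinfo is None:
        # Assumption for this sketch: treat naive timestamps as UTC.
        ts = ts.replace(tzinfo=timezone.utc)
    return ts.astimezone(timezone.utc).strftime("%Y%m%d")
```

Note that a timestamp late in the evening in an offset timezone (e.g. `+05:30`) can land in the same UTC day, while an early-morning one can fall into the previous UTC day's partition; this edge case is a common source of partition-pruning surprises.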
