Senior Data Architect

8 years


Posted: 1 day ago | Platform: LinkedIn


Work Mode

Remote

Job Type

Full Time

Job Description

Job Title: Senior Data Architect – Fintech Data Lakes

Location:

Department:

Reports to:


Role Highlights

  • Senior-level technical architect, strong cloud experience (GCP + Azure)
  • Specialized in data lakes, compliance, and real-time & batch pipelines
  • Financial services / fintech domain knowledge (e.g., ledgers, payment rails, PII compliance)
  • Expertise in SQL + Python/Scala/Java
  • Mentoring, governance, and cross-functional advisory

 

Factors Affecting Range

  • Strong cloud certifications (GCP, Azure Architect)
  • Deep domain in compliance frameworks (PCI, SOX, GLBA)
  • Hands-on vs. purely strategic


 

About the Role

We are seeking a highly experienced Senior Data Architect to lead the architecture and governance of our fintech data platforms, spanning Google Cloud Platform (GCP) for real-time production systems and Azure for regulatory and business reporting. This role is critical to building secure, governed, and scalable data lakes that support both operational finance systems and strategic analytics. 

You will be responsible for designing robust data architectures that ingest, process, and govern both structured data (e.g., transactions, accounts, ledgers) and unstructured data (e.g., scanned documents, KYC images, PDFs, voice logs)—ensuring compliance with financial regulations and enabling insights across the organization. 

 

Key Responsibilities

Data Lake & Architecture Strategy

  • Architect and maintain GCP-based production data lakes for real-time transactional ingestion and processing (e.g., payment processing, KYC, fraud detection). 
  • Design Azure-based reporting data lakes for BI, regulatory, and financial reporting workloads (e.g., ledger audits, compliance reports). 
  • Build multi-zone lake structures (raw, refined, curated) across both clouds, incorporating schema evolution, data contracts, and role-based access control. 
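As a rough illustration of the multi-zone lake structure described above, the sketch below models raw/refined/curated zones with simple role-based read checks. The bucket name, domain names, and role model are hypothetical assumptions for illustration, not the actual platform configuration.

```python
# Hypothetical sketch of a multi-zone lake layout (raw -> refined -> curated)
# with simple role-based access checks. All names here are illustrative.

ZONES = ("raw", "refined", "curated")

# Assumed role model: which roles may read each zone. Raw stays restricted;
# curated is the broadly consumable layer.
ZONE_READERS = {
    "raw": {"data-engineer"},
    "refined": {"data-engineer", "analyst"},
    "curated": {"data-engineer", "analyst", "bi-reporting"},
}

def object_path(cloud: str, zone: str, domain: str, dataset: str) -> str:
    """Build a zone-prefixed object path, e.g. gs://fintech-lake/raw/payments/txns."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    scheme = {"gcp": "gs", "azure": "abfss"}[cloud]
    return f"{scheme}://fintech-lake/{zone}/{domain}/{dataset}"

def can_read(role: str, zone: str) -> bool:
    """Check whether a role is allowed to read from a zone."""
    return role in ZONE_READERS.get(zone, set())
```

In practice the zone prefix convention keeps access policy enforceable per layer on both clouds, while schema evolution and data contracts are applied as data is promoted from raw toward curated.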

Financial Data Modeling & Pipeline Design

  • Model financial datasets (ledger data, user profiles, transactions, pricing) using dimensional, normalized, and vault approaches. 
  • Build and optimize real-time and batch pipelines with GCP (BigQuery, Pub/Sub, Dataflow) and Azure (Data Factory, Synapse, ADLS Gen2). 
  • Enable unified analytics on structured data (MySQL) and unstructured content (OCR’d documents, audio transcripts, logs). 
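To make the ledger-modeling responsibility concrete, here is a minimal sketch of a double-entry balance check: before a ledger batch is promoted, every transaction's debits and credits must net to zero. The field names (`txn_id`, `amount`) are illustrative assumptions, not the actual schema.

```python
from decimal import Decimal

def unbalanced_transactions(entries):
    """Return the sorted txn_ids whose ledger entries do not sum to zero.

    Each entry is assumed to be a dict with a "txn_id" and a signed "amount"
    string; Decimal is used to avoid float rounding on monetary values.
    """
    totals = {}
    for e in entries:
        totals[e["txn_id"]] = totals.get(e["txn_id"], Decimal("0")) + Decimal(e["amount"])
    return sorted(txn for txn, total in totals.items() if total != 0)
```

A check like this would typically run as a data-quality gate in the pipeline, quarantining unbalanced batches rather than loading them into the curated zone.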

Compliance, Governance & Risk Controls

  • Implement data access, retention, and classification policies that meet regulatory requirements (GLBA, PCI-DSS, SOX, GDPR). 
  • Collaborate with infosec, legal, and audit teams to ensure auditability and lineage tracking across data flows. 
  • Define controls for PII, financial data sensitivity, and third-party data sharing. 
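As one concrete form such PII controls can take, the sketch below tokenizes designated PII columns with a salted hash before records leave a restricted zone. The column names and the hard-coded salt are illustrative assumptions; a production system would source the salt or key from a managed secret store (e.g., a KMS), not from code.

```python
import hashlib

# Assumed set of column names classified as PII; illustrative only.
PII_COLUMNS = {"ssn", "email", "phone"}

def mask_record(record: dict, salt: str = "demo-salt") -> dict:
    """Return a copy of the record with PII columns replaced by short tokens.

    Tokens are deterministic (same input -> same token), so joins on masked
    columns still work, but the original value is not recoverable.
    """
    masked = {}
    for key, value in record.items():
        if key in PII_COLUMNS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            masked[key] = digest[:12]  # short, irreversible token
        else:
            masked[key] = value
    return masked
```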

Cross-Functional Enablement

  • Serve as a technical advisor to business and compliance teams for data design and provisioning. 
  • Mentor data engineers and analysts on financial data structures, accuracy, and business rules. 
  • Help define enterprise standards for metadata, data cataloging, and data quality monitoring using tools like Azure Purview and GCP Data Catalog. 

 

Required Qualifications

  • 8+ years in data architecture, with significant experience in financial services, fintech, or banking environments. 
  • Strong experience with Google Cloud Platform (BigQuery, Dataflow, Cloud Storage, Pub/Sub) and Azure Data Lake / Synapse Analytics. 
  • Deep understanding of financial data modeling, including ledgers, double-entry accounting, payment rails, and regulatory audit structures. 
  • Experience with batch and streaming architectures, including handling of high-velocity financial transactions. 
  • Proficient in SQL and at least one programming language (Python, Scala, or Java). 
  • Strong understanding of data compliance frameworks, particularly in regulated financial environments. 

 


Preferred Qualifications

  • Prior experience with data lakehouse design using Delta Lake, Iceberg, or BigLake. 
  • Experience integrating data platforms with BI/reporting tools like Power BI, Looker, Tableau, or internal compliance dashboards. 
  • Familiarity with fraud analytics, anti-money laundering (AML) data flows, or KYC enrichment pipelines. 
  • Proficiency in Python programming, web scraping, API integration, data analysis, machine learning, and Linux. 

 

What Success Looks Like

  • A resilient, compliant, and scalable data foundation that enables accurate financial operations and reporting. 
  • Efficient, observable data pipelines with proactive data quality monitoring and failure alerting. 
  • High trust in the reporting data lake from internal audit, compliance, and executive stakeholders. 
  • A streamlined data access and provisioning process that supports agility while meeting governance requirements. 

