
1102 BigQuery Jobs - Page 36

JobPe aggregates results for easy access, but applications are made directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: No Function Specialty
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, creating efficient and scalable solutions using Google BigQuery. Your typical day will involve collaborating with the team, analyzing business requirements, designing and implementing application features, and ensuring the applications meet quality standards and performance goals.

Roles & Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services like BigQuery, Composer, etc.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (Extract, Transform, Load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills/tools used in the data space (e.g., dbt, Monte Carlo).
9. Excellent verbal and written communication skills; excellent analytical skills with an Agile mindset.
10. Strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Work effectively in a globally distributed environment.

Professional & Technical Skills:
Skill proficiency expectation:
- Expert: Data Storage, BigQuery, SQL, Composer, Data Warehousing concepts
- Intermediate: Python
- Basic/preferred: DB, Kafka, Pub/Sub

Must-have skills: Proficiency in Google BigQuery. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 5 years of experience in Google BigQuery. This position is based at our Hyderabad office. A 15-year full-time education is required.
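For context on the kind of BigQuery pipeline work this role describes, a minimal sketch using the official google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical placeholders:

```python
# Minimal sketch: running an aggregation query with the BigQuery Python client.
# Project, dataset, and table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT order_date, SUM(amount) AS daily_total
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():  # result() blocks until the job finishes
    print(row.order_date, row.daily_total)
```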

Posted 1 month ago

Apply

6.0 - 8.0 years

10 - 12 Lacs

Pune

Remote

Job Responsibilities:
- Design and develop dashboards: Create visually appealing and interactive dashboards that help users quickly grasp critical insights from data.
- Data integration: Connect Tableau to various data sources (Google BigQuery, Azure Synapse, etc.), ensuring data accuracy and integrity.
- Performance optimization: Improve load times and responsiveness of Tableau dashboards and reports.
- Data analysis: Analyse and interpret data to create meaningful visualizations.
- Collaboration: Work closely with stakeholders to understand and translate business requirements into functional specifications.

Skills:
- Proficiency in Tableau: Strong understanding of Tableau Desktop and Tableau Server and their respective functionalities. Minimum 5 years of experience in Tableau.
- JavaScript: Must have experience customising Tableau dashboards (handling of events, filters, etc.) using the Tableau Embedding API in JavaScript.
- Technical skills: Knowledge of SQL and experience connecting to a data warehouse.
- Cloud: Experience working in a cloud-based environment.
- Communication: Excellent communication skills to effectively collaborate with stakeholders and present data insights.
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
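This posting centers on the JavaScript Embedding API, but much Tableau Server automation is also done in Python with the official tableauserverclient library. A minimal sketch, assuming a hypothetical server URL and credentials:

```python
# Sketch: signing in to Tableau Server and listing workbooks with the
# official tableauserverclient library. Server URL and credentials are hypothetical.
import tableauserverclient as TSC

tableau_auth = TSC.TableauAuth("analyst", "secret", site_id="")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    workbooks, pagination_item = server.workbooks.get()
    for wb in workbooks:
        print(wb.name, wb.project_name)
```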

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Roles: GCP Looker Developer, GCP Business Data Analyst, Data Engineer (GCP - BigQuery), GCP Application Developer.
Required candidate profile: good knowledge of GCP Looker development, GCP business data analysis, and GCP/BigQuery data engineering.

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Chennai, Bengaluru

Work from Office

We are looking for a Senior GCP Data Engineer / GCP Technical Lead with strong expertise in Google Cloud Platform (GCP), Apache Spark, and Python to join our growing data engineering team. The ideal candidate will have extensive experience working with GCP data services and should be capable of leading technical teams, designing robust data pipelines, and interacting directly with clients to gather requirements and ensure project delivery.

Project duration: 1 year, extendable.

Role & responsibilities:
- Design, develop, and deploy scalable data pipelines and solutions using GCP services like Dataproc and BigQuery.
- Lead and mentor a team of data engineers to ensure high-quality deliverables.
- Collaborate with cross-functional teams and client stakeholders to define technical requirements and deliver solutions aligned with business goals.
- Optimize data processing and transformation workflows for performance and cost-efficiency.
- Ensure adherence to best practices in cloud data architecture, data security, and governance.

Mandatory skills:
- Google Cloud Platform (GCP), especially Dataproc and BigQuery
- Apache Spark
- Python programming

Preferred skills:
- Experience working with large-scale data processing frameworks.
- Exposure to DevOps/CI-CD practices in a cloud environment.
- Hands-on experience with other GCP tools like Cloud Composer, Pub/Sub, or Cloud Storage is a plus.

Soft skills:
- Strong communication and client interaction skills.
- Ability to work independently and as part of a distributed team.
- Excellent problem-solving and team management capabilities.
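As an illustration of the Dataproc/BigQuery work mentioned above, a minimal PySpark sketch using the spark-bigquery connector (assumed to be available on the cluster); the table and bucket names are hypothetical:

```python
# Sketch: a Dataproc-style PySpark job reading from and writing to BigQuery
# via the spark-bigquery connector. Table and bucket names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bq-aggregation").getOrCreate()

events = (
    spark.read.format("bigquery")
    .option("table", "my-project.analytics.events")
    .load()
)

daily = events.groupBy(F.to_date("event_ts").alias("day")).count()

(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_counts")
    .option("temporaryGcsBucket", "my-staging-bucket")  # needed for indirect writes
    .mode("overwrite")
    .save()
)
```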

Posted 1 month ago

Apply

5.0 - 8.0 years

1 - 1 Lacs

Bengaluru

Remote

Note: BigQuery and JDK experience is a must for this role.
Scope of the project:
• Strong proficiency in Java (including reading and writing complex code)
• Experience with Spring Framework and MVC architecture
• Proficient in Maven, Tomcat, and Java 17

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Kolkata

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: SAP Ariba
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring that business operations run smoothly and efficiently. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering exceptional service to stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor system performance and proactively address potential issues.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Ariba.
- Strong understanding of application support processes and methodologies.
- Experience with troubleshooting and resolving software issues.
- Familiarity with system integration and data flow management.
- Ability to work collaboratively in a team-oriented environment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Ariba.
- This position is based at our Kolkata office.
- A 15-year full-time education is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 22 Lacs

Chennai, Bengaluru

Work from Office

• Minimum 4+ years of experience implementing data migration programs from Hadoop (with Java and Spark) to GCP BigQuery and Dataproc
• Minimum 4+ years of experience integrating GitHub Actions plugins into a CI/CD platform to ensure software quality
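A common landing step in such Hadoop-to-BigQuery migrations is exporting HDFS data as Parquet to Cloud Storage and loading it into BigQuery. A minimal sketch with hypothetical bucket and table names:

```python
# Sketch: loading Parquet files (e.g., exported from HDFS) from Cloud Storage
# into BigQuery. Bucket and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-migration-bucket/exports/orders/*.parquet",
    "my-project.warehouse.orders",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish
print(client.get_table("my-project.warehouse.orders").num_rows, "rows loaded")
```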

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements, developing innovative solutions to enhance user experience and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to design and develop applications.
- Implement best practices for application development.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated with the latest technologies and trends in application development.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Platform Architecture.
- Strong understanding of cloud computing principles.
- Experience designing scalable and secure cloud-based applications.
- Hands-on experience with Google Cloud services such as Compute Engine, BigQuery, and Cloud Storage.
- Knowledge of DevOps practices for continuous integration and deployment.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Platform Architecture.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good to have: SAP ABAP, CDS views.
- Strong understanding of data modeling concepts and best practices.
- Experience with application design methodologies and tools.
- Ability to analyze and interpret complex business requirements.
- Familiarity with integration techniques and data flow management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Remote

Job Description: We are seeking experienced Python developers with hands-on expertise in Google Cloud Vertex AI. The ideal candidate will have a strong background in machine learning model development, deployment pipelines, and cloud-native applications.

Key skills:
- Advanced proficiency in Python
- Experience with Vertex AI (training, deployment, pipelines, model registry)
- Familiarity with Google Cloud Platform (GCP) services like BigQuery, Cloud Functions, and AI Platform
- Understanding of the ML lifecycle, including data preprocessing, training, evaluation, and monitoring
- CI/CD experience with ML workflows (e.g., Kubeflow, TFX, or Vertex Pipelines)

Preferred:
- Experience integrating Vertex AI with dbt, Airflow, or Looker
- Exposure to MLOps and model governance
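For illustration, a minimal sketch of submitting a custom training job with the Vertex AI Python SDK; the project, bucket, script, and prebuilt container URI are assumptions to adapt:

```python
# Sketch: submitting a Vertex AI custom training job with the Python SDK.
# Project, bucket, script, and container URI are hypothetical/assumed values.
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",
    location="us-central1",
    staging_bucket="gs://my-staging-bucket",
)

job = aiplatform.CustomTrainingJob(
    display_name="churn-trainer",
    script_path="train.py",  # local training script, packaged by the SDK
    container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",
)

job.run(replica_count=1, machine_type="n1-standard-4")
```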

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

Position Description: The candidate will have a strong background in .NET Core and Angular and experience working within the media domain. Experience with TM Forum Open APIs and GCP skills would be a significant advantage.

Responsibilities:
- Design, develop, and maintain high-quality .NET Core applications using best practices and industry standards.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Leverage the Angular framework to build robust and user-friendly web interfaces.
- Integrate with TM Forum Open APIs to facilitate interoperability with other systems.
- Utilize GCP services and technologies to optimize application performance and scalability.
- Provide technical guidance and mentorship to junior team members.
- Stay updated on the latest .NET Core, Angular, and media adtech trends and technologies.

Required skills and experience:
- Strong proficiency in .NET Core and the C# programming language.
- In-depth knowledge of the Angular framework and its ecosystem.
- Experience working with media adtech platforms or related domains.
- Understanding of TM Forum Open APIs and their applications.
- Proficiency in GCP services and technologies (e.g., Cloud Functions, App Engine, BigQuery).
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- Experience with versioning tools such as GitLab and TFS.

Preferred skills and experience:
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Knowledge of cloud-native development practices.
- Experience with microservices architecture.
- Familiarity with Agile methodologies (e.g., Scrum, Kanban).

Skills: Angular, .NET, .NET Remoting, .NET Reporting, SQLite, Telecommunications.

Posted 1 month ago

Apply

7.0 - 9.0 years

13 - 17 Lacs

Chennai

Work from Office

Key Responsibilities:
- Design and implement scalable and efficient full-stack solutions using Java and cloud technologies.
- Develop and maintain cloud-based solutions on Google Cloud Platform (GCP), utilizing services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow.
- Architect and implement complex data engineering solutions using GCP services.
- Collaborate with cross-functional teams to develop, deploy, and optimize cloud-based applications.
- Utilize Python for data engineering and automation tasks within the cloud environment.
- Ensure alignment with GCP architecture best practices and contribute to the design of high-performance systems.
- Lead and mentor junior developers, fostering a culture of learning and continuous improvement.

Required Skills:
- Full-stack development (7+ years): Strong expertise in full-stack Java development with experience building and maintaining complex web applications.
- Google Cloud Platform (GCP): Hands-on experience with GCP services like BigQuery, Astronomer, Terraform, Airflow, and Dataflow, and with GCP architecture.
- Python: Proficiency in Python for automation and data engineering tasks.
- Cloud architecture: Solid understanding of GCP architecture principles and best practices.
- Strong problem-solving skills and the ability to work in a dynamic, fast-paced environment.
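As a sketch of the Airflow/Cloud Composer orchestration this role mentions, a minimal DAG that schedules a BigQuery job daily; the DAG id and stored procedure are hypothetical:

```python
# Sketch: a minimal Cloud Composer (Airflow) DAG that runs a BigQuery job daily.
# DAG id and the called stored procedure are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": "CALL `my-project.sales.sp_daily_rollup`()",
                "useLegacySql": False,
            }
        },
    )
```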

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Nagpur

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.
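This posting (and the identical ones below) describes classic batch ETL into a cloud warehouse. A minimal pandas-to-BigQuery sketch, with hypothetical paths and table names:

```python
# Sketch of a small batch ETL step: extract a CSV, transform it with pandas,
# and load it into BigQuery. Paths and table names are hypothetical;
# reading gs:// paths with pandas requires the gcsfs package, and
# load_table_from_dataframe requires pyarrow.
import pandas as pd
from google.cloud import bigquery

# Extract
df = pd.read_csv("gs://my-bucket/raw/transactions.csv")

# Transform: fill missing amounts and derive a month column
df["amount"] = df["amount"].fillna(0)
df["order_month"] = pd.to_datetime(df["order_date"]).dt.to_period("M").astype(str)

# Load
client = bigquery.Client()
client.load_table_from_dataframe(df, "my-project.warehouse.transactions").result()
```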

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Guwahati

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kochi

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kanpur

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Chennai

Work from Office

What you'll be doing: The Wireless Solution Train supports critical network functions and services for 4G/5G wireless applications. We are looking for a dynamic and collaborative individual who will contribute to the growth and evolution of Next Gen OSS for network systems.
- Planning, designing, developing, coding, and testing software systems or applications for software enhancements and new products; revising and refining as required.
- Implementing changes and new features in a manner that promotes efficient, reusable, and performant code.
- Participating in product feature implementation, both independently and in cooperation with the team.
- Maintaining and improving existing code with pride of ownership.
- Leading medium-to-large-scale projects with minimal direction.
- Designing, developing, and maintaining data pipelines using GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Dataproc.
- Implementing and managing data ingestion processes from various sources (e.g., databases, APIs, streaming platforms).

What we're looking for... You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Four or more years of relevant work experience.
- Experience in Python and PySpark/Flink.
- Experience with the product Agile model (POD) and a product mindset.
- GCP experience with BigQuery, Spanner, and Looker.
- Experience with Gen AI solutions and tools.

Even better if you have one or more of the following:
- Master's degree in a related field.
- Any relevant certification.
- Excellent communication and collaboration skills.
- Experience developing and maintaining data quality checks and monitoring systems to ensure data accuracy and integrity.
- Experience optimizing data pipelines for performance, scalability, and cost-effectiveness.
- Experience collaborating with data scientists and analysts to understand data requirements and provide data solutions.
- Experience building and maintaining Looker dashboards and reports for data visualization and analysis.
- Staying up to date with the latest technologies and best practices in cloud data engineering (GCP preferred).
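As an example of the Pub/Sub ingestion named here, a minimal publisher sketch; the project, topic, and payload are hypothetical:

```python
# Sketch: publishing an ingestion event to Pub/Sub with the Python client.
# Project, topic, and payload are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "network-events")

future = publisher.publish(
    topic_path,
    data=b'{"cell_id": "A17", "kpi": "throughput", "value": 112.4}',
    source="oss-collector",  # extra kwargs become string message attributes
)
print("Published message ID:", future.result())
```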

Posted 1 month ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Gurugram

Remote

Dear Candidate, greetings from A2Z HR Consultants.

Base job location: Gurgaon
Shift timings: Dubai time zone (general shifts)
Number of working days: 5
Mode of work: Remote
Salary range: up to 25 LPA fixed
Role: Digital Analytics

Job Description: We are seeking an experienced and detail-oriented App Analytics Implementation Specialist to join our team. The ideal candidate will be responsible for ensuring the accuracy and effectiveness of analytics implementations for mobile applications. This includes performing quality assurance (QA) on Firebase Analytics app events, Adjust SDK integration, and GA4 reporting. You will work closely with cross-functional teams to ensure data integrity, validate event tracking, and support reporting initiatives.

Responsibilities:

Firebase Analytics app events QA:
- Test and verify the integration of Firebase Analytics within mobile applications.
- Ensure proper tracking of app events, including user interactions, sessions, and in-app behavior.
- Collect Firebase Analytics logs from Android Studio logcat and Xcode from build branches shared by developers.
- Validate each event and its associated parameters to ensure data accuracy and completeness.
- Perform troubleshooting and ensure accurate event data collection for reporting.
- Work with developers and product teams to resolve discrepancies and ensure correct event firing across all devices.

Adjust implementation QA:
- Conduct thorough QA for Adjust SDK implementation across mobile platforms (iOS, Android).
- Verify and test in-app events, user attribution, and campaign tracking in Adjust.
- Troubleshoot Adjust integrations and resolve any tracking issues.
- Collaborate with marketing and analytics teams to ensure proper attribution data collection for advertising campaigns.

GA4 reporting QA:
- Test the Google Analytics 4 (GA4) setup and ensure accurate tracking of web/app data.
- Validate GA4 custom events, eCommerce tracking, and user properties.
- Maintain a clear understanding of GA4 user properties, items parameters, and event parameters.
- Ensure that tracking aligns with reporting requirements and provides accurate, actionable data.
- Collaborate with stakeholders to define reporting requirements and ensure GA4 dashboards are accurate.
- Troubleshoot and resolve data discrepancies in GA4 and ensure proper reporting of app metrics.

Data layer creation:
- Create and maintain a data layer for app analytics based on new designs and requirements from the UX team.
- Ensure detailed tracking of each call-to-action (CTA) and eCommerce event, aligning the data layer with product and user-experience goals.
- Collaborate with UX, product, and development teams to ensure that tracking reflects the latest app design changes and meets analytics needs.

Data validation:
- Ensure analytics QA with high accuracy to maintain parity across all platforms: Android, iOS, and Web.
- Perform data validation across GA4, Adjust, and internal databases to ensure consistency and accuracy across all analytics tools.
- Ensure proper alignment of event data and user interactions between platforms to guarantee reliable cross-platform analytics.

MarTech tools and marketing analytics:
- Knowledge of MarTech tool integrations and marketing analytics to support campaign tracking and reporting.
- Collaborate with marketing teams to ensure accurate data flow from external platforms (e.g., advertising networks, Braze CRM, Meta, TikTok, Criteo, The Trade Desk) into the analytics tools.

Collaboration and communication:
- Work closely with the development, marketing, and analytics teams to ensure alignment on tracking needs.
- Participate in sprint planning and help define test cases for tracking events and KPIs.
- Provide insights and recommendations to improve tracking efficiency and accuracy.
- Document QA test cases, issues, and resolutions effectively.

Required skills and qualifications:
- Proven experience with Firebase Analytics and mobile app event tracking.
- Experience collecting Firebase Analytics logs from Android Studio logcat and Xcode from build branches, and validating events and their parameters.
- Strong knowledge of Adjust SDK implementation and troubleshooting.
- Hands-on experience with Google Analytics 4 (GA4) and generating custom reports.
- Clear understanding of GA4 user properties, items parameters, and event parameters.
- Experience creating and maintaining a data layer for app analytics based on UX team designs and new app features.
- Strong expertise in analytics QA across multiple platforms (Android, iOS, Web) to ensure parity and consistency of data.
- Data validation experience across GA4, Adjust, and BigQuery to ensure cross-platform consistency and accuracy.
- Knowledge of MarTech tool integrations and marketing analytics to support cross-channel attribution.
- Familiarity with mobile analytics platforms and performance metrics.
- Detail-oriented with strong QA skills, including creating test cases, performing tests, and documenting results.
- Ability to interpret data, identify trends, and communicate findings clearly.
- Strong problem-solving skills and the ability to troubleshoot analytics-related issues.
- Solid understanding of data flow between different platforms (Firebase, Adjust, GA4, etc.).
- Excellent communication and collaboration skills.

Preferred skills:
- Experience with Google Tag Manager (GTM) for mobile and web.
- Familiarity with BigQuery for advanced reporting and querying.

Educational requirements: Bachelor's degree in Computer Science, Information Technology, Marketing, or a related field, or equivalent work experience.

Interested candidates can share their CV at 9711831492 or gaurav.a2zhrconsultants@gmail.com.

Regards,
Gaurav Kumar
A2Z HR Consultants
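For a feel of the GA4/BigQuery validation work described above, a sketch that counts events and users in the GA4 BigQuery export tables; the dataset name is a hypothetical placeholder:

```python
# Sketch: validating GA4 event volumes from the BigQuery export
# (daily events_YYYYMMDD tables). Dataset name is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
        event_name,
        COUNT(*) AS events,
        COUNT(DISTINCT user_pseudo_id) AS users
    FROM `my-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
    GROUP BY event_name
    ORDER BY events DESC
"""

for row in client.query(query).result():
    print(row.event_name, row.events, row.users)
```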

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid

Location: Bengaluru (Hybrid) / Remote
Job type: Full-time
Experience required: 5+ years
Notice period: Immediate to 30 days

Role Overview: As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain.

Key Responsibilities:
- Data governance management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain.
- Collibra expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units.
- Data cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets.
- Data quality management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness.
- Data lineage implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards.
- GCP integration: Architect and implement seamless integrations between Collibra and Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment.
- Collaboration and stakeholder management: Collaborate with data engineers, analysts, business intelligence teams, and leadership to define and implement data governance best practices and standards.
- Training and support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions.
- Compliance and security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA).

Key Requirements:
- Proven expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality).
- Google Cloud Platform (GCP) proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance.
- Data quality and lineage expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation.
- Retail industry experience: Prior experience in data governance within the retail or eCommerce domain is a plus.
- Technical skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably GCP).
- Problem-solving and analytical skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization.
- Excellent communication skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or data governance are highly desirable.

Education & Experience:
- Bachelor's degree (B.Tech/BE) mandatory; master's optional.
- 5+ years of experience in data governance, with at least 3 years of specialized experience in Collibra and GCP.
- Experience working with data teams in a retail environment is a plus.
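For illustration, a sketch of querying Collibra assets over its REST API with the requests library; the endpoint path and parameters follow Collibra's public REST API v2 as an assumption, so verify against your instance's documentation:

```python
# Sketch: looking up assets via a Collibra instance's REST API.
# Base URL, credentials, and the /assets endpoint shape are assumptions;
# check your Collibra version's REST API reference before relying on them.
import requests

BASE = "https://collibra.example.com/rest/2.0"

resp = requests.get(
    f"{BASE}/assets",
    params={"name": "customer_orders", "nameMatchMode": "ANYWHERE"},
    auth=("svc_governance", "secret"),
    timeout=30,
)
resp.raise_for_status()

for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```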

Posted 1 month ago

Apply

7.0 - 12.0 years

0 - 3 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Required skills: Python, ETL, SQL, GCP, BigQuery, Pub/Sub, Airflow. Good to have: dbt, Data Mesh.

Job Title: Senior GCP Engineer - Data Mesh & Data Product Specialist

We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities:
* Design, build, and maintain ETL pipelines: Develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
* Data transformation with dbt: Use dbt to build modular, reusable transformation workflows that align with the principles of data products.
* Cloud expertise: Leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
* Data quality & governance: Enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
* Performance optimization: Continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
* Collaboration & ownership: Work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations. Take full ownership of your deliverables.
* Documentation & standards: Maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
* Troubleshooting & issue resolution: Proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience:
* 10+ years (Lead) or 7+ years (Developer) of hands-on experience designing and implementing ETL workflows in large-scale environments.
* Advanced proficiency in Python for scripting, automation, and data processing.
* Expert-level knowledge of SQL for querying large datasets, with performance optimization techniques.
* Deep experience with modern transformation tools like dbt in production environments.
* Strong expertise in cloud platforms like Google Cloud Platform (GCP), with hands-on experience using BigQuery.
* Familiarity with Data Mesh principles and distributed data architectures is mandatory.
* Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
* Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect: This is a demanding role that requires:
1. A proactive mindset: you take initiative without waiting for instructions.
2. A commitment to excellence: no shortcuts or compromises on quality.
3. Accountability: you own your work end-to-end and deliver on time.
4. Attention to detail: precision matters; mistakes are not acceptable.
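dbt transformations themselves are SQL models; for wiring dbt into a Python-based pipeline like the one described here, one option is dbt-core's programmatic entry point (available in dbt-core 1.5+). A minimal sketch; the model selector and target name are hypothetical:

```python
# Sketch: invoking dbt programmatically (dbt-core >= 1.5) from a pipeline
# wrapper. The model selector and target name are hypothetical.
from dbt.cli.main import dbtRunner

runner = dbtRunner()
result = runner.invoke([
    "run",
    "--select", "marts.orders_daily",
    "--target", "prod",
])

if not result.success:
    raise RuntimeError(f"dbt run failed: {result.exception}")
```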

Posted 1 month ago

Apply

10.0 - 14.0 years

10 - 16 Lacs

Pune

Work from Office

Role Overview: The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities:
- Lead the design and implementation of GCP-based data architectures and pipelines.
- Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Provide technical leadership and mentorship to a team of data engineers.
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals.
- Ensure best practices in data security, governance, and compliance.
- Troubleshoot and resolve complex technical issues in GCP data environments.
- Stay updated on the latest GCP technologies and industry trends.

Key Technical Skills & Responsibilities:
- Overall 10+ years of experience with GCP and data warehousing concepts: coding, reviewing, testing, and debugging.
- Experience as an architect on GCP implementation or migration data projects.
- Understanding of data lakes and data lake architectures, and of best practices for storing, loading, and retrieving data from data lakes.
- Experience developing and maintaining pipelines on the GCP platform; understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
- Working knowledge and/or experience with Google Data Studio, Looker, and other visualization tools.
- Working knowledge of Hadoop and Python/Java would be an added advantage.
- Experience designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries.
- Any experience in a NoSQL environment is a plus.
- Must be good with Python and PySpark for data pipeline building.
- Must have experience working with streaming data sources and Kafka.
- GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions.

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Extensive experience with GCP data services and tools.
- GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect).
- Experience with machine learning and AI integration in GCP environments.
- Strong understanding of data modeling, ETL/ELT processes, and cloud integration.
- Proven leadership experience managing technical teams.
- Excellent problem-solving and communication skills.
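As a sketch of the streaming pipeline work listed above, a minimal Dataflow-style Apache Beam pipeline in Python that reads from Pub/Sub and appends to BigQuery; the subscription and table are hypothetical, and the destination table is assumed to already exist:

```python
# Sketch: a streaming Beam pipeline (runnable on Dataflow) that reads JSON
# messages from Pub/Sub and appends them to an existing BigQuery table.
# Subscription and table names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # add project/region/runner flags to deploy

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "Parse" >> beam.Map(json.loads)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```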

Posted 1 month ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Kolkata

Work from Office

Key Responsibilities:
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers.
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy.
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability.
- Define and enforce data governance, data cataloging, and metadata management best practices.
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.
- Mentor junior architects and data engineers, guiding them on design best practices and technology standards.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.

Skills & Qualifications:
- 3+ years of experience in data architecture, data engineering, or enterprise data platform roles.
- 3+ years of hands-on experience with Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, and Data Catalog).
- 3+ years of experience designing and implementing Snowflake-based data solutions.
- Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).
- Proficiency in Python, SQL, and orchestration tools like Airflow / Cloud Composer.
- Experience in data modeling (3NF, star, and snowflake schemas) and in designing data marts and warehouses.
- Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments.
- Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.
- Excellent communication and stakeholder management skills.
- GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Preferred Qualifications:
- Experience working with hybrid or multi-cloud data strategies.
- Exposure to ML/AI pipelines and support for data science workflows.
- Prior experience leading architecture reviews, PoCs, and technology roadmaps.
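Since the role spans both GCP and Snowflake, a small reconciliation-style sketch using the official Snowflake Python connector; the account, credentials, and object names are hypothetical:

```python
# Sketch: counting rows in a Snowflake table with the official Python
# connector, e.g. to reconcile against a BigQuery-side count.
# Account, credentials, and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="etl_user",
    password="secret",
    warehouse="ANALYTICS_WH",
    database="WAREHOUSE",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM orders_daily")
    print("Snowflake row count:", cur.fetchone()[0])
finally:
    conn.close()
```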

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Chandigarh

Work from Office

Key Responsibilities Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. Support data ingestion, transformation, and storage processes for structured and unstructured datasets. Participate in performance tuning and optimization of existing data workflows. Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery. Document code, processes, and architecture for reproducibility and future reference. Debug issues in data pipelines and contribute to their resolution.

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Jhagadia

Work from Office

Develop, implement, and maintain the organization's MIS to ensure accurate and real-time reporting of key business metrics. Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management. Ensure data accuracy, integrity, and consistency across all reporting platforms. Design and maintain dashboards for business performance monitoring. Analyze data trends and provide insights to management for informed decision-making.

Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs. Review and update cost standards, analyzing variances and taking corrective actions when necessary. Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals. Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions. Conduct regular audits to ensure compliance with costing policies and industry standards. Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions. Support financial forecasting and budgeting processes by providing relevant data and insights. Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries. Review profitability analysis reports and identify areas for cost optimization.

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

Key Responsibilities Assist in building and maintaining data pipelines on GCP using services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc. Support data ingestion, transformation, and storage processes for structured and unstructured datasets. Participate in performance tuning and optimization of existing data workflows. Collaborate with data analysts, engineers, and stakeholders to ensure reliable data delivery. Document code, processes, and architecture for reproducibility and future reference. Debug issues in data pipelines and contribute to their resolution.

Posted 1 month ago

Apply