
1192 BigQuery Jobs - Page 44

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 8 years

12 - 15 Lacs

Mumbai

Work from Office

Responsibilities:
- Develop and maintain data pipelines using GCP.
- Write and optimize queries in BigQuery.
- Utilize Python for data processing tasks.
- Manage and maintain SQL Server databases.

Must-Have Skills:
- Experience with Google Cloud Platform (GCP).
- Proficiency in BigQuery query writing.
- Strong Python programming skills.
- Expertise in SQL Server.

Good to Have:
- Knowledge of MLOps practices.
- Experience with Vertex AI.
- Background in data science.
- Familiarity with any data visualization tool.
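As a rough illustration of the BigQuery-plus-Python work this role describes, here is a minimal sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

# Assumes application-default credentials are configured (gcloud auth).
client = bigquery.Client(project="my-project")  # hypothetical project ID

# Parameterized queries avoid string interpolation and keep the query cacheable.
sql = """
    SELECT store_id, SUM(amount) AS total_sales
    FROM `my-project.sales.transactions`   -- hypothetical table
    WHERE sale_date >= @start_date
    GROUP BY store_id
    ORDER BY total_sales DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.store_id, row.total_sales)
```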

Posted 2 months ago

Apply

2 - 4 years

3 - 8 Lacs

Kolkata

Remote

Data Quality Analyst

Experience: 2-4 years | Salary: Competitive | Preferred notice period: within 30 days | Shift: 10:00 AM to 7:00 PM IST | Opportunity type: Remote | Placement type: Permanent (Note: this is a requirement for one of Uplers' clients)

Must-have skills: Data Validation, BigQuery, SQL, communication skills
Good-to-have skills: data visualisation, Power BI, Tableau

Forbes Advisor (one of Uplers' clients) is looking for a Data Quality Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview

Short-term objectives: We know the importance data validation can play in creating better reporting for our business, and we have identified areas where we want you to make an impact within the first 3 months:
- Push 40% of partners through the ingestion validation process
- Push 40% of partners through the mapping validation process

Data Team Culture: Our team requires four areas of focus from every team member; we use these focus areas to guide our decision making and career growth. The top three from each area are:

Mastery:
- Demonstrate expertise in a relevant tool (e.g., GA, Tableau) or code language (e.g., SQL)
- Think about the wider impact and value of decisions
- Understand and anticipate the need for scalability, stability, and security

Communication:
- Provide clear, actionable feedback from peer reviews
- Communicate effectively with wider teams and stakeholders
- Proactively share knowledge every day

Ownership:
- Lead complex initiatives that drive challenging goals
- Create and push forward cross-cutting concerns between teams
- Demonstrate consistently sound judgement

Behaviours:
- Challenge yourself and others through questioning, assessing business benefits, and understanding the cost of delay
- Own your workload and decisions; show leadership to others
- Innovate to find new solutions or improve existing ways of working; push yourself to learn every day

Responsibilities:
- Report directly to the Senior Business Analyst and work closely with the Data & Revenue Operations functions to support key deliverables
- Reconcile affiliate network revenue by vertical and publisher brand at a monthly level
- Where discrepancies exist, investigate to isolate whether specific days, products, providers, or commission values are responsible
- Validate new tickets going onto the Data Engineering JIRA board to ensure requests going into Data Engineering are complete, accurate, and as descriptive as possible
- Update investigation results in JIRA tickets and save all outputs in the mapping Google Sheet
- Use the Postman API and webhooks to pull revenue data from partner portals and verify it against the partner portals and BigQuery
- Monitor API failures, rate limits, and response inconsistencies impacting revenue ingestion
- As necessary, seek revenue clarifications from the vertical's RevOps team member and clarify JIRA commentary for data engineers
- Understand requirements, goals, and priorities, and communicate progress towards data goals to stakeholders
- Ensure outputs are on time and on target

Required competencies:
- At least two (2) years of data quality analysis experience
- A strong understanding of SQL and how it can be used to validate data (experience with BigQuery is a plus)
- An understanding of large relational databases and how to navigate these datasets to find the data required
- Ability to communicate data to non-technical audiences through reports and visualisations
- Strong interpersonal and communication skills
- Comfortable working remotely and collaboratively with teammates across multiple geographies and time zones

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly wellness reimbursement program to promote health and well-being
- Monthly office commutation reimbursement program
- Paid paternity and maternity leave

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal
2. Upload an updated resume and complete the screening form
3. Increase your chances of being shortlisted and meet the client for the interview

About Our Client: Forbes Advisor is a global platform dedicated to helping consumers make the best financial choices for their individual lives.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: there are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
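To give a flavour of the SQL-based revenue validation described above, here is a minimal, hypothetical sketch that reconciles a month of partner revenue in BigQuery against a figure pulled from a partner portal API; the endpoint, table, and field names are all assumptions, not the client's actual systems.

```python
import requests
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical partner-portal endpoint returning monthly revenue totals.
resp = requests.get(
    "https://api.example-partner.com/v1/revenue",
    params={"month": "2024-06"},
    timeout=30,
)
resp.raise_for_status()
portal_total = resp.json()["total_revenue"]

# Aggregate the same month from the ingested BigQuery data (table name assumed).
sql = """
    SELECT SUM(commission_value) AS bq_total
    FROM `my-project.affiliate.network_revenue`
    WHERE DATE_TRUNC(revenue_date, MONTH) = DATE '2024-06-01'
"""
bq_total = next(iter(client.query(sql).result())).bq_total or 0.0

# Flag discrepancies beyond a small tolerance for further investigation.
if abs(portal_total - bq_total) > 0.01 * portal_total:
    print(f"Mismatch: portal={portal_total}, bigquery={bq_total}")
else:
    print("Revenue reconciles within tolerance.")
```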

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Patna

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.
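As a rough sketch of the transfer step at the heart of such an Oracle-to-BigQuery migration, the following assumes a Dataproc cluster with the Oracle JDBC driver and the spark-bigquery connector available; the connection details, table names, and bucket are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-to-bq").getOrCreate()

# Read a source table from Oracle over JDBC (driver jar must be on the classpath).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # hypothetical
    .option("dbtable", "SALES.ORDERS")
    .option("user", "etl_user")
    .option("password", "********")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

# Write to BigQuery via the spark-bigquery connector, staging through GCS.
(
    orders.write.format("bigquery")
    .option("table", "my-project.sales.orders")        # hypothetical target
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```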

Posted 2 months ago

Apply

6 - 11 years

10 - 18 Lacs

Noida, Indore

Work from Office

Role & Responsibilities

Job Description: We are looking for a GCP Data Engineer and SQL Programmer with good working experience in PostgreSQL and PL/SQL programming, and the following technical skills:
- PL/SQL and PostgreSQL programming; ability to write complex SQL queries and stored procedures
- Working experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL
- Working experience on Cloud SQL/AlloyDB
- Working experience tuning autovacuum in PostgreSQL
- Working experience tuning AlloyDB/PostgreSQL for better performance
- Working experience with BigQuery, Firestore, Memorystore, Spanner, and bare-metal setup for PostgreSQL
- Experience with the GCP Database Migration Service
- Working experience with MongoDB
- Working experience with Cloud Dataflow
- Working experience with database disaster recovery, database job scheduling, and database logging techniques
- Knowledge of OLTP and OLAP

Desirable: GCP Database Engineer certification

Other skills:
- Out-of-the-box thinking and problem-solving skills
- Ability to make technology choices (build vs. buy)
- Performance management (profiling, benchmarking, testing, fixing)
- Enterprise architecture
- Project management/delivery capability/quality mindset
- Scope management; planning (phasing, critical path, risk identification); schedule management/estimation
- Leadership skills

Other soft skills: learning ability, innovation, initiative

Preferred candidate profile - Roles & Responsibilities:
- Develop, construct, test, and maintain data architectures
- Migrate enterprise Oracle databases from on-premises to GCP
- Tune autovacuum in PostgreSQL and tune AlloyDB/PostgreSQL for better performance
- Performance-tune PostgreSQL stored procedure code and queries
- Convert Oracle stored procedures and queries to PostgreSQL stored procedures and queries
- Create a hybrid data store with data warehouse and NoSQL GCP solutions alongside PostgreSQL
- Migrate table data from Oracle to AlloyDB
- Lead the database team

Mandatory skills: PostgreSQL, PL/SQL, BigQuery
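For flavour, here is a minimal sketch of the per-table autovacuum tuning this role mentions, applied through Python with psycopg2; the DSN and table name are hypothetical, and the threshold values are illustrative rather than recommendations.

```python
import psycopg2

# Hypothetical DSN for a Cloud SQL / AlloyDB PostgreSQL instance.
conn = psycopg2.connect("host=10.0.0.5 dbname=appdb user=dba password=********")

# Large, frequently updated tables often need more aggressive autovacuum
# settings than the global defaults; the values below are purely illustrative.
with conn, conn.cursor() as cur:
    cur.execute("""
        ALTER TABLE sales.orders SET (
            autovacuum_vacuum_scale_factor  = 0.02,
            autovacuum_vacuum_threshold     = 1000,
            autovacuum_analyze_scale_factor = 0.01
        )
    """)
conn.close()
```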

Posted 2 months ago

Apply

5 - 10 years

0 - 3 Lacs

Hyderabad

Hybrid

Job Profile

We are seeking a Senior Data Engineer with proven expertise in designing and maintaining scalable, efficient, and reliable data pipelines. The ideal candidate should have strong proficiency in SQL, dbt, BigQuery, Python, and Airflow, along with a solid foundation in data warehousing principles. In this role, you will be instrumental in managing and optimizing data workflows, ensuring high data quality, and supporting data-driven decision-making across the organization. Experience with Oracle ERP systems and knowledge of data migration to a data warehouse environment will be considered a valuable advantage.

Years of Experience: 5 to 10 years. Shift Timings: 1 PM to 10 PM IST.

Skill Set:
- SQL: Advanced proficiency in writing optimized queries, working with complex joins, CTEs, window functions, etc.
- dbt (Data Build Tool): Experience modelling data with dbt, managing data transformations, and maintaining project structure.
- Python: Proficient in writing data processing scripts and building Airflow DAGs using Python.
- BigQuery: Strong experience with GCP's BigQuery, including dataset optimization, partitioning, and query cost management.
- Apache Airflow: Experience building and managing DAGs, handling dependencies, scheduling jobs, and error handling.
- Data Warehousing Concepts: Strong grasp of ETL/ELT, dimensional modelling (star/snowflake), fact/dimension tables, slowly changing dimensions, etc.
- Version Control: Familiarity with Git/GitHub for code collaboration and deployment.
- Cloud Platforms: Working knowledge of Google Cloud Platform (GCP).

Job Description - Roles & Responsibilities:
- Design, build, and maintain robust ETL/ELT data pipelines using Python, Airflow, and dbt.
- Develop and manage dbt models to enable efficient, reusable, and well-documented data transformations.
- Collaborate with stakeholders to gather data requirements and design data marts comprising fact and dimension tables in a well-structured star schema.
- Manage and optimize data models and transformation logic in BigQuery, ensuring high performance and cost-efficiency.
- Implement and uphold robust data quality checks, logging, and alerting mechanisms to ensure reliable data delivery.
- Maintain the BigQuery data warehouse, including routine optimizations and updates.
- Enhance and support the data warehouse architecture, including the use of star/snowflake schemas, partitioning strategies, and data mart structures.
- Proactively monitor and troubleshoot production pipelines to minimize downtime and ensure data accuracy.
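A minimal sketch of the Airflow-orchestrating-dbt pattern this role centres on, assuming a dbt project checked out at a hypothetical path; the DAG ID, schedule, and target are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Orchestrate dbt transformations and tests on a daily schedule.
with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # run daily at 02:00
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/airflow/dbt_project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/airflow/dbt_project && dbt test --target prod",
    )
    # Only run data quality tests after the models build successfully.
    dbt_run >> dbt_test
```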

Posted 2 months ago

Apply

3 - 8 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA. Experience: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days.

Key Responsibilities & Skillsets:

Common skillsets:
- 3+ years of experience in analytics with SAS, PySpark, Python, Spark, SQL, and associated data engineering jobs
- Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy, and pandas
- Excellent communication and presentation skills
- Experience managing Python code bases and collaborating with customers on model evolution
- Good knowledge of database management and Hadoop/Spark, SQL, Hive, and Python (expertise)
- Superior analytical and problem-solving skills
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
- Good communication skills for client interaction

Data management skillsets:
- Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures)
- Strong ability to translate functional specifications/requirements into technical requirements
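As an illustration of the big-data transformation work listed above, here is a minimal PySpark sketch; the bucket paths, column names, and aggregation are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("big-data-transform").getOrCreate()

# Hypothetical input: raw clickstream events stored as Parquet on GCS.
events = spark.read.parquet("gs://my-bucket/raw/events/")

# Typical transform: filter bad rows, derive a date, aggregate per user/day.
daily = (
    events.filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("session_id").alias("sessions"),
    )
)

# Partitioned output keeps downstream reads cheap.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://my-bucket/curated/daily_user_activity/"
)
```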

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Pune

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Lucknow

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Bengaluru

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

5 - 9 years

18 - 20 Lacs

Noida

Work from Office

Experience: 5-7 years. Location: Noida. Position: Data Analyst.

Technical skills:
- Strong proficiency in Python (pandas, NumPy, Matplotlib, Seaborn, etc.)
- Advanced SQL skills for querying large datasets
- Experience with data visualization tools (Looker, etc.)
- Hands-on experience with data wrangling, cleansing, and transformation
- Familiarity with ETL processes and working with structured/unstructured data

Analytical & business skills:
- Strong problem-solving skills with the ability to interpret complex data
- Business acumen to connect data insights with strategic decision-making
- Excellent communication and presentation skills

Preferred (nice to have):
- Knowledge of machine learning concepts (scikit-learn, TensorFlow, etc.)
- Exposure to cloud platforms (GCP) for data processing
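A minimal pandas sketch of the wrangling-and-summarising work this role describes; the file, columns, and business summary are hypothetical.

```python
import pandas as pd

# Hypothetical raw export: one row per order, messy types and duplicates.
orders = pd.read_csv("orders_export.csv", parse_dates=["order_date"])

# Standard wrangling steps: dedupe, coerce types, drop unusable rows.
orders = (
    orders.drop_duplicates(subset="order_id")
    .assign(amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"))
    .dropna(subset=["amount"])
)

# Simple business summary: monthly revenue and order counts.
monthly = orders.groupby(orders["order_date"].dt.to_period("M")).agg(
    revenue=("amount", "sum"),
    order_count=("order_id", "count"),
)
print(monthly.head())
```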

Posted 2 months ago

Apply

4 - 9 years

15 - 19 Lacs

Pune

Work from Office

About the Role

Job Title: Technical Specialist, GCP Developer. Location: Pune, India.

Role Description: This role is for an engineer responsible for the design, development, and unit testing of software applications, ensuring that good-quality, maintainable, scalable, and high-performing software is delivered to users in an Agile development environment. The candidate should come from a strong technological background, with good working experience in Spark and GCP technology, should be hands-on and able to work independently with minimal technical/tool guidance, and should be able to technically guide and mentor junior resources in the team. As a developer you will bring extensive design and development skills to reinforce the group of developers within the team, and will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities:
- Design and discuss your own solutions for addressing user stories and tasks
- Develop, unit-test, integrate, deploy, maintain, and improve software
- Perform peer code reviews
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum, sprint planning, retrospectives
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management)
- Collaborate with other team members to achieve the sprint objectives
- Report progress and update Agile team management tools (JIRA/Confluence)
- Manage individual task priorities and deliverables
- Take responsibility for the quality of the solutions you provide
- Contribute to planning and continuous improvement activities; support the PO, ITAO, developers, and Scrum Master

Your skills and experience:
- Engineer with good development experience on Google Cloud Platform for at least 4 years
- Hands-on experience with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions
- Experience in the setup, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts
- Experience with development platforms (OpenShift/Kubernetes/Docker) configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR
- Good knowledge of core SDLC processes and tools such as HP ALM, Jira, and ServiceNow
- Knowledge of working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, and JSON
- Strong analytical skills
- Proficient communication skills; fluent in English (written/verbal)
- Ability to work in virtual teams and matrixed organizations
- Excellent team player; open-minded and willing to learn business and technology; keeps pace with technical innovation; understands the relevant business area
- Ability to share information and transfer knowledge and expertise to team members

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 2 months ago

Apply

7 - 12 years

32 - 37 Lacs

Jaipur

Work from Office

About the Role

Job Title: Analytics Senior Analyst. Location: Jaipur, India. Corporate Title: AVP.

Role Description: You will be joining the Data & Analytics team as part of the Global Procurement division. The team's purpose is to:
- Deliver trusted third-party data and insights to unlock commercial value and identify risk
- Develop and execute the Global Procurement Data Strategy
- Deliver the golden source of Global Procurement data, analysis, and insights via dbPi, our Tableau self-service platform, leveraging automation and scalability on Google Cloud
- Provide data and analytical support to Global Procurement's prioritised change initiatives

The team leverages several tools and innovative techniques to create value-added insights for stakeholders across end-to-end procurement processes including, but not limited to, third-party risk, contracting, spend, and performance management.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities:
- Develop a sound understanding of the various tools and the entire suite of analytical offerings on dbPi, the standard procurement insights platform.
- Support our stakeholders by understanding their requirements, challenging appropriately where needed to scope the problem, conceptualizing the optimum approach, and developing solutions using appropriate tools and visualisation techniques.
- Lead small project teams in delivering the analytics change book of work, keeping internal and external stakeholders updated on project progress while driving forward key change topics.
- For requests that are more complex in nature, connect the dots and arrive at a solution by establishing linkages across different systems and processes.
- Take end-to-end responsibility for any change request to an existing analytical product or dashboard, from understanding the requirement through development, testing, and QA to delivery to stakeholders' satisfaction.
- Deliver automation and clean-data initiatives, such as deployment of a rules engine, data quality checks enabled through Google Cloud, and bringing procurement data sources into GCP.
- Act as a thought partner to the Chief Information Office's deployment of Google Cloud Platform to migrate the data infrastructure layer (ETL processes) currently managed by the Analytics team.
- Work in close collaboration with cross-functional teams, including developers, system administrators, and business stakeholders.

Your skills and experience:
- A degree (or equivalent) in Engineering, Mathematics, Statistics, or Sciences from an accredited college or university, to develop analytical solutions that support stakeholders' strategic decision making; any professional certification in Advanced Analytics, Data Visualisation, or a Data Science-related domain is a plus.
- A natural curiosity for numbers and strong quantitative and logical thinking skills; you ensure results are of high data quality and accuracy.
- Working experience on Google Cloud, including working with cross-functional teams to enable data source and process migration to GCP; working experience with SQL.
- Adaptability to emerging technologies, such as leveraging machine learning and AI to drive innovation.
- Procurement experience (useful though not essential) across vendor management, sourcing, risk, contracts, and purchasing, preferably within a global and complex environment.
- The aptitude to understand stakeholders' requirements, identify relevant data sources, integrate data, perform analysis, and interpret the results by identifying trends and patterns.
- Enjoyment of the problem-solving process: thinking outside the box and breaking a problem down into its constituent parts with a view to developing an end-to-end solution.
- Enthusiasm to work in the data analytics domain and strive for continuous learning and improvement of your technical and soft skills.
- Working knowledge of analytical tools such as Tableau, databases, Alteryx, Pentaho, Looker, and BigQuery, in order to work with large datasets and derive insights for decision making.
- Enjoyment of teamwork, with convincing English language skills that make it easy for you to work in an international environment and with global, virtual teams.

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 2 months ago

Apply

10 - 15 years

25 - 40 Lacs

Pune

Work from Office

Introduction: We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Solution Architect. As a Solution Architect, you will play a pivotal role in designing and implementing cloud-based solutions for our team using GCP. The ideal candidate will have a deep understanding of cloud architecture, a proven track record of delivering cloud-based solutions, and experience with GCP technologies. You will work closely with technical teams and clients to ensure the successful deployment and optimization of cloud solutions.

Responsibilities:
- Lead the design and architecture of GCP-based solutions, ensuring scalability, security, performance, and cost-efficiency.
- Collaborate with business stakeholders, engineering teams, and clients to understand technical requirements and translate them into cloud-based solutions.
- Provide thought leadership and strategic guidance on cloud technologies, best practices, and industry trends.
- Design and implement cloud-native applications, data platforms, and microservices on GCP.
- Ensure cloud solutions are aligned with clients' business goals and requirements, with a focus on automation and optimization.
- Conduct cloud assessments, identifying areas for improvement, migration strategies, and cost-saving opportunities.
- Oversee and manage the implementation of GCP solutions, ensuring seamless deployment and operational success.
- Create detailed documentation of cloud architecture, deployment processes, and operational guidelines.
- Engage in pre-sales activities, including solution design, proofs of concept (PoCs), and presenting GCP solutions to clients.
- Ensure compliance with security and regulatory requirements in the cloud environment.

Requirements:
- At least 2+ years of experience as a Cloud Architect or in a similar role, with strong expertise in Google Cloud Platform.
- In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud Functions, and networking.
- Experience with infrastructure-as-code tools such as Terraform.
- Strong understanding of cloud security, identity management, and compliance frameworks (e.g., GDPR, HIPAA).
- Hands-on experience with GCP networking, IAM, and logging/monitoring tools (Cloud Monitoring, Cloud Logging).
- Strong experience in designing and deploying highly available, fault-tolerant, and scalable solutions.
- Proficiency in programming languages like Java and Golang.
- Experience with containerization and orchestration technologies such as Docker, Kubernetes, and GKE (Google Kubernetes Engine).
- Experience in cloud cost management and optimization using GCP tools.

Thanks,
Pratap

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Mumbai

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Surat

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Kanpur

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Hyderabad

Work from Office

Role: Data Engineer

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

5 - 10 years

20 - 35 Lacs

Bengaluru

Hybrid

GCP Data Engineer
- 5+ years of experience
- GCP (all services needed for big data pipelines: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine), Spark, Scala, Hadoop
- Python, PySpark, orchestration (Airflow), SQL
- CI/CD (experience with deployment pipelines)
- Architecture and design of cloud-based big data pipelines, and exposure to any ETL tool

Nice to have: GCP certifications
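A minimal Apache Beam sketch of the Pub/Sub-to-BigQuery streaming pattern such pipelines often follow (runnable locally or on Dataflow); the subscription, table, and schema are hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resources; pass --runner=DataflowRunner plus project/region
# options to run this on Dataflow instead of locally.
SUBSCRIPTION = "projects/my-project/subscriptions/events-sub"
TABLE = "my-project:analytics.events"

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            TABLE,
            schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```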

Posted 2 months ago

Apply

3 - 5 years

9 - 18 Lacs

Chennai

Hybrid

Role & Responsibilities

As part of the Tonik Data Analytics team, the candidate will be responsible for:
- Developing and enhancing the Data Lake Framework for ingestion of data from various sources and providing reports to downstream systems/users.
- Working with the different Tonik IT teams to implement the data requirements.
- Developing data pipelines, based on requirements, on key GCP services (BigQuery, Airflow, GCS) using Python/SQL.
- Ensuring the proper GCP standards are followed for the implementation.

Preferred candidate profile:
- Hands-on experience in at least one programming language (Python, Java).
- Working experience on a cloud platform (AWS/GCP/Azure).
- Experience with design patterns and designing scalable solutions.
- Hands-on experience in SQL, and able to convert requirements into standard, scalable, cost-effective, and better-performing solutions.
- Aware of data engineering principles and data pipeline techniques.
- Communicates effectively with stakeholders and other team members.
- Has implemented end-to-end automated pipelines to ingest data in different formats (CSV, fixed-width, JSON, etc.).
- Works closely with various business teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
- Works with Agile delivery and implementation approaches.
- Hands-on experience with the following key GCP offerings or equivalent AWS services: Composer/Airflow, BigQuery, Dataflow, Cloud Storage, Apache Beam, Dataproc.
- Good understanding of security-related configuration in BigQuery and how to handle data securely while sharing.
- Nice to have: exposure to or knowledge of ML and ML pipelines.
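As a sketch of the ingestion step such a data-lake framework automates, here is a minimal GCS-to-BigQuery CSV load using the Python client; the bucket, file, and table names are hypothetical, and schema autodetection is used only to keep the example short.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical source file and destination table.
uri = "gs://my-bucket/landing/customers_2024-06-01.csv"
table_id = "my-project.lake.customers"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # header row
    autodetect=True,              # infer schema; real pipelines pin a schema
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # wait for completion
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}.")
```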

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Nagpur

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Chennai

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

2 - 6 years

4 - 8 Lacs

Bengaluru

Work from Office

We are seeking a Data Analyst: an experienced analytics professional who is passionate about unleashing the power of data to inform decision-making, achieve strategic objectives, and support the hiring and retention of world-class talent. As an integral part of the team, the Data Analyst will use analytical skills and business acumen to turn data into knowledge and drive business success.

Requirements and qualifications:
- Minimum 5+ years of experience in data analytics, BI analytics, or BI engineering, preferably in a globally recognized organization.
- Expert-level proficiency in writing complex SQL queries to create views in data warehouses like Snowflake, Redshift, SQL Server, Oracle, or BigQuery.
- Advanced skills in designing and developing data models and dashboards using BI tools such as Tableau, Domo, Looker, etc.
- Intermediate-level skills with analytical tools such as Excel, Google Sheets, or Power BI (e.g., complex formulas, lookups, pivot tables).
- Bachelor's or advanced degree in Data Analytics, Data Science, Information Systems, Computer Science, Applied Mathematics, Statistics, or a related field.
- Willingness to collaborate with internal team members and stakeholders across different time zones.

Roles and responsibilities:
- Perform advanced analytics such as cohort analysis, scenario analysis, time series analysis, and predictive analysis, and create powerful data visualizations.
- Clearly articulate assumptions, data interpretations, and analytical findings in a variety of formats for different audiences.
- Design data models that define the structure and relationships of data elements across various sources based on reporting and analytics needs.
- Collaborate with BI engineers to build scalable, high-performing reporting and analytics solutions.
- Write SQL queries to extract and manipulate data from warehouses such as Snowflake.
- Conduct data validation and quality assurance checks to ensure high standards of data integrity.
- Investigate and resolve data issues, including root cause analysis when inconsistencies arise in reporting.
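To illustrate the cohort analysis mentioned above, here is a minimal pandas sketch that builds a retention matrix; the input file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical extract: one row per user per active month.
activity = pd.read_csv(
    "user_activity.csv", parse_dates=["signup_date", "active_month"]
)

# Cohort = signup month; age = whole months since signup.
activity["cohort"] = activity["signup_date"].dt.to_period("M")
activity["age"] = (
    activity["active_month"].dt.to_period("M") - activity["cohort"]
).apply(lambda offset: offset.n)

# Retention matrix: distinct active users per cohort at each age.
retention = (
    activity.groupby(["cohort", "age"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)

# Normalise by cohort size (age 0) to get retention rates.
rates = retention.div(retention[0], axis=0).round(3)
print(rates.head())
```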

Posted 2 months ago

Apply

8 - 12 years

18 - 25 Lacs

Noida

Remote

Job Title: Data Modeler, Enterprise Data Platform (BigQuery, Retail Domain)
Location: Remote. Duration: Contract. Timing: US EST hours.

We have an immediate need for an IT Data Modeler for the Enterprise Data Platform (BigQuery, retail domain), reporting to the Enterprise Data Platform (EDP) Architecture Team.

Job Summary: We are seeking an experienced Data Modeler to support the Enterprise Data Platform (EDP) initiative, focusing on building and optimizing curated data assets on Google BigQuery. This role requires expertise in data modeling, strong knowledge of retail data, and an ability to collaborate with data engineers, business analysts, and architects to create scalable and high-performing data structures.

Key Responsibilities:

1. Data Modeling & Curated Layer Design
- Design logical, conceptual, and physical data models for EDP's curated layer in BigQuery.
- Develop fact and dimension tables, ensuring adherence to dimensional modeling best practices (Kimball methodology).
- Optimize data models for performance, scalability, and query efficiency in a cloud-native environment.
- Work closely with data engineers to translate models into efficient BigQuery implementations (partitioning, clustering, materialized views).

2. Data Standardization & Governance
- Define and maintain data definitions, relationships, and business rules for curated assets.
- Ensure data integrity, consistency, and governance across datasets.
- Work with data governance teams to align models with enterprise data standards and metadata management policies.

3. Collaboration with Business & Technical Teams
- Engage with business analysts and product teams to understand data needs, ensuring models align with business requirements.
- Partner with data engineers and architects to implement best practices for data ingestion and transformation.
- Support BI and analytics teams by ensuring curated models are optimized for downstream consumption (e.g., Looker, Tableau, Power BI, AI/ML models, APIs).

Required Qualifications:
- 7+ years of experience in data modeling and architecture on cloud data platforms (BigQuery preferred).
- Expertise in dimensional modeling (Kimball), data vault, and normalization/denormalization techniques.
- Strong SQL skills, with hands-on experience in BigQuery performance tuning (partitioning, clustering, query optimization).
- Understanding of retail data models (e.g., sales, inventory, pricing, supply chain, customer analytics).
- Experience working with data engineering teams to implement models in ETL/ELT pipelines.
- Familiarity with data governance, metadata management, and data cataloging.
- Excellent communication skills and the ability to translate business needs into structured data models.
- Prior experience working in retail or e-commerce data environments.
- Exposure to modern data architectures (data lakehouse, event-driven data processing).
- Familiarity with the GCP ecosystem (Dataflow, Pub/Sub, Cloud Storage) and BigQuery security best practices.

Thanks & Regards,
Abhinav Krishna Srivastava
Mob: +91-9667680709
Email: asrivastava@fcsltd.com
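As a sketch of the partitioned, clustered fact-table design this role calls for, here is hypothetical BigQuery DDL for a curated retail sales fact, executed through the Python client; the dataset and column names are illustrative.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Partitioning on the date column and clustering on common filter columns
# keeps scans (and therefore cost) down for typical analytical queries.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.curated.fact_sales` (
    sale_date   DATE    NOT NULL,
    store_id    INT64   NOT NULL,
    product_id  INT64   NOT NULL,
    customer_id INT64,
    quantity    INT64,
    net_amount  NUMERIC
)
PARTITION BY sale_date
CLUSTER BY store_id, product_id
"""

client.query(ddl).result()
print("fact_sales created (or already present).")
```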

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Ahmedabad

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply

7 - 10 years

8 - 14 Lacs

Kolkata

Work from Office

We are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data engineering, with specific expertise in Oracle to BigQuery data warehouse migration and modernization. This role requires proficiency in various data engineering tools and technologies, including BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.

Key Responsibilities:
- Oracle to BigQuery Migration: Lead the migration and modernization of data warehouses from Oracle to BigQuery, ensuring seamless data transfer and integration.
- Data Engineering: Utilize BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem to design, develop, and maintain scalable data pipelines and workflows.
- Data Management: Ensure data integrity, accuracy, and consistency across various systems and platforms.
- SQL Writing: Write and optimize complex SQL queries to extract, transform, and load data efficiently.
- Collaboration: Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions that meet business needs.
- Performance Optimization: Monitor and optimize data processing performance to ensure efficient and reliable data operations.

Skills and Qualifications:
- Proven experience as a Data Engineer or similar role.
- Strong knowledge of Oracle to BigQuery data warehouse migration and modernization.
- Proficiency in BigQuery, Dataproc, GCS, PySpark, Airflow, and the Hadoop ecosystem.
- In-depth knowledge of Oracle DB and PL/SQL.
- Excellent SQL writing skills.
- Strong analytical and problem-solving abilities.
- Ability to work collaboratively with cross-functional teams.
- Excellent communication and interpersonal skills.

Preferred Qualifications:
- Experience with other data management tools and technologies.
- Knowledge of cloud-based data solutions.
- Certification in data engineering or related fields.

Posted 2 months ago

Apply