
41 BigQuery Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 4.0 years

4 - 9 Lacs

Chennai

Work from Office

Source: Naukri

Role & responsibilities: React, JavaScript, Application Support, BigQuery, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services

Posted 9 hours ago

Apply

4.0 - 9.0 years

14 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

Interested candidates: share your updated CV to dikshith.nalapatla@motivitylabs.com

Job Title: GCP Data Engineer

Overview: We are looking for a skilled GCP Data Engineer with 4 to 5 years of hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will design and develop data pipelines, participate in architectural discussions, and implement data solutions in a cloud environment.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
- Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services.
- Participate in architectural discussions, conduct system analysis, and propose scalable, future-proof solutions aligned with business requirements.
- Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows.
- Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
- Develop and implement data and semantic interoperability specifications.
- Work closely with business teams to define and scope requirements.
- Analyze existing systems to identify appropriate data sources and drive continuous improvement.
- Implement and continuously enhance automation for data ingestion and data transformation.
- Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
- Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.

Skills and Qualifications:
- 4-5 years of overall hands-on experience as a Data Engineer, with at least 3 years of direct GCP data engineering experience.
- Strong SQL and Python development skills (mandatory).
- Solid data engineering experience with distributed architectures, ETL/ELT, and big data technologies.
- Demonstrated knowledge of and experience with Google Cloud BigQuery (must-have); experience with Dataproc and Dataflow is highly preferred.
- Strong understanding of serverless data warehousing on GCP and familiarity with DW/BI modeling frameworks.
- Extensive SQL experience across various database platforms; experience in data mapping and data modeling.
- Familiarity with data analytics tools and best practices.
- Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX shell.
- Practical experience with Google Cloud services including, but not limited to: BigQuery, Bigtable, Cloud Dataflow, Cloud Dataproc, Cloud Storage, Pub/Sub, Cloud Functions, Cloud Composer, Cloud Spanner, and Cloud SQL.
- Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark).
- Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
- Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
- GCP Data Engineer certification is highly preferred.
Interested candidates: share your updated CV to dikshith.nalapatla@motivitylabs.com, along with the following details:
- Total Experience:
- Relevant Experience:
- Current Role / Skillset:
- Current CTC (Fixed / Variables, if any / Bonus, if any):
- Payroll Company (Name):
- Client Company (Name):
- Expected CTC:
- Official Notice Period:
- Serving Notice (Yes / No):
- CTC of offer in hand:
- Last Working Day (in current organization):
- Location of the offer in hand:
- Willing to work from office:

Note: 5 days' work from office is mandatory.
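For illustration only (not part of the posting): a minimal sketch of the kind of BigQuery ingestion step this role describes, using the google-cloud-bigquery Python client. The project, dataset, and bucket names are hypothetical placeholders, and the snippet assumes Application Default Credentials are configured.

    # Minimal sketch: load a CSV from Cloud Storage into BigQuery.
    # Assumes google-cloud-bigquery is installed and ADC credentials are set;
    # project/dataset/bucket names below are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # hypothetical project

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,          # skip the header row
        autodetect=True,              # infer the schema from the file
        write_disposition="WRITE_APPEND",
    )

    load_job = client.load_table_from_uri(
        "gs://example-bucket/orders/2024-01-01.csv",   # hypothetical source file
        "example-project.analytics.orders",            # hypothetical target table
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish
    table = client.get_table("example-project.analytics.orders")
    print(f"Loaded; table now has {table.num_rows} rows")

Batch loads from Cloud Storage are the usual first choice for this pattern, since each load job is atomic per table and can be retried safely.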

Posted 2 days ago

Apply


3.0 - 5.0 years

5 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Primary skills: GCP, Python (coding mandatory), SQL coding, BigQuery, Dataflow, Airflow (including DAGs), Kafka.

Bachelor's degree or equivalent experience in Computer Science or a related field. Notice period: immediate or 15 days.

Job Description:
- 3+ years' experience as a software engineer or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps.
- Experience in at least one programming language (Python strongly preferred, with 2+ years of solid coding; Java also acceptable).
- Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
- Experience designing and interacting with APIs (REST/GraphQL).
- Experience working with cloud platforms such as GCP, including BigQuery.
- Experience with DevOps processes/tooling (CI/CD, GitHub Actions), version control systems (Git strongly preferred), and remote software development environments.
- Strong analytical, problem-solving, and interpersonal skills; a hunger to learn; and the ability to operate self-guided in a fast-paced, rapidly changing environment.
- Must have: experience in pipeline orchestration (e.g., Airflow; see the sketch below).
- Must have: at least 1 year of experience with Dataflow.
- Preferred: experience with infrastructure-as-code frameworks (Terraform).
- Preferred: experience with big data tools such as Spark/PySpark.
- Preferred: experience using or deploying MLOps systems/tooling (e.g., MLflow).
- Preferred: experience in an additional programming language (JavaScript, Java, etc.).
- Preferred: experience with data science/machine learning technologies.
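As a rough illustration of the Airflow-DAG requirement above (not part of the posting): a minimal daily DAG using the stock PythonOperator. The DAG id, schedule, and task body are hypothetical placeholders.

    # Minimal sketch of an Airflow DAG: one daily pipeline step implemented
    # with PythonOperator. All names here are hypothetical placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_load():
        """Placeholder for the actual ingestion logic (e.g., GCS -> BigQuery)."""
        print("extracting and loading...")

    with DAG(
        dag_id="daily_ingest_example",          # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                      # Airflow 2.4+ style; use schedule_interval on older versions
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        ingest = PythonOperator(
            task_id="extract_and_load",
            python_callable=extract_and_load,
        )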

Posted 6 days ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Faridabad

Work from Office

Source: Naukri

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, dbt, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate has excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using dbt and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- dbt (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Skills: GCP, Erwin, dbt, SQL, data modeling, DBeaver, BigQuery, query optimization, Dataflow, Cloud Storage, Snowflake, Erwin Data Modeler, data pipelines, data transformation, data modeler
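Purely as an illustration of the windowed SQL this posting emphasizes (not part of the posting): a minimal sketch using snowflake-connector-python to run a ROW_NUMBER() query with Snowflake's QUALIFY clause. All connection parameters and table/column names are hypothetical.

    # Minimal sketch: a window-function query against Snowflake via
    # snowflake-connector-python. Everything below is a hypothetical placeholder.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",   # hypothetical account
        user="example_user",
        password="...",              # use a secrets manager in practice
        warehouse="ANALYTICS_WH",
        database="SALES",
        schema="PUBLIC",
    )

    query = """
        SELECT customer_id,
               order_date,
               amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY order_date DESC
               ) AS order_rank          -- latest order per customer = rank 1
        FROM orders
        QUALIFY order_rank = 1          -- Snowflake-specific filter on window results
    """

    with conn.cursor() as cur:
        for row in cur.execute(query):
            print(row)
    conn.close()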

Posted 6 days ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Chennai

Hybrid

Source: Naukri

Job Description: We are seeking a skilled and proactive GCP Data Engineer with strong experience in Python and SQL to build and manage scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will work closely with data analysts, architects, and business teams to enable data-driven decision-making.

Key Responsibilities:
- Design and develop robust data pipelines and ETL/ELT processes using GCP services.
- Write efficient Python scripts for data processing, transformation, and automation.
- Develop complex SQL queries for data extraction, aggregation, and analysis.
- Work with tools like BigQuery, Cloud Storage, Cloud Functions, and Pub/Sub.
- Ensure high data quality, integrity, and governance across datasets.
- Optimize data workflows for performance and scalability.
- Collaborate with cross-functional teams to define and deliver data solutions.
- Monitor, troubleshoot, and resolve issues in data workflows and pipelines.

Required Skills:
- Hands-on experience with Google Cloud Platform (GCP).
- Strong programming skills in Python for data engineering tasks.
- Advanced proficiency in SQL for working with large datasets.
- Experience with BigQuery, Cloud Storage, and Cloud Functions.
- Familiarity with streaming and batch processing (e.g., Pub/Sub, Dataflow, or Dataproc).
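As an illustration of the Pub/Sub skill listed above (not part of the posting): a minimal publisher sketch with the google-cloud-pubsub client. The project id, topic, and event payload are hypothetical.

    # Minimal sketch: publish one JSON event to a Pub/Sub topic.
    # Project and topic names are hypothetical placeholders.
    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("example-project", "orders-events")  # hypothetical

    event = {"order_id": 42, "status": "CREATED"}
    future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
    print(f"Published message id: {future.result()}")  # blocks until acknowledged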

Posted 6 days ago

Apply

3.0 - 6.0 years

5 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

Job Summary: We are seeking a highly skilled MLOps Engineer to design, deploy, and manage machine learning pipelines on Google Cloud Platform (GCP). In this role, you will be responsible for automating ML workflows, optimizing model deployment, ensuring model reliability, and implementing CI/CD pipelines for ML systems. You will work with Vertex AI, Kubernetes (GKE), BigQuery, and Terraform to build scalable and cost-efficient ML infrastructure. The ideal candidate has a good understanding of ML algorithms and experience with model monitoring, performance optimization, Looker dashboards, and infrastructure as code (IaC), ensuring ML models are production-ready, reliable, and continuously improving. You will interact with multiple technical teams, including architects and business stakeholders, to develop state-of-the-art machine learning systems that create value for the business.

Responsibilities:
- Manage the deployment and maintenance of machine learning models in production environments, ensuring seamless integration with existing systems.
- Monitor model performance using metrics such as accuracy, precision, recall, and F1 score, and address issues like performance degradation, drift, or bias (see the sketch below).
- Troubleshoot and resolve problems, maintain documentation, and manage model versions for audit and rollback.
- Analyze monitoring data to preemptively identify potential issues and provide regular performance reports to stakeholders.
- Optimize queries and pipelines; modernize applications whenever required.

Qualifications:
- Expertise in programming languages like Python and SQL.
- Solid understanding of MLOps best practices and concepts for deploying enterprise-level ML systems.
- Understanding of machine learning concepts, models, and algorithms, including traditional regression, clustering models, and neural networks (including deep learning, transformers, etc.).
- Understanding of model evaluation metrics and model monitoring tools and practices.
- Experience with GCP tools like BigQuery ML, Vertex AI Pipelines (Kubeflow Pipelines on GCP), model versioning and registry, Cloud Monitoring, Kubernetes, etc.
- Solid oral and written communication skills and the ability to prepare detailed technical documentation for new and existing applications.
- Strong ownership and collaborative qualities; takes initiative to identify and drive opportunities for improvement and process streamlining.
- Bachelor's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.

Bonus Qualifications:
- Experience with Azure MLOps; familiarity with cloud billing.
- Experience setting up or supporting NLP, Gen AI, or LLM applications with MLOps features.
- Experience working in an Agile environment and an understanding of Lean Agile principles.
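To make the monitoring bullet concrete (not part of the posting): a minimal sketch computing the four metrics it names, using scikit-learn as an assumed tooling choice; the label arrays are hypothetical stand-ins for production data.

    # Minimal sketch: the model-monitoring metrics named in the posting
    # (accuracy, precision, recall, F1), computed with scikit-learn.
    # Label arrays are hypothetical stand-ins for production data.
    from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

    y_true = [1, 0, 1, 1, 0, 1, 0, 0]  # ground-truth labels (hypothetical)
    y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # model predictions (hypothetical)

    print("accuracy :", accuracy_score(y_true, y_pred))
    print("precision:", precision_score(y_true, y_pred))
    print("recall   :", recall_score(y_true, y_pred))
    print("f1       :", f1_score(y_true, y_pred))
    # A sustained drop in any of these versus the training baseline is a
    # common trigger for drift investigation or retraining.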

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Pune, Maharashtra, India

On-site

Source: Foundit

We are looking for an experienced GCP and BigQuery professional to join our team in India. The ideal candidate will have a solid background in data engineering and analytics, with expertise in designing scalable data solutions on the Google Cloud Platform.

Responsibilities:
- Design, develop, and maintain scalable data pipelines using Google Cloud Platform (GCP) and BigQuery.
- Analyze and interpret complex datasets to provide actionable insights to stakeholders.
- Collaborate with data engineers and analysts to optimize data storage and retrieval processes.
- Implement data quality checks and ensure the accuracy of data in BigQuery.
- Create and manage dashboards and reports to visualize data findings effectively.
- Stay up to date with the latest developments in GCP and BigQuery to leverage new features for business needs.

Skills and Qualifications:
- 6-10 years of experience in data engineering or analytics with a focus on Google Cloud Platform (GCP) and BigQuery.
- Strong proficiency in SQL and experience with BigQuery optimizations.
- Experience with ETL tools and data pipeline orchestration (e.g., Apache Airflow, Cloud Dataflow).
- Familiarity with programming languages such as Python or Java for data manipulation and analysis.
- Knowledge of data modeling, data warehousing concepts, and best practices.
- Understanding of data privacy and security standards in cloud environments.
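For illustration only (not part of the posting): a minimal parameterized BigQuery query from Python, the kind of building block such pipelines and reports are made of. The table and parameter values are hypothetical.

    # Minimal sketch: a parameterized BigQuery query from Python.
    # Table and parameter values are hypothetical placeholders; assumes
    # google-cloud-bigquery and Application Default Credentials.
    import datetime

    from google.cloud import bigquery

    client = bigquery.Client()

    sql = """
        SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM `example-project.analytics.orders`     -- hypothetical table
        WHERE order_date >= @since
        GROUP BY region
        ORDER BY revenue DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("since", "DATE", datetime.date(2024, 1, 1)),
        ]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row.region, row.orders, row.revenue)

Query parameters, rather than string formatting, keep such jobs safe against injection and make query plans easier to reuse.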

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

Source: Foundit

Skills: Google Cloud Platform (GCS, Dataproc, BigQuery, Dataflow); programming languages: Java; scripting languages such as Python, Shell Script, SQL.

- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies.
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Gurugram

Work from Office

Source: Naukri

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain ETL processes using SSIS to extract data from various sources.
- Develop complex SQL queries to retrieve data from relational databases such as SQL Server.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Troubleshoot issues related to ETL process failures or performance problems.
- Ensure compliance with security standards by implementing the Denodo Platform for data masking.

Desired Candidate Profile:
- 4-9 years of experience in ETL development, with expertise in Agile methodology.
- Strong understanding of .NET Core, C#, Microsoft Azure, BigQuery, SSRS (SQL Server Reporting Services), and SSIS (SQL Server Integration Services).
- B.Tech/B.E. degree in any specialization.
- Hands-on database experience (MS SQL Server, BigQuery, Denodo).
- Experience with .NET / Visual Studio (SSRS, SSIS, and ETL packages).
- Good knowledge of requirements elicitation, from workshops and meetings through to Agile board epics/features/stories.

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Pune, Chennai, Bengaluru

Work from Office

Source: Naukri

As a GCP data engineer, the colleague designs scalable data architectures on Google Cloud Platform using services like BigQuery and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to keep data accurate and reliable. Security is a priority, with measures enforced for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key to understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures. Responsibilities extend to optimizing data pipelines and queries for performance, troubleshooting issues, and proactively monitoring data accuracy. Continuous learning is emphasized to stay current on GCP features and industry best practices.

Experience:
- Proficiency in programming languages: Python, PySpark.
- Expertise in data processing frameworks: Apache Beam (Dataflow); see the sketch below.
- Active experience with GCP tools and technologies such as BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, dbt, etc.
- Data engineering skillset using Python and SQL.
- Experience in ETL (Extract, Transform, Load) processes.
- Knowledge of DevOps tools like Jenkins, GitHub, and Terraform is desirable; good knowledge of Kafka (batch/streaming).
- Understanding of data models and experience performing ETL design and build, and database replication using message-based CDC.
- Familiarity with cloud storage solutions.
- Strong problem-solving abilities in data engineering challenges.
- Understanding of data security and scalability.
- Proficiency in relevant tools like Apache Airflow.

Desirables:
- Knowledge of data modelling and database design.
- Good understanding of cloud security.
- Proven practical experience using the Google Cloud SDK to deliver APIs and automation.
- Crafting continuous integration and continuous delivery/deployment tooling pipelines (Jenkins/Spinnaker).
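The sketch referenced above (not part of the posting): a minimal Apache Beam pipeline of the kind that runs on Dataflow. Bucket paths are hypothetical; the DirectRunner is used for local testing, with DataflowRunner the drop-in choice for GCP.

    # Minimal sketch of an Apache Beam pipeline: read text from GCS,
    # transform, write back. All paths are hypothetical placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(runner="DirectRunner")  # local test run

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")     # hypothetical
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "FilterValid" >> beam.Filter(lambda fields: len(fields) == 3)          # keep well-formed rows
            | "Format" >> beam.Map(lambda fields: ",".join(fields))
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/result")   # hypothetical
        )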

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Qualification & Experience: Minimum of 8 years of experience as a Data Scientist/Engineer, with demonstrated expertise in data engineering and cloud computing technologies.

Technical Responsibilities:
- Excellent proficiency in Python, with a strong focus on developing advanced skills.
- Extensive exposure to NLP and image processing concepts.
- Proficient in version control systems like Git.
- In-depth understanding of Azure deployments.
- Expertise in OCR, ML model training, and transfer learning.
- Experience working with unstructured data formats such as PDFs, DOCX, and images.
- Strong familiarity with data science best practices and the ML lifecycle.
- Strong experience with data pipeline development, ETL processes, and data engineering tools such as Apache Airflow, PySpark, or Databricks.
- Familiarity with cloud computing platforms like Azure, AWS, or GCP, including services like Azure Data Factory, S3, Lambda, and BigQuery.

Tool Exposure: Advanced understanding of and hands-on experience with Git, Azure, Python, R programming, and data engineering tools such as Snowflake, Databricks, or PySpark.
Data mining, cleaning, and engineering: Leading the identification and merging of relevant data sources, ensuring data quality, and resolving data inconsistencies.
Cloud Solutions Architecture: Designing and deploying scalable data engineering workflows on cloud platforms such as Azure, AWS, or GCP.
Data Analysis: Executing complex analyses against business requirements using appropriate tools and technologies.
Software Development: Leading the development of reusable, version-controlled code under minimal supervision.
Big Data Processing: Developing solutions to handle large-scale data processing using tools like Hadoop, Spark, or Databricks.

Principal Duties & Key Responsibilities:
- Leading data extraction from multiple sources, including PDFs, images, databases, and APIs.
- Driving optical character recognition (OCR) processes to digitize data from images.
- Applying advanced natural language processing (NLP) techniques to understand complex data.
- Developing and implementing highly accurate statistical models and data engineering pipelines to support critical business decisions, and continuously monitoring their performance.
- Designing and managing scalable cloud-based data architectures using Azure, AWS, or GCP services.
- Collaborating closely with business domain experts to identify and drive key business value drivers.
- Documenting model design choices, algorithm selection processes, and dependencies.
- Collaborating effectively in cross-functional teams within the CoE and across the organization.
- Proactively seeking opportunities to contribute beyond assigned tasks.

Required Competencies:
- Exceptional communication and interpersonal skills.
- Proficiency in Microsoft Office 365 applications.
- Ability to work independently, demonstrate initiative, and provide strategic guidance.
- Strong networking, communication, and people skills.
- Outstanding organizational skills, with the ability to work independently and as part of a team.
- Excellent technical writing skills.
- Effective problem-solving abilities.
- Flexibility and adaptability, including working flexible hours as required.

Key Competencies / Values:
- Client Focus: Tailoring skills and understanding client needs to deliver exceptional results.
- Excellence: Striving for excellence as defined by clients, delivering high-quality work.
- Trust: Building and retaining trust with clients, colleagues, and partners.
- Teamwork: Collaborating effectively to achieve collective success.
- Responsibility: Taking ownership of performance and safety, ensuring accountability.
- People: Creating an inclusive environment that fosters individual growth and development.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Educational Requirements: Bachelor of Engineering.
Service Line: Data & Analytics Unit.

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role is to ensure effective design, development, validation, and support activities, so that our clients are satisfied with high levels of service in the technology domain. You will gather requirements and specifications, understand client needs in detail, and translate them into system requirements. You will play a key role in the overall estimation of work requirements, providing accurate project-estimation information to Technology Leads and Project Managers. You will be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture.
- Understanding of performance engineering.
- Knowledge of quality processes and estimation techniques.
- Basic understanding of the project domain.
- Ability to translate functional/nonfunctional requirements into system requirements.
- Ability to design and code complex programs.
- Ability to write test cases and scenarios based on the specifications.
- Good understanding of SDLC and agile methodologies.
- Awareness of the latest technologies and trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.

Technical and Professional Requirements:
- Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Preferred Skills:
- Technology -> Cloud Platform -> GCP Data Analytics -> Looker
- Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Posted 1 week ago

Apply

3.0 - 6.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

Requirements:
- Minimum of 3 years of hands-on experience.
- Python/ML, Hadoop, Spark: minimum of 2 years of experience.
- At least 3 years of prior experience as a Data Analyst.
- Detail-oriented, with structured thinking and an analytical mindset.
- Proven analytic skills, including data analysis, data validation, and technical writing.
- Strong proficiency in SQL and Excel.
- Experience with BigQuery is mandatory.
- Knowledge of Python and machine learning algorithms is a plus.
- Excellent communication skills, with the ability to be precise and clear.
- Learning ability: able to quickly learn and adapt to new analytic tools and technologies.

Key Responsibilities (Data Analysis):
- Perform comprehensive data analysis using SQL, Excel, and BigQuery (see the sketch below).
- Validate data integrity and ensure accuracy across datasets.
- Develop detailed reports and dashboards that provide actionable insights.
- Create and deliver presentations to stakeholders with clear and concise findings.
- Document queries, reports, and analytical processes clearly and accurately.
- Leverage Python/ML for advanced data analysis and model development.
- Utilize Hadoop and Spark for handling and processing large datasets.
- Work closely with cross-functional teams to understand data requirements and provide analytical support.
- Communicate findings effectively and offer recommendations based on data analysis.

Education: Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
Experience: Minimum of 3 years as a Data Analyst with a strong focus on SQL, Excel, and BigQuery.
Technical Skills: Proficiency in SQL, Excel, and BigQuery; experience with Python, ML, Hadoop, and Spark is preferred.
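The sketch referenced above (not part of the posting): pulling a BigQuery result into a pandas DataFrame for further analysis and reporting. The table name is a hypothetical placeholder.

    # Minimal sketch of an analyst workflow: BigQuery query -> pandas DataFrame.
    # The table name is a hypothetical placeholder; to_dataframe() requires the
    # pandas/db-dtypes extras of the google-cloud-bigquery client.
    from google.cloud import bigquery

    client = bigquery.Client()
    df = client.query(
        """
        SELECT product, COUNT(*) AS orders
        FROM `example-project.analytics.orders`   -- hypothetical table
        GROUP BY product
        """
    ).to_dataframe()

    print(df.describe())
    print(df.sort_values("orders", ascending=False).head(10))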

Posted 1 week ago

Apply

8.0 - 10.0 years

20 - 30 Lacs

Chennai

Hybrid

Source: Naukri

Role & responsibilities:
- GCP services: BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis (Memorystore), Airflow, Cloud Storage.
- 2+ years with data transfer utilities.
- 2+ years with Git or any other version control tool.
- 2+ years with Confluent Kafka.
- 1+ years of experience in API development.
- 2+ years in an Agile framework.
- 4+ years of strong experience in Python and PySpark development.
- 4+ years of shell scripting to develop ad hoc jobs for importing/exporting data.

Preferred candidate profile: Python, Dataflow, Dataproc, GCP Cloud Run, Dataform, Agile software development, BigQuery, Terraform, Data Fusion, Cloud SQL, GCP, Kafka, Java.

Please note that only immediate joiners will be considered for this position due to project urgency.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune

Work from Office

Source: Naukri

7+ years of experience, with hands-on experience in Java and GCP; shell scripting and Python knowledge a plus. In-depth knowledge of Java and Spring Boot. Experience with GCP Dataflow, Bigtable, BigQuery, etc.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Chennai

Work from Office

Source: Naukri

Skills: Data Warehousing, Python, BigQuery, SQL, Airflow, GCP.
Skills preferred: analytical and problem-solving ability.

Experience required:
- 5+ years of experience in data warehousing, with at least 2 years of experience in GCP BigQuery.
- Able to build complex SQL queries in BigQuery.

Required candidate profile:
- Must have: Python, data warehousing, and BigQuery experience (around 6 years).
- Candidates take a HackerRank test as the first round.
- Hybrid (11 days per month work from office), Chennai.
- Notice period: immediate to 15 days. CTC up to 22 LPA.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Source: Naukri

GCP Data Engineer: BigQuery, SQL, Python, Talend ETL programming on GCP or any cloud technology.

Job Description:
- Experienced GCP data engineer with BigQuery, SQL, Python, and Talend ETL programming on GCP or any other cloud technology.
- Good experience building pipelines of GCP components to load data into BigQuery and Cloud Storage buckets.
- Excellent data analysis skills.
- Good written and oral communication skills.
- Self-motivated and able to work independently.
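For illustration only (not part of the posting): a minimal sketch of the Cloud Storage staging step in the load pattern this posting describes, using the google-cloud-storage client. Bucket and object names are hypothetical.

    # Minimal sketch: stage a local file in a Cloud Storage bucket, the usual
    # first hop before a BigQuery load job. Names are hypothetical placeholders.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("example-staging-bucket")        # hypothetical bucket
    blob = bucket.blob("incoming/orders_2024-01-01.csv")    # destination object name
    blob.upload_from_filename("orders.csv")                 # local file to upload
    print(f"Staged gs://{bucket.name}/{blob.name}")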

Posted 2 weeks ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Hyderabad, Bengaluru

Work from Office

Source: Naukri

Responsibilities:
- Design and architect enterprise-scale data platforms, integrating diverse data sources and tools.
- Develop real-time and batch data pipelines to support analytics and machine learning.
- Define and enforce data governance strategies to ensure security, integrity, and compliance, and optimize data pipelines for high performance, scalability, and cost efficiency in cloud environments.
- Implement solutions for real-time streaming data (Kafka, AWS Kinesis, Apache Flink) and adopt DevOps/DataOps best practices (see the Kafka sketch below).

Required Skills:
- Strong experience designing scalable, distributed data systems, and programming skills (Python, Scala, Java), with expertise in Apache Spark, Hadoop, Flink, Kafka, and cloud platforms (AWS, Azure, GCP).
- Proficiency in data modeling, governance, warehousing (Snowflake, Redshift, BigQuery), and security/compliance standards (GDPR, HIPAA).
- Hands-on experience with CI/CD and infrastructure tooling (Terraform, CloudFormation, Airflow, Kubernetes) and data infrastructure monitoring (Prometheus, Grafana).

Nice to Have:
- Experience with graph databases, machine learning pipeline integration, real-time analytics, and IoT solutions.
- Contributions to open-source data engineering communities.
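The Kafka sketch referenced above (not part of the posting): a minimal consumer using kafka-python, an assumed client library; confluent-kafka would work equally well. Topic, group, and broker address are hypothetical.

    # Minimal sketch: consume JSON events from a Kafka topic.
    # Topic, group id, and broker address are hypothetical placeholders.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "orders-events",                          # hypothetical topic
        bootstrap_servers="localhost:9092",       # hypothetical broker
        group_id="analytics-loader",              # hypothetical consumer group
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Placeholder: route the event into the batch/real-time pipeline here.
        print(message.topic, message.partition, message.offset, event)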

Posted 3 weeks ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

Chennai

Work from Office

Source: Naukri

Skill & Experience:
- Strategic planning and direction; maintaining architecture principles, guidelines, and standards.
- Project & program management.
- Data warehousing, big data, and data analytics & data science for solutioning.
- Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
- Strong experience in big data modelling, design, architecting, and solutioning.
- Understands programming languages like SQL, Python, R, and Scala; good Python skills.
- Experience with data visualisation tools such as Google Data Studio or Power BI.
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing.
- Strong experience migrating production Hadoop clusters to Google Cloud.
- Experience designing and implementing solutions in the areas above, with strong command of GCP data components: BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, Data Fusion, etc.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and to interface directly with the customer.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered; you should understand the purpose and KPIs for which each data transformation was done.

Preferred technical and professional experience:
- Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 13 Lacs

Chennai

Work from Office

Source: Naukri

- 3+ years' experience as an engineer working in a GCP environment with its relevant tools/services: BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, pandas, PySpark, etc.).
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.); see the sketch below.
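The sketch referenced above (not part of the posting): ranking rows with a PySpark window function, mirroring the SQL window-function skill the posting asks for. Data and column names are hypothetical.

    # Minimal sketch: rank each customer's orders by date with a window
    # function and keep the most recent one. Data is a hypothetical placeholder.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("window-example").getOrCreate()

    df = spark.createDataFrame(
        [("a", "2024-01-01", 10.0), ("a", "2024-02-01", 15.0), ("b", "2024-01-15", 7.5)],
        ["customer_id", "order_date", "amount"],
    )

    w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())
    latest = (
        df.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)   # rn = 1 is the most recent order
          .drop("rn")
    )
    latest.show()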

Posted 3 weeks ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Chennai

Work from Office

Source: Naukri

Skills: Google Cloud Platform (GCS, Dataproc, BigQuery, Dataflow); programming languages: Java; scripting languages such as Python, Shell Script, SQL.

- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies.
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office

Source: Naukri

Skills: Google Cloud Platform (GCS, Dataproc, BigQuery, Dataflow); programming languages: Java; scripting languages such as Python, Shell Script, SQL.

- 5+ years of experience in IT application delivery, with proven experience in agile development methodologies.
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Source: Naukri

What you'll be doing...
We are looking for data engineers who can work with world-class team members to help drive our telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations.

As a Senior/Lead Data Engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a subject-matter expert on solutions and platforms, you will provide technical leadership to various projects on the data platform team. You are expected to have depth of knowledge in specified technological areas, including applicable processes, methodologies, standards, products, and frameworks.

Responsibilities:
- Driving the technical design of large-scale data platforms, utilizing modern and open-source technologies, in a hybrid cloud environment.
- Setting standards for data engineering functions; designing templates for the data management program that are scalable, repeatable, and simple.
- Building strong multi-functional relationships and being recognized as a data and analytics subject-matter expert by other teams.
- Collaborating across teams to select appropriate data sources and develop data extraction and business rule solutions.
- Sharing and incorporating industry best practices, using new and upcoming tools and technologies in data management and analytics.
- Organizing, planning, and developing solutions to sophisticated data management problem statements.
- Defining and documenting architecture, capturing and documenting non-functional (architectural) requirements, preparing estimates, and defining technical solutions for proposals (RFPs).
- Designing and developing reusable, scalable data models and data pipelines to suit business deliverables.
- Providing technical leadership to the project team across design-to-deployment activities: providing guidance, performing reviews, and preventing and resolving technical issues.
- Collaborating with the engineering, DevOps, and admin teams to ensure alignment with efficient design practices and to fix issues in dev, test, and production environments, keeping the infrastructure highly available and performing as expected.
- Designing, implementing, and deploying high-performance custom solutions.

Where you'll be working: in this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What we're looking for...
You are curious and passionate about data and truly believe in the high impact it can create for the business. People count on you for your expertise in data management in all phases of the software development cycle. You enjoy the challenge of solving complex data management problems and juggling priorities in a multifaceted, complex, and deadline-oriented environment. Building effective working relationships and collaborating with other technical teams across the organization comes naturally to you.

You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Six or more years of relevant work experience.
- Knowledge of information systems and their applications to data management processes.
- Experience performing detailed analysis of business problems and technical environments and designing solutions.
- Experience working with Google Cloud Platform and BigQuery.
- Experience working with big data technologies and utilities: Hadoop, Spark, Scala, Kafka, NiFi (a stream-processing sketch follows this list).
- Experience with relational SQL and NoSQL databases.
- Experience with data pipeline, workflow management, and governance tools.
- Experience with stream-processing systems.
- Experience with object-oriented/object-function scripting languages.
- Experience building data solutions for machine learning and artificial intelligence.
- Knowledge of data analytics and modeling tools.

Even better if you have one or more of the following:
- Knowledge of telecom and networks.
- Master's degree in Computer Science or a related field.
- Contributions to open-source data warehousing.
- Certifications in data warehousing/analytical solutioning.
- Certifications in GCP.
- Ability to clearly articulate the pros and cons of various technologies and platforms.
- Experience collaborating with multi-functional teams and managing partner expectations.
- Written and verbal communication skills.
- Ability to work in a fast-paced agile development environment.
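The stream-processing sketch referenced above (not part of the posting): a minimal Spark Structured Streaming job reading from Kafka and writing Parquet. Broker, topic, and paths are hypothetical, and the spark-sql-kafka connector package is assumed to be supplied at submit time.

    # Minimal sketch: Spark Structured Streaming from Kafka to Parquet.
    # Broker, topic, and sink paths are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
        .option("subscribe", "network-events")                # hypothetical topic
        .load()
        .select(F.col("value").cast("string").alias("raw"))   # raw message payload
    )

    query = (
        events.writeStream.format("parquet")
        .option("path", "/tmp/network-events")            # hypothetical sink path
        .option("checkpointLocation", "/tmp/checkpoints") # required for recovery
        .start()
    )
    query.awaitTermination()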

Posted 3 weeks ago

Apply