
1192 BigQuery Jobs - Page 42

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 9.0 years

13 - 16 Lacs

Chennai

Work from Office

Key Responsibilities:
- Develop and maintain scalable full-stack applications using Java, Spring Boot, and Angular for building rich UI screens and custom/reusable components.
- Design and implement cloud-based solutions leveraging Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage, Cloud Run, and PubSub.
- Manage and optimize CI/CD pipelines using Tekton to ensure smooth and efficient development workflows.
- Deploy and manage Google Cloud services using Terraform, ensuring infrastructure-as-code principles.
- Mentor and guide junior software engineers, fostering professional development and promoting systemic change across the development team.
- Collaborate with cross-functional teams to design, build, and maintain efficient, reusable, and reliable code.
- Drive best practices and improvements in software engineering processes, including coding standards, testing, and deployment strategies.

Required Skills:
- Java/Spring Boot (5+ years): In-depth experience in developing backend services and APIs using Java and Spring Boot.
- Angular (3+ years): Proven ability to build rich, dynamic user interfaces and custom/reusable components using Angular.
- Google Cloud Platform (2+ years): Hands-on experience with GCP services like BigQuery, Google Cloud Storage, Cloud Run, and PubSub.
- CI/CD Pipelines (2+ years): Experience with tools like Tekton for automating build and deployment processes.
- Terraform (1-2 years): Experience in deploying and managing GCP services using Terraform.
- J2EE (5+ years): Strong experience in Java Enterprise Edition for building large-scale applications.
- Experience mentoring and delivering organizational change within a software development team.

Posted 2 months ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid

Data Engineer (L3) || GCP Certified
Employment Type: Full-Time | Work Mode: In-office / Hybrid | Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch below).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Keywords: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Job Type: Full-time
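For illustration only: a minimal sketch of the kind of Airflow DAG the listing mentions for automating data workflows. It assumes an Airflow 2.x environment; the DAG name, schedule, and the extract/load callables are hypothetical placeholders, not part of the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull data from a source system (e.g., an on-prem database).
    print("extracting source data")


def load_to_warehouse(**context):
    # Placeholder: load the transformed data into the target warehouse (e.g., BigQuery).
    print("loading data into the warehouse")


with DAG(
    dag_id="daily_ingestion_pipeline",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract_task >> load_task                 # run extract before load
```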

Posted 2 months ago

Apply

2.0 - 4.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Role: Finance Controller Lead
Lead cross-functional global teams in developing finance strategies to support strategic alignment with the company's Business Operations and Corporate departments on company goals and initiatives. Manage financial goals that result in strong customer satisfaction, align with company strategy, and optimize costs and supplier relations. Influence senior leaders in setting direction for their functional areas by linking finance and business strategies to optimize business results.

Posted 2 months ago

Apply

1.0 - 4.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Job Area: Information Technology Group > IT Data Engineer

General Summary: We are looking for a savvy Data Engineer to join our analytics team. The candidate will be responsible for expanding and optimizing our data and data pipelines, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate has Python development experience and is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. We believe a candidate with a solid Software Engineering/Development background is a great fit; however, we also recognize that each candidate has a unique blend of skills. The Data Engineer will work with database architects, data analysts, and data scientists on data initiatives and will ensure optimal, consistent data delivery throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams. The right candidate will be excited by the prospect of optimizing data to support our next generation of products and data initiatives.

Responsibilities:
- Create and maintain optimal data pipelines; assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing for greater scalability, etc.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Perform ad hoc analysis and report QA testing.
- Follow Agile/Scrum development methodologies within analytics projects.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Construct methods to test user acceptance and usage of data.

Required Skills:
- Working SQL knowledge and experience with relational databases, query authoring (SQL), and familiarity with a variety of databases.
- Experience building and optimizing big data pipelines and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills for working with unstructured datasets.
- Good communication skills, a great team player, and someone with the hunger to learn newer ways of problem solving.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of Unix or shell scripting.
- Knowledge of predictive analytics tools and problem solving using statistical methods is a plus.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Demonstrated understanding of the Software Development Life Cycle.
- Ability to work independently and with a team in a diverse, fast-paced, and collaborative environment.
- Excellent written and verbal communication skills; a quick learner able to handle development tasks with minimal or no supervision; ability to multitask.

We are looking for a candidate with 7+ years of experience in a Data Engineering role. They should also have experience with the following software/tools:
- Python, Java, etc.
- Google Cloud Platform.
- Big data frameworks and tools: Apache Hadoop/Beam/Spark/Kafka.
- Workflow management and scheduling using Airflow/Prefect/Dagster.
- Databases such as BigQuery and ClickHouse.
- Container orchestration (Kubernetes).
- Optional: one or more BI tools (Tableau, Splunk, or equivalent).

Minimum Qualifications:
- 4+ years of IT-related work experience with a Bachelor's degree in Computer Engineering, Computer Science, Information Systems, or a related field; OR 6+ years of IT-related work experience without a Bachelor's degree.
- 2+ years of work experience with programming (e.g., Java, Python).
- 1+ year of work experience with SQL or NoSQL databases.
- 1+ year of work experience with data structures and algorithms.
- Bachelor's or Master's degree (or equivalent) in computer engineering or an equivalent stream; Bachelor's degree and 7+ years of Data Engineer / Software Engineer (Data) experience preferred.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.) Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 2 months ago

Apply

0.0 - 3.0 years

5 - 9 Lacs

Mumbai

Work from Office

Job Title: Sub Regional Manager
Experience: 0-3 Years
Location: Mumbai

RESPONSIBILITIES

Management responsibilities: The SRM will manage a team of 2 to 4 junior Trainer and Product Specialists. Responsibilities include:
- Manage a team of 2 to 4 junior Trainer and Product Specialists, coordinating their activities, scheduling and verifying their visits, and ensuring timely and accurate reporting of activities.
- Mentor newly recruited trainers, developing their product knowledge and facilitating their growth as trainers.
- Make visits not only to schools directly handled by you, but also periodic visits to schools handled by the trainers you manage. The purposes of these visits are varied and include quality control, relationship building, and ongoing training.

School support responsibilities:
- Deliver training on the program delivery and methodology to the teachers of a select number of high-profile/significant schools which have adopted the program.
- Conduct regular support visits to the assigned schools, monitor sessions, and provide feedback for improvement to the government or school management.
- Manage the schools' delivery and effectiveness of the program in the assigned geography and ensure positive feedback.
- Build relationships and maintain good rapport with the government department and functionaries.

Other responsibilities:
- Support the Sales team in making presentations to teachers, educators, and school decision makers to influence them to adopt path-breaking practices.
- Generate timely project reports and documents, ensuring effective communication between the company and the respective government or school partner(s).
- Coordinate activities such as impact assessment processes and other product-related research.

QUALIFICATIONS
The ideal candidate would have:
- A Master's in Social Work and/or a strong background in teaching and education, experience in the social or development sector, or experience in soft-skills training.
- Excellent communication and presentation skills.
- Excellent data management skills. The SRM would need to effectively manage information for anywhere between 50 and 200 schools.
- Prior experience of managing a team (of any size).
- Strong fluency in English and a regional language.
- Experience in project management and coordination preferred.

Due to the nature of the work, the applicant must be willing to travel extensively. The cost of travel and accommodation will be reimbursed as per the company's HR and Finance policy. KPEC is a fast-growing enterprise, and candidates who demonstrate passion and capabilities can expect substantial growth opportunities.

Posted 2 months ago

Apply

5.0 - 10.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Experience: 5-10 Years
Location: Bangalore

Data Engineers with PySpark and AWS Glue experience. AWS mandatory; GCP and Azure are add-ons.
- Proven experience as a Data Engineer or in a similar role in data architecture, database management, and cloud technologies.
- Proficiency in programming languages such as Python, Java, or Scala.
- Strong experience with data processing frameworks like PySpark, Apache Kafka, or Hadoop.
- Hands-on experience with data warehousing solutions such as Redshift, BigQuery, Snowflake, or similar platforms.
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL, etc.).
- Experience with version control tools like Git.
- Familiarity with containerization and orchestration tools like Docker, Kubernetes, and Airflow is a plus.
- Strong problem-solving skills, analytical thinking, and attention to detail.
- Excellent communication skills and ability to collaborate with cross-functional teams.

Certifications Needed: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent.

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Hybrid

Duration: 8 Months
Work Type: Onsite

Position Description: Looking for qualified Data Scientists who can develop scalable solutions to complex real-world problems using Machine Learning, Big Data, Statistics, and Optimization. Potential candidates should have hands-on experience in applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience in manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms. Candidates should have excellent problem-solving skills with an inquisitive mind to challenge existing practices. Candidates should have exposure to multiple programming languages and analytical tools and be flexible in using the requisite tools/languages for the problem at hand.

Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery

Experience Required: 3+ years of hands-on experience using machine learning/text mining tools and techniques such as clustering, classification, decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms (see the sketch below).

Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP), including Vertex AI, BigQuery, DBT, NoSQL databases, and the Hadoop ecosystem.

Education Required: Bachelor's Degree
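As an illustration of the classification techniques the posting lists (decision trees, random forests, etc.), here is a minimal scikit-learn sketch; the synthetic dataset and parameters are assumptions for demonstration only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real structured dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train a random forest and evaluate it on the held-out split.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```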

Posted 2 months ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

Hyderabad

Work from Office

You will be responsible for the uptime, performance, and operational cost of a large-scale cloud platform or some of our SaaS-based products. You will make daily and weekly operational decisions with the goal of improving uptime while reducing costs. You will drive improvements by gaining in-depth knowledge of the products in your care and applying the latest emerging trends in cloud and SaaS technologies. All your decisions will be focused on providing best-in-class service to the users of our SaaS products. Our organization relies on its central engineering workforce to develop and maintain a product portfolio of several different startups. As part of our engineering team, you'll get to work on several different products every quarter. Our product portfolio continuously grows as we incubate more startups, which means that different products are very likely to use different technologies, architectures, and frameworks - a fun place for smart tech lovers!

Candidate Requirements:
- 3 to 5 years of experience working in DevOps.
- In-depth knowledge of configuring and hosting services on Kubernetes.
- Hands-on experience in configuring and managing a service mesh like Istio.
- Experience working in production, AWS, cloud, Agile, CI/CD, and DevOps environments. We live in the cloud.
- Experience with Jenkins, Google Cloud Build, or similar.
- Good to have: experience using PaaS and SaaS services from AWS/Azure/GCP such as BigQuery, Cloud Storage, S3, etc.
- Good to have: experience configuring, scaling, and monitoring database systems like PostgreSQL, MySQL, MongoDB, and so on.

Posted 2 months ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Noida

Work from Office

Location: Noida (In-office/Hybrid; client site if required)
Type: Full-Time | Immediate Joiners Preferred

Must-Have Skills:
- GCP (BigQuery, Dataflow, Dataproc, Cloud Storage)
- PySpark / Spark
- Distributed computing expertise
- Apache Iceberg (preferred), Hudi, or Delta Lake

Role Overview: Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.

Key Responsibilities:
- Provide data engineering support to EMR platforms
- Design and implement cloud-native, automated data solutions
- Collaborate with internal teams to deliver scalable systems
- Continuously improve infrastructure reliability and observability

Technical Environment:
- Databases: Oracle, MySQL, MSSQL, MongoDB
- Distributed Engines: Spark/PySpark, Presto, Flink/Beam
- Cloud Infra: GCP (preferred), AWS (nice-to-have), Terraform
- Big Data Formats: Iceberg, Hudi, Delta
- Tools: SQL, Data Modeling, Palantir Foundry, Jenkins, Confluence
- Bonus: Stats/math tools (NumPy, PyMC3), Linux scripting

Ideal for engineers with cloud-native, real-time data platform experience, especially those who have worked with EMR and modern lakehouse stacks.

Posted 2 months ago

Apply

12.0 - 16.0 years

18 - 25 Lacs

Hyderabad

Remote

JD for Fullstack Developer
Experience: 10+ years

Front End Development:
- Design and implement intuitive and responsive user interfaces using React.js or similar front-end technologies.
- Collaborate with stakeholders to create a seamless user experience.
- Create mockups and UI prototypes for quick turnaround using Figma, Canva, or similar tools.
- Strong proficiency in HTML, CSS, JavaScript, and React.js.
- Experience with styling and graph libraries such as Highcharts, Material UI, and Tailwind CSS.
- Solid understanding of React fundamentals, including routing, the Virtual DOM, and Higher-Order Components (HOCs).
- Knowledge of REST API integration. Understanding of Node.js is a big advantage.

Middleware Development:
- Experience with REST API development, preferably using FastAPI (see the sketch below).
- Proficiency in programming languages like Python.
- Integrate APIs and services between front-end and back-end systems.
- Experience with Docker and containerized applications.

Back End Development:
- Experience with orchestration tools such as Apache Airflow or similar.
- Design, develop, and manage simple data pipelines using Databricks, PySpark, and Google BigQuery.
- Medium-level expertise in SQL.
- Basic understanding of authentication methods such as JWT and OAuth.

Bonus Skills:
- Experience working with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with Google BigQuery and Google APIs.
- Hands-on experience with Kubernetes for container orchestration.

Contact: Sandeep Nunna | Ph No: 9493883212 | Email: sandeep.nunna@clonutsolutions.com
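A minimal sketch of a FastAPI-based middleware endpoint of the kind the posting asks for, assuming FastAPI and uvicorn are installed; the routes and data model are hypothetical examples, not part of the job description.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Metric(BaseModel):
    name: str
    value: float


@app.get("/health")
def health() -> dict:
    # Simple liveness endpoint.
    return {"status": "ok"}


@app.post("/metrics")
def create_metric(metric: Metric) -> dict:
    # Placeholder: a real service would persist this to a database or queue.
    return {"received": metric.name, "value": metric.value}

# Run locally with: uvicorn main:app --reload  (assuming this file is saved as main.py)
```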

Posted 2 months ago

Apply

5.0 - 6.0 years

20 - 25 Lacs

Pune

Work from Office

Tableau Position

Mandatory Skills: Strong Tableau dashboard creation and SQL skills
Desired Skills: Experience with a data warehouse like BigQuery; Ad Tech domain knowledge
Work Location: Pune
Experience: 5-6 years of Tableau dashboard creation

Job Responsibilities:
- Understand requirements for dashboards and create effective dashboards in Tableau as per the client's UX standards. Strong in SQL and data warehouse concepts. Experience creating extracts and working on Tableau Server.
- Data Understanding: Collaborate with business stakeholders to understand their data requirements and reporting needs. Analyze complex datasets to identify key insights and trends.
- Data Preparation: Cleanse, transform, and prepare data for visualization. Use SQL queries to extract and manipulate data from various sources, such as databases, data warehouses, and cloud platforms.
- Dashboard Creation: Design and develop visually appealing and interactive dashboards using Tableau's drag-and-drop interface. Create custom calculations, parameters, and filters to provide dynamic insights. Format dashboards to adhere to client branding and UX standards.
- Data Storytelling: Present data in a clear and compelling manner, using charts, graphs, and other visualization techniques. Tailor dashboards to specific audiences, adjusting the level of detail and complexity.
- Performance Optimization: Optimize dashboard performance to ensure fast loading times and smooth interactions. Implement best practices for data extraction, calculation, and visualization.
- Collaboration: Work closely with data analysts, data engineers, and business users to deliver timely and accurate insights. Provide technical support and training to end-users.

Posted 2 months ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Pune

Work from Office

Mandatory skills:
1) SQL (Expert)
2) Data warehouse / ETL background
3) DBT (Intermediate); should be able to learn quickly

Desirable skills:
1) BigQuery
2) GitHub

Work Location: Pune
Employment Type: Contract (6 months or more)
Notice Period: Immediate to 15 days
Experience: 3-4 years of relevant experience

Roles and Responsibilities:
- Understand client requirements and create models using DBT on BigQuery on GCP
- Develop new functionality and maintain existing functionality

Posted 2 months ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-Based) || 5-7 years || Noida || 3 PM-12 AM IST shift
Location: Noida (In-office/Hybrid; client site if required)
Experience: 5-7 years
Type: Full-Time | Immediate Joiners Preferred
Shift: 3 PM to 12 AM IST
Client: Leading Canadian-based tech company
Good to have: GCP Certified Data Engineer

Overview of the role: As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.

Required Skills:
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
- Extensive experience in requirement discovery, analysis, and data pipeline solution design.
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others.
- Build modular code for reusable pipelines or complex ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources (see the sketch below).
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Coding experience in scripting and languages (Python, SQL, PySpark).
- Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
- Exposure to Google Dataproc and Dataflow.
- Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability.
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Experience with SAS/SQL Server/SSIS is an added advantage.

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.

Job Type: Full-time
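A minimal sketch of the kind of ingestion step such a pipeline might perform, loading files from Google Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, dataset, table, and GCS URI below are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load Parquet files landed in a bucket into a warehouse table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/landing/orders/*.parquet",   # hypothetical source files
    "my-project.analytics.orders",               # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(f"Loaded {load_job.output_rows} rows")
```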

Posted 2 months ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office

We're Hiring: Senior GCP Data Engineer (7+ Years Experience)
Location: Hyderabad (Work from Office - Mandatory)
Apply Now: sasidhar.m@technogenindia.com

Are you a passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and strong hands-on experience in data migration projects? Do you bring solid knowledge of Oracle to the table and thrive in a fast-paced, collaborative environment? TechnoGen India is looking for a Senior GCP Data Engineer to join our Hyderabad team. This is a full-time, on-site opportunity designed for professionals ready to take on challenging migration projects and deliver impactful solutions.

What We're Looking For:
- 7+ years of experience in Data Engineering
- Strong expertise in GCP (BigQuery, Dataflow, Pub/Sub, etc.)
- Proven experience in complex GCP migration projects
- Solid Oracle background (data extraction, transformation, and optimization)
- Ability to work full-time from our Hyderabad office

If you're ready to bring your skills to a growing team that values innovation and excellence, we want to hear from you!

Best Regards,
Sasidhar M | Sr IT Recruiter
sasidhar.m@technogenindia.com
www.technogenindia.com

Posted 2 months ago

Apply

5.0 - 9.0 years

8 - 18 Lacs

Bengaluru

Remote

Company: Forbes Advisory
Location: Remote
Role: Database Engineer
Experience: 5+ years
Notice period: Immediate joiners preferred, or less than 60 days
Major skill set: Python, SQL, OOPs, AWS RDS, and Google BigQuery

Posted 2 months ago

Apply

4.0 - 6.0 years

4 - 7 Lacs

Pune

Work from Office

Job Summary
We are looking for a Data Quality Engineer who will safeguard the integrity of our cloud-native data assets. You will design and execute automated and manual data-quality checks across structured and semi-structured sources on Azure and GCP, validating that our data pipelines deliver accurate, complete, and consistent datasets for analytics, reporting, and AI initiatives.

Key Responsibilities
- Define, build, and maintain data-quality frameworks that measure accuracy, completeness, timeliness, consistency, and validity of data ingested through ETL/ELT pipelines.
- Develop automated tests using SQL, Python, or similar tools; supplement with targeted manual validation where required (see the sketch below).
- Collaborate with data engineers to embed data-quality gates into CI/CD pipelines on Azure Data Factory / Synapse / Fabric and GCP Dataflow / Cloud Composer.
- Profile new data sources (structured and semi-structured: JSON, Parquet, Avro) to establish baselines, detect anomalies, and recommend cleansing or transformation rules.
- Monitor data-quality KPIs and publish dashboards/alerts that surface issues to stakeholders in near-real time.
- Conduct root-cause analysis for data-quality defects, propose remediation strategies, and track resolution to closure.
- Maintain comprehensive documentation of test cases, data-quality rules, lineage, and issue logs for audit and governance purposes.
- Partner with data governance, security, and compliance teams to ensure adherence to regulatory requirements.

Must-Have Skills
- 4-6 years of experience in data quality, data testing, or data engineering roles within cloud environments.
- Hands-on expertise with at least one major cloud data stack: Azure (Data Factory, Synapse, Databricks/Fabric) or GCP (BigQuery, Dataflow, Cloud Composer).
- Strong SQL skills and proficiency in a scripting language such as Python for building automated validation routines.
- Solid understanding of data-modeling concepts (dimensional, 3NF, data vault) and how they impact data-quality rules.
- Experience testing semi-structured data formats (JSON, XML, Avro, Parquet) and streaming/near-real-time pipelines.
- Excellent analytical and communication skills; able to translate complex data issues into clear, actionable insights for technical and business stakeholders.

Nice-to-Have Skills
- Familiarity with BI/reporting tools (Power BI, Looker, Tableau) for surfacing data-quality metrics.

Preferred Certifications
- Google Professional Data Engineer or Associate Cloud Engineer (GCP track), OR
- Microsoft Certified: Azure Data Engineer Associate

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field. Comparable professional experience will also be considered.

Why Join Us?
You will be the guardian of our data's trustworthiness, enabling decision-makers to rely on insights with confidence. If you are passionate about building automated, scalable data-quality solutions in a modern cloud environment, we'd love to meet you.
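A minimal sketch of automated data-quality checks in Python/pandas of the kind described above; the column names, freshness threshold, and sample data are illustrative assumptions, not part of the posting.

```python
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a mapping of check name -> True/False for basic quality dimensions."""
    return {
        # Completeness: the key column must not contain nulls.
        "no_null_order_ids": bool(df["order_id"].notna().all()),
        # Uniqueness: the primary key must be unique.
        "unique_order_ids": bool(df["order_id"].is_unique),
        # Validity: amounts must be non-negative.
        "non_negative_amounts": bool((df["amount"] >= 0).all()),
        # Timeliness: the newest record should be at most one day old.
        "fresh_data": (pd.Timestamp.now(tz="UTC") - df["loaded_at"].max()) <= pd.Timedelta(days=1),
    }


if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 3],
        "amount": [10.0, 25.5, 7.2],
        "loaded_at": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"], utc=True),
    })
    for check, passed in run_quality_checks(sample).items():
        print(check, "PASS" if passed else "FAIL")
```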

Posted 2 months ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Pune

Work from Office

Job Summary
We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
- Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency (see the sketch below).
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
- Experience working on Azure Cloud (Fabric, Synapse, Delta Lake)

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
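For the partitioning and clustering point above, a minimal sketch using the google-cloud-bigquery client to create a date-partitioned, clustered table; the project, dataset, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

table = bigquery.Table(
    "my-project.analytics.fact_orders",  # hypothetical table ID
    schema=[
        bigquery.SchemaField("order_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_ts", "TIMESTAMP"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Partition by day on the event timestamp and cluster by customer so that
# filtered queries scan less data, which lowers both latency and cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```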

Posted 2 months ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Required Skills: SQL, GCP (BigQuery, Composer, Dataflow), Big Data (Scala, Kafka)

You'll need to have:
- Experience in Big Data technologies: GCP / Composer / BigQuery / Dataflow
- Understanding business requirements and converting them into technical designs
- Working on data ingestion, preparation, and transformation
- Developing data streaming applications
- Debugging production failures and identifying solutions
- Working on ETL/ELT development
- Experience with data warehouse concepts and the data management life cycle

Posted 2 months ago

Apply

13.0 - 17.0 years

22 - 30 Lacs

Pune

Hybrid

Primary Skills: SQL (Data Analysis and Development)
Alternate Skills: Python, SharePoint, AWS, ETL, Telecom (especially the Fixed Network domain)
Location: Pune
Working Persona: Hybrid
Experience: 13 to 18 years

Core competencies, knowledge and experience:
Essential:
- Strong SQL experience; advanced level of SQL
- Excellent data interpretation skills
- Good knowledge of ETL and business intelligence; good understanding of a range of data manipulation and analysis techniques
- Working knowledge of large information technology development projects using methodologies and standards
- Excellent verbal, written, and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the business team and share ideas
- Strong analytical, problem-solving, and decision-making skills; the attitude to plan and organize work to deliver as agreed
- Ability to work under pressure to tight deadlines
- Hands-on experience working with large datasets
- Able to manage different stakeholders

Good to Have / Alternate Skills:
- Strong coding experience in Python

Experience:
- In-depth working experience in ETL
- Fixing problems in cooperation with internal and external partners (e.g. service owner, tech support team, IT Ops)
- Designing and implementing changes to the existing components of the data flow
- Developing and maintaining the end-to-end data flow
- Maintaining data quality, resolving data consistency issues, and supporting essential business-critical processes
- Conducting preventative maintenance of the systems
- Driving system optimization and simplification
- Responsible for the performance of the data flow and optimization of data preparation in conjunction with the other technical teams

Posted 2 months ago

Apply

5.0 - 7.0 years

12 - 13 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting.

The ideal candidate must have strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries. Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB is required, along with proficiency in data modeling and warehousing solutions like BigQuery (preferred), Redshift, or Snowflake. The candidate should have hands-on experience working with ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark, and proficiency in scripting languages such as PySpark, Python, or Scala (see the sketch below). Strong hands-on experience with Google Cloud Platform (GCP) is a must. Additionally, experience with visualization tools such as Google Looker Studio, LookerML, Power BI, or Tableau is preferred.

Good-to-have skills include exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics.

Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
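A minimal PySpark sketch of the kind of ELT transformation referenced above, assuming a Spark session with access to the (hypothetical) input and output paths; the column names are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_elt_sketch").getOrCreate()

# Read raw order events (hypothetical path and schema).
orders = spark.read.parquet("gs://my-bucket/raw/orders/")

# Aggregate to a daily revenue table for reporting.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "country")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Write the curated output for downstream loading into the warehouse.
daily_revenue.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_revenue/")
```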

Posted 2 months ago

Apply

11.0 - 16.0 years

27 - 32 Lacs

Noida

Work from Office

Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge of and experience in providing solutions to process massive datasets in real time and batch, using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge of and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.

Posted 2 months ago

Apply

5.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Role: GCP Cloud Solutions Engineer (IoT & Development)

As a GCP Cloud Solutions Engineer specializing in IoT and Development at Eximietas Design, you will be at the forefront of building and managing robust and scalable cloud infrastructure on the Google Cloud Platform. You will play a critical role in designing, deploying, and optimizing GCP services, with a particular focus on integrating and managing IoT devices and data, as well as supporting development workflows through automation and CI/CD pipelines. You will be responsible for ensuring the security, efficiency, and reliability of our cloud-based solutions. This role requires a strong understanding of GCP services, IoT principles, and automation practices, and a passion for building innovative solutions.

Key Responsibilities:
- Provision, configure, and manage Google Cloud Platform resources, including Compute Engine VM instances, networking components, storage solutions, and security configurations.
- Design and implement highly available and fault-tolerant GCP architectures.
- Monitor and optimize the performance and cost-effectiveness of GCP resources.
- Implement and manage security best practices within the GCP environment.
- Design, deploy, and manage BigQuery data warehouses for large-scale data analysis.
- Implement and manage Bigtable NoSQL databases for high-throughput, low-latency applications.
- Optimize query performance and data storage within BigQuery and Bigtable.
- Deploy, manage, and scale containerized applications using Google Kubernetes Engine (GKE).
- Implement Kubernetes best practices for orchestration, scaling, and resilience.
- Configure and manage networking, storage, and security within GKE clusters.
- Design and implement solutions for integrating various IoT devices with the GCP infrastructure.
- Utilize GCP IoT Core or other relevant services to manage device connectivity, security, and data ingestion.
- Develop data processing pipelines to handle large volumes of IoT data for storage and analysis.
- Ensure the security and integrity of IoT device data.
- Design and implement CI/CD pipelines using tools like Cloud Build, Jenkins, GitLab CI/CD, or similar for automated application deployment and infrastructure provisioning.
- Automate infrastructure provisioning and management tasks using Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager.
- Develop and maintain scripts for automation of routine operational tasks.
- Implement comprehensive monitoring and logging solutions for GCP services and applications.
- Proactively identify and troubleshoot performance bottlenecks and operational issues.
- Participate in on-call rotations as needed to ensure system availability.
- Collaborate effectively with development teams, data scientists, and other stakeholders to understand their requirements and provide cloud solutions.
- Create and maintain clear and concise documentation for cloud infrastructure, configurations, and processes.
- Stay up to date with the latest GCP services, features, and best practices, particularly in the areas of IoT and development.
- Evaluate and recommend new technologies and approaches to improve our cloud infrastructure and processes.

Skills & Qualifications:
- Proven hands-on experience in deploying, managing, and optimizing core GCP services, including Compute Engine, VPC, Cloud Storage, and IAM.
- Deep understanding of and practical experience with BigQuery for data warehousing and analysis.
- Hands-on experience with Bigtable for NoSQL database solutions.
- Strong experience in deploying, managing, and scaling applications using Kubernetes (GKE).
- Experience connecting and managing IoT devices with cloud platforms (preferably GCP IoT Core).
- Understanding of IoT protocols (e.g., MQTT, CoAP) and data handling at scale.
- Proficiency in implementing CI/CD pipelines using relevant tools (e.g., Cloud Build, Jenkins).
- Strong experience with Infrastructure-as-Code (IaC) tools, preferably Terraform or Cloud Deployment Manager.
- Scripting skills in languages such as Python, Bash, or Go for automation tasks.
- Solid understanding of networking principles and GCP networking services (VPC, Firewall Rules, Load Balancing, Cloud DNS).
- Knowledge of cloud security best practices and experience implementing security controls within GCP (IAM, Security Command Center).
- Familiarity with Linux operating systems and the command-line interface.
- Excellent analytical and problem-solving skills with the ability to diagnose and resolve complex technical issues.
- Strong written and verbal communication skills with the ability to effectively communicate technical concepts to both technical and non-technical audiences.
- Ability to work effectively in a collaborative team environment.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 3-5 years of hands-on experience managing and implementing solutions on the Google Cloud Platform.
- GCP certifications (e.g., Google Cloud Certified - Professional Cloud Architect, Google Cloud Certified - Professional Cloud Engineer) are a significant plus.
- Experience working in an Agile development environment is preferred.

Posted 2 months ago

Apply

6.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Role: Senior Data Analyst
Experience: 6 to 10 years
Location: Bangalore, Pune, Hyderabad, Gurgaon, Noida
Notice: Immediate joiners only

About The Role: Data Analyst with EDA (Exploratory Data Analysis), communication, strong hands-on SQL, documentation, GCP, and data pipeline experience.

Requirements:
- 8+ years of experience in data mining, working with large relational databases using advanced data extraction and manipulation tools (for example, BigQuery, Teradata, etc.), with both structured and unstructured data.
- Excellent communication skills, both written and verbal; able to explain solutions and problems in a clear and concise manner.
- Experience conducting business analysis to capture requirements from non-technical partners.
- Superb analytical and conceptual thinking skills; able not only to manipulate data but also to derive relevant interpretations from it.
- Proven knowledge of the data management lifecycle, including experience with data quality and metadata management.
- Hands-on experience in Computer Science, Statistics, Mathematics, or Information Systems.
- Experience in cloud and GCP BigQuery, including but not limited to complex SQL querying.
- 1-2 years of experience/exposure to the following:
  1. CI/CD release processes using GitLab, Jira, and Confluence.
  2. Creating YAML files and understanding unstructured data such as JSON.
  3. Looker Studio and Dataplex (a plus).
- Hands-on engineering experience is an asset.
- Exposure to Python or Java is nice to have.

Posted 2 months ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office

Job Summary
We are seeking a highly analytical and detail-oriented Data Specialist with deep expertise in SQL, Python, statistics, and automation. The ideal candidate will be responsible for designing robust data pipelines, analyzing large datasets, driving insights through statistical methods, and automating workflows to enhance data accessibility and business decision-making.

Key Responsibilities
- Write and optimize complex SQL queries for data extraction, transformation, and reporting (see the sketch below).
- Develop and maintain Python scripts for data analysis, ETL processes, and automation tasks.
- Conduct statistical analysis to identify trends, anomalies, and actionable insights.
- Build and manage automated dashboards and data pipelines using tools such as Airflow, Pandas, or Apache Spark.
- Collaborate with cross-functional teams (product, engineering, business) to understand data needs and deliver scalable solutions.
- Implement data quality checks and validation procedures to ensure accuracy and consistency.
- Support machine learning model deployment and performance tracking (if applicable).
- Document data flows, models, and processes for internal knowledge sharing.

Key Requirements
- Strong proficiency in SQL (joins, CTEs, window functions, performance tuning).
- Solid experience with Python (data manipulation using Pandas, NumPy, scripting, and automation).
- Applied knowledge of statistics (hypothesis testing, regression, probability, distributions).
- Experience with data automation tools (Airflow, dbt, or equivalent).
- Familiarity with data visualization tools (Tableau, Power BI, or Plotly) is a plus.
- Understanding of data warehousing concepts (e.g., Snowflake, BigQuery, Redshift).
- Strong problem-solving skills and the ability to work independently.

Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Exposure to cloud platforms like AWS, GCP, or Azure.
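A minimal sketch combining the SQL skills listed above (CTEs and window functions) with Python; it uses an in-memory SQLite table purely for demonstration, and the table and column names are assumptions.

```python
import sqlite3

import pandas as pd

conn = sqlite3.connect(":memory:")

# Hypothetical orders table.
pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2],
    "order_date": ["2024-01-01", "2024-01-05", "2024-01-02", "2024-01-03", "2024-01-10"],
    "amount": [100, 50, 75, 20, 60],
}).to_sql("orders", conn, index=False)

# CTE + window functions: rank each customer's orders and keep a running total.
query = """
WITH ranked AS (
    SELECT customer_id,
           order_date,
           amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date) AS order_rank,
           SUM(amount)  OVER (PARTITION BY customer_id ORDER BY order_date) AS running_total
    FROM orders
)
SELECT * FROM ranked WHERE order_rank <= 2
"""
print(pd.read_sql_query(query, conn))
```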

Posted 2 months ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Category: Technology

Shuru is a technology-consulting company that embeds senior product and engineering teams into fast-growing companies worldwide to accelerate growth and de-risk strategy. Our work is global, high-stakes, and unapologetically business-first.

Role Overview
You'll join a lean, senior-only business intelligence team as a Senior Data Analyst who will sit shoulder-to-shoulder with our clients, operating as their in-house analytics brain-trust. Your mandate: design the data questions worth asking, own the pipelines that answer them, and convert findings into clear, bottom-line actions. If you need daily direction, this isn't for you. If you see a vague brief as oxygen, read on.

Key Responsibilities
- Frame the right questions: translate ambiguous product or commercial goals into testable hypotheses, selecting the metrics that truly explain user behaviour and unit economics.
- Own data end-to-end: model, query, and transform data in SQL and dbt, pushing to cloud warehouses such as Snowflake/BigQuery, with zero babysitting.
- Build self-service BI: deliver dashboards in Metabase/Looker that non-technical stakeholders can tweak without coming back to you every week.
- Tell unforgettable stories: turn complex analyses into visuals and narratives that drive decisions in the C-suite and on the sprint board.
- Guard the data moat: champion data governance, privacy, and quality controls that scale across multiple client engagements.
- Mentor & multiply: level up engineers and product managers on analytical thinking, setting coding and insight standards for future analysts.

Requirements
Must-Have Skills & Experience
- Minimum experience of 3 years.
- Core analytics: expert SQL; comfort with Python or R for advanced analysis; solid grasp of statistical inference and experimentation.
- Modern data stack: hands-on with dbt, Snowflake/BigQuery/Redshift, and at least one orchestration tool (Airflow, Dagster, or similar).
- BI & visualisation: proven delivery in Metabase, Looker, or Tableau (including performance tuning for big data models).
- Product & growth metrics: demonstrated ability to define retention, activation, and LTV/payback KPIs for SaaS or consumer-tech products.
- Communication: relentless clarity; you can defend an insight to both engineers and the CFO, and change course when the data disproves you.
- Independence: history of thriving with "figure it out" briefs and distributed teams across time zones.

Bonus Points
- Feature-flag experimentation at scale (e.g., Optimizely, LaunchDarkly).
- Familiarity with privacy-enhancing tech (differential privacy, data clean rooms).

Benefits
- Work on international projects: execute with founders and execs from around the globe, stacking your playbook fast.
- Regular team outings: we fund quarterly off-sites and virtual socials to keep the remote vibe human.
- Collaborative & growth-oriented: learn directly from CXOs, leads, and seasoned PMs; no silos, no artificial ceilings.
- Competitive salary & benefits: benchmarked around the 90th percentile for similar-stage firms, plus performance upside.

Posted 2 months ago

Apply