
1086 BigQuery Jobs - Page 15

JobPe aggregates results for easy access, but you apply directly on each job portal.

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description
This role is for a Senior Business Functional Analyst for Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers, aligning the target data architecture with the enterprise data architecture principles and applying agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your key responsibilities
Data Architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks and design the solution's target architecture.
AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage and data quality. Embed AI-powered data quality, anomaly detection and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements that AI needs enabled.
GCP Data Architecture & Migration: Strong working experience in GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), plus an appropriate GCP architecture-level certification. Experience handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g. GDPR) and security and access control. Experience developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, with good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
Collaboration: Collaborate with stakeholders, including business leaders, project managers and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience
Demonstrable experience in designing and deploying AI tooling architectures and use cases. Extensive experience in data architecture within Financial Services. Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart, caching patterns and policy-based fine-grained data access. Proven experience with data management principles, data governance, data quality, data lineage and data integration, with a focus on Data Mesh. Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards. High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance. Proficiency in data modelling and experience with different data modelling tools. Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest. Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

17 - 30 Lacs

Hyderabad, Bengaluru

Work from Office

Role & responsibilities
Primary skills: Java, Spring or Micronaut frameworks, CI/CD, GCP (must-have), BigQuery, Terraform, Pub/Sub, deploying data processing workflows via Terraform, JDBC.
Experience designing and implementing scalable data architecture. Good to have: knowledge of Cloud SQL, Pub/Sub and Analytics Hub. Experience with BigQuery or a similar data warehouse. Experience with Java ingestion connectors running on-premises or on GCP. Extensive knowledge of Terraform deployment. Proficiency in data processing from Pub/Sub to BigQuery to Analytics Hub. Designing data schemas to align with BigQuery-native structures. Optimization and testing for production-level loads. Publishing transaction data into BigQuery. Identifying and mitigating bottlenecks in ingestion and data processing workflows. Optimizing ingestion adapters and data processing components for performance and resiliency.
Java Engineer: general Java development skills; reviewing, creating and updating connectors; Spring/Micronaut frameworks; general DevOps skills and tools: CI/CD (Jenkins), Cloud Build, Git, GitOps, Terraform; JDBC.
Experience: 7+ years. Notice period: immediate to 15 days. Contact: Mok@teksystems.com
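As a sketch of the Pub/Sub-to-BigQuery step this posting describes ("designing data schemas to align with BigQuery-native structures"), here is a minimal, framework-free Python illustration. The field names and target schema are invented for the example, not taken from the posting:

```python
import json
from datetime import datetime, timezone

# Hypothetical target schema for a BigQuery-native transactions table:
# transaction_id STRING, amount NUMERIC, currency STRING, event_ts TIMESTAMP
REQUIRED_FIELDS = {"transaction_id", "amount", "currency", "event_ts"}

def to_bigquery_row(message_data: bytes) -> dict:
    """Validate a Pub/Sub-style JSON payload and shape it into a row dict.

    Raises ValueError on missing fields so bad records can be routed to a
    dead-letter topic instead of poisoning the load job.
    """
    record = json.loads(message_data)
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {
        "transaction_id": str(record["transaction_id"]),
        "amount": round(float(record["amount"]), 2),
        "currency": record["currency"].upper(),
        # BigQuery TIMESTAMP columns accept RFC 3339 strings.
        "event_ts": datetime.fromtimestamp(
            record["event_ts"], tz=timezone.utc
        ).isoformat(),
    }

payload = json.dumps({
    "transaction_id": "t-1001",
    "amount": "49.9900",
    "currency": "inr",
    "event_ts": 1700000000,
}).encode()
print(to_bigquery_row(payload))
```

In a real pipeline the returned dict would go to a streaming insert or a batch load job; the validate-then-shape split is the part that carries over regardless of the loading mechanism.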

Posted 2 weeks ago

Apply

5.0 - 8.0 years

25 - 40 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 25 to 40 LPA. Experience: 5 to 11 years. Location: Gurgaon / Bangalore / Pune / Chennai. Notice: immediate to 30 days.
Key Responsibilities & Skillsets:
Common skillsets: 5+ years of experience in analytics, PySpark, Python, Spark, SQL and associated data engineering jobs. Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy and pandas. Presales experience. Experience with Gen AI POCs. Excellent communication and presentation skills, including for client interaction. Experience managing Python code and collaborating with customers on model evolution. Good knowledge of database management and Hadoop/Spark, SQL, Hive, Python (expertise). Superior analytical and problem-solving skills. Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision.
Data management skillsets: Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred. Strong grasp of advanced SQL functionality (joins, nested queries and procedures). Strong ability to translate functional specifications and requirements into technical requirements.
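The "advanced SQL functionality (joins, nested queries...)" line can be made concrete with a tiny in-memory example using Python's stdlib sqlite3. The tables and values are invented for illustration:

```python
import sqlite3

# Illustrative schema: orders joined to customers, filtered by a subquery.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'north'), (2, 'south');
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 80.0);
""")

# Per region, total of the orders that exceed the overall average order
# value -- a join plus a nested (scalar sub-) query in one statement.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE o.amount > (SELECT AVG(amount) FROM orders)
    GROUP BY c.region
    ORDER BY total DESC
""").fetchall()
print(rows)
```

The same join-plus-subquery shape transfers directly to Hive or Spark SQL; only the connection boilerplate changes.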

Posted 2 weeks ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are looking for an experienced Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a get-things-done attitude and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us.
Your key responsibilities: Responsible for change management planning, execution and reporting, adhering to governance standards and ensuring transparency around progress status. Using data to tell the story, maintain risk management controls, and monitor and communicate initiative risks. Collaborate with other departments as required to execute on timelines to meet the strategic goals. As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement of adoption success measures and continuous improvement. As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision-making and progress transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to success and carry learnings forward to future projects. As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures and required controls, and gather and document business requirements (user stories), including liaising with end users and performing analysis of gathered data.
Heavily involved in the product development journey.
Your skills and experience: Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment. Banking/finance/regulated-industry experience, of which at least 2 years should be in the change/transformation space or associated with change/transformation initiatives, is a plus. Knowledge of client lifecycle processes and procedures and experience with KYC data structures/data flows is preferred. Experience working with management reporting is preferred. Bachelor's degree.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.
Key Responsibilities: Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight). Work with structured and unstructured data to perform data transformation, cleansing, and aggregation. Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow). Optimize PySpark jobs with performance tuning, partitioning, and caching strategies. Design and implement real-time and batch data processing solutions. Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates. Ensure data security, governance, and compliance with industry best practices. Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models. Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization. Perform unit testing and validation to ensure data integrity and reliability.
Required Skills & Qualifications: 6+ years of experience in big data processing, ETL, and data engineering. Strong hands-on experience with PySpark (Apache Spark with Python). Expertise in SQL, the DataFrame API, and RDD transformations. Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL). Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow). Proficiency in writing optimized queries, partitioning, and indexing for performance tuning. Experience with workflow orchestration tools like Airflow, Oozie, or Prefect. Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines. Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.). Excellent problem-solving, debugging, and performance optimization skills.
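The "incremental updates" via Delta Lake/Iceberg/Hudi that the posting mentions all boil down to keyed MERGE (upsert/delete) semantics. Here is a framework-free sketch of that merge logic; the record shape and the `op` field are invented for the example:

```python
# Apply a batch of change records to a keyed target "table" (a dict keyed
# by primary key), mimicking MERGE semantics: delete removes the key,
# anything else upserts (insert or partial update).
def merge_incremental(target: dict, updates: list[dict], key: str = "id") -> dict:
    merged = dict(target)
    for rec in updates:
        k = rec[key]
        if rec.get("op") == "delete":
            merged.pop(k, None)
        else:
            # Merge new fields over any existing row; drop the control field.
            merged[k] = {**merged.get(k, {}),
                         **{f: v for f, v in rec.items() if f != "op"}}
    return merged

base = {1: {"id": 1, "status": "new"}, 2: {"id": 2, "status": "new"}}
batch = [
    {"id": 1, "status": "shipped"},   # update
    {"id": 2, "op": "delete"},        # delete
    {"id": 3, "status": "new"},       # insert
]
result = merge_incremental(base, batch)
print(sorted(result))
```

In Delta Lake or Iceberg the same intent is expressed as a `MERGE INTO ... WHEN MATCHED/NOT MATCHED` statement; the dict version above just makes the per-key decision table explicit.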

Posted 3 weeks ago

Apply

6.0 - 8.0 years

37 - 40 Lacs

Kochi, Hyderabad, Coimbatore

Work from Office

6+ years of experience in data engineering / warehousing, with at least 2+ years in BigQuery and GCP. Strong expertise in SQL query optimization, BigQuery scripting, and performance tuning. Hands-on experience with ETL/ELT tools like Cloud Dataflow (Apache Beam), Cloud Composer (Airflow), dbt, Talend, Matillion, or Informatica IICS. Experience with Cloud Storage, Pub/Sub, and Dataflow for real-time and batch data ingestion. Proficiency in Python or Java for scripting and data processing tasks. Experience with semi-structured data (JSON, Avro, Parquet) and BigQuery ingestion methods. Familiarity with CI/CD pipelines, Terraform, Git, and Infrastructure as Code (IaC). Strong understanding of data governance, security policies, and compliance standards in GCP. Experience working in Agile/Scrum environments and following DevOps practices.
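The "semi-structured data (JSON, Avro, Parquet) and BigQuery ingestion methods" requirement above can be illustrated with the newline-delimited JSON (NDJSON) format that BigQuery's JSON load jobs consume. A minimal stdlib-only sketch, with invented field names:

```python
import io
import json

# Semi-structured records with a nested, repeated field ("events") that
# would map to a BigQuery RECORD/REPEATED column without flattening.
records = [
    {"user": "a", "events": [{"type": "click"}, {"type": "view"}]},
    {"user": "b", "events": []},
]

def to_ndjson(rows) -> str:
    """Serialize rows as one compact JSON object per line (NDJSON)."""
    buf = io.StringIO()
    for row in rows:
        buf.write(json.dumps(row, separators=(",", ":")) + "\n")
    return buf.getvalue()

ndjson = to_ndjson(records)
print(ndjson, end="")
```

The resulting string (or a file of it) is what a `bq load --source_format=NEWLINE_DELIMITED_JSON` style ingestion expects; streaming inserts take the same per-row dicts directly.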

Posted 3 weeks ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.
Minimum qualifications: Bachelor's degree in Computer Science or equivalent practical experience. Experience in automating infrastructure provisioning, Developer Operations (DevOps), integration, or delivery. Experience in networking, compute infrastructure (e.g., servers, databases, firewalls, load balancers) and architecting, developing, or maintaining cloud solutions in virtualized environments. Experience in scripting with Terraform and Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or Site Reliability Engineering.
Preferred qualifications: Certification in Cloud with experience in Kubernetes, Google Kubernetes Engine, or similar. Experience with customer-facing migration including service discovery, assessment, planning, execution, and operations. Experience with IT security practices like identity and access management, data protection, encryption, certificate and key management. Experience with Google Cloud Platform (GCP) techniques like prompt engineering, dual encoders, and embedding vectors. Experience in building prototypes or applications. Experience in one or more of the following disciplines: software development, managing operating system environments (Linux or related), network design and deployment, databases, storage systems.
About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Responsibilities: Provide domain expertise in cloud platforms and infrastructure to solve cloud platform challenges. Work with customers to design and implement cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives. Be a technical advisor and perform troubleshooting to resolve technical challenges for customers. Create and deliver best-practice recommendations, tutorials, blog articles, and sample code. Travel up to 30% in-region for meetings, technical reviews, and onsite delivery activities.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India.
Minimum qualifications: Bachelor's degree in Computer Science or equivalent practical experience. Experience in automating infrastructure provisioning, Developer Operations (DevOps), integration, or delivery. Experience in networking, compute infrastructure (e.g., servers, databases, firewalls, load balancers) and architecting, developing, or maintaining cloud solutions in virtualized environments. Experience in scripting with Terraform and Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or Site Reliability Engineering.
Preferred qualifications: Certification in Cloud with experience in Kubernetes, Google Kubernetes Engine, or similar. Experience with customer-facing migration including service discovery, assessment, planning, execution, and operations. Experience with IT security practices like identity and access management, data protection, encryption, certificate and key management. Experience with Google Cloud Platform (GCP) techniques like prompt engineering, dual encoders, and embedding vectors. Experience in building prototypes or applications. Experience in one or more of the following disciplines: software development, managing operating system environments (Linux or related), network design and deployment, databases, storage systems.
About the job: The Google Cloud Consulting Professional Services team guides customers through the moments that matter most in their cloud journey to help businesses thrive. We help customers transform and evolve their business through the use of Google's global network, web-scale data centers, and software infrastructure. As part of an innovative team in this rapidly growing business, you will help shape the future of businesses of all sizes and use technology to connect with customers, employees, and partners. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Responsibilities: Provide domain expertise in cloud platforms and infrastructure to solve cloud platform challenges. Work with customers to design and implement cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives. Be a technical advisor and perform troubleshooting to resolve technical challenges for customers. Create and deliver best-practice recommendations, tutorials, blog articles, and sample code. Travel up to 30% in-region for meetings, technical reviews, and onsite delivery activities.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

7 - 11 Lacs

Bengaluru

Work from Office

We're Hiring: Python + GCP Engineer (Dataiku)
Experience: 6-8 Years. Location: Bangalore / Chennai / Gurugram. Company: Derisk360
Are you passionate about building data solutions on the cloud and experienced in Python and GCP BigQuery? Join our team to build scalable pipelines, automate processes, and enable smarter decision-making with Dataiku and modern cloud tooling.
What You'll Do: Build and manage data workflows in Dataiku, including partitioned datasets. Develop and optimize Python scripts for complex data transformations using pandas, NumPy, and regex. Integrate data sources and orchestrate transformations using Google Cloud Platform (BigQuery). Use Terraform to manage infrastructure and deployment of code changes in a scalable, automated fashion. Collaborate with data scientists, analysts, and DevOps to deliver end-to-end data solutions.
What You Bring: 2+ years of hands-on experience with Dataiku, particularly working with partitioned datasets. Strong command of Python, especially for data handling using pandas and NumPy. Practical knowledge of regular expressions (regex) for data cleaning and transformation. Proficiency with GCP BigQuery for querying and managing large datasets. Experience using Terraform to manage cloud infrastructure and code deployments.
Nice to Have: Exposure to CI/CD practices in data workflows. Prior experience in data governance, metadata management, or MLOps environments.
What You'll Get: Competitive compensation and hybrid flexibility. Opportunity to work on large-scale, impactful cloud data projects. Collaborative culture with a strong focus on technical excellence and learning. Access to cutting-edge tools in the data engineering ecosystem.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Pune

Work from Office

Job Title: Lead Engineer. Location: Pune. Corporate Title: Director.
As a lead engineer within the Transaction Monitoring department, you will lead and drive forward critical engineering initiatives and improvements to our application landscape whilst supporting and leading the engineering teams to excel in their roles. You will be closely aligned to the architecture function and delivery leads, ensuring alignment with planning and that correct design and architecture governance is followed for all implementation work. You will lead by example and drive and contribute to automation and innovation initiatives with the engineering teams. Join the fight against financial crime with us!
Your key responsibilities: Experienced hands-on cloud and on-premise engineer, leading by example with engineering squads. Thinking analytically, with a systematic and logical approach to solving complex problems, and high attention to detail. Design and document complex technical solutions at varying levels in an inclusive and participatory manner with a range of stakeholders. Liaise and face off directly with stakeholders in technology, business and modelling areas. Collaborate with application development teams to design and prototype solutions (both on-premises and on-cloud), supporting and presenting these via the Design Authority forum for approval and providing good practice and guidelines to the teams. Ensure engineering and architecture compliance with bank-standard processes for deploying new applications, working directly with central functions such as Group Architecture, Chief Security Office and Data Governance. Innovate and think creatively, showing willingness to apply new approaches to solving problems and to learn new methods, technologies and potentially outside-of-the-box solutions.
Your skills and experience: Proven hands-on engineering and design experience in a delivery-focused (preferably agile) environment. Solid technical/engineering background, preferably with at least two high-level languages and multiple relational databases or big-data technologies. Proven experience with cloud technologies, preferably GCP (GKE / Dataproc / Cloud SQL / BigQuery), GitHub and Terraform. Competence and expertise in technical skills across a wide range of technology platforms, and the ability to use and learn new frameworks, libraries and technologies. A deep understanding of the software development life cycle and the waterfall and agile methodologies. Experience leading complex engineering initiatives and engineering teams. Excellent communication skills, with demonstrable ability to interface and converse at both junior and senior levels and with non-IT staff. Line management experience, including working in a matrix management configuration.
How we'll support you: Training and development to help you excel in your career. Flexible working to assist you balance your personal priorities. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

11 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Technical and Professional: Technology - Cloud Platform - GCP Database - Google BigQuery.
Skills: Technology - Cloud Platform - GCP Core Services; Technology - Cloud Platform - Azure DevOps - data on cloud - GCP; Technology - Cloud Platform - GCP App Development.
Experience: 2+ years. Location: PAN India.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Minimum qualifications: Bachelor's degree or equivalent practical experience. 5 years of experience with software development in one or more programming languages, and with data structures/algorithms. 3 years of experience testing, maintaining or launching software products. 1 year of experience with software design and architecture. Experience in Web3.0 technologies and concepts such as blockchains, digital assets, dapps, cryptocurrency.
Preferred qualifications: Experience with Google Cloud technologies (e.g., Kubernetes, BigQuery, Cloud SQL, Cloud ESF, etc.). Experience with blockchain-related technologies. Experience building highly available, real-time distributed systems. Experience with GenAI, AI Agents, or related technologies.
About the job: Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full stack as we continue to push technology forward. Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Responsibilities: Build a novel Cloud Web3 assistant product powered by an AI thinking agent and a suite of tools. Design and implement core components of the Google Cloud Web3 portal, Blockchain RPC, faucets, validators, datasets, and more, ensuring availability and security. Write reliable, efficient and testable software; take ownership of projects, meet deadlines and deliver high-quality work, including documentation and design contributions. Collaborate effectively within a globally distributed team. Collaborate with peers and stakeholders through design and code reviews to ensure best practices amongst available technologies (e.g., style guidelines, checking code in, accuracy, testability, and efficiency).
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Bengaluru

Work from Office

Experience: 4-6 Years. Location: Bangalore (Hybrid). Shift: Night Shift. Employment Type: Full-time.
About the Role: We are seeking a skilled and motivated Analytics Engineer with 4-6 years of experience to join our data team in Bangalore. The ideal candidate will possess a strong mix of data engineering, analytics, and stakeholder collaboration skills. You will play a key role in designing scalable data solutions and enabling data-driven decision-making across the organization.
Key Responsibilities: Collaborate with business and technical stakeholders to gather requirements and deliver analytics solutions. Design and implement scalable data models using star schema and dimensional modeling approaches. Develop and optimize ETL pipelines using Apache Spark for both batch and real-time processing (experience with Apache Pulsar is preferred). Write efficient, production-grade Python scripts and advanced SQL queries for data transformation and analysis. Manage workflows and ensure data pipeline reliability using tools like Airflow, dbt, or similar orchestration frameworks. Implement best practices in data quality, testing, and observability across all data layers. Work with cloud-native data lakes/warehouses such as Redshift, BigQuery, Cassandra, and cloud storage platforms (S3, Azure Blob, GCS). Leverage relational databases such as PostgreSQL/MySQL for operational data tasks.
Nice to Have: Exposure to containerization technologies like Docker and Kubernetes for scalable deployment. Experience working in cloud-native analytics ecosystems.
Required Skills: Strong experience in data modeling, ETL development, and data warehouse design. Proven expertise in Python and SQL. Hands-on experience with Apache Spark (ETL tuning), Airflow, dbt, or similar tools. Practical knowledge of data quality frameworks, monitoring, and data observability. Familiarity with both batch and streaming data architectures.
What We Offer: Opportunity to work on cutting-edge data platforms. Collaborative and inclusive team culture. Competitive salary and benefits. Career growth in the modern data engineering space.
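The "star schema and dimensional modeling" requirement above can be sketched with one fact table keyed into dimension tables by surrogate keys. Table and column names here are invented for illustration:

```python
# Minimal star schema: a sales fact table referencing a date dimension and
# a product dimension through surrogate keys.
dim_date = {1: {"date": "2024-01-01", "quarter": "Q1"}}
dim_product = {10: {"name": "widget", "category": "tools"}}
fact_sales = [
    {"date_key": 1, "product_key": 10, "units": 3, "revenue": 29.97},
    {"date_key": 1, "product_key": 10, "units": 1, "revenue": 9.99},
]

def revenue_by(dimension_attr: str) -> dict:
    """Aggregate fact rows by an attribute resolved through dim_product,
    i.e. the dict equivalent of a fact-to-dimension join plus GROUP BY."""
    totals: dict = {}
    for row in fact_sales:
        attr = dim_product[row["product_key"]][dimension_attr]
        totals[attr] = round(totals.get(attr, 0.0) + row["revenue"], 2)
    return totals

print(revenue_by("category"))
```

In a warehouse the same shape becomes `fact_sales JOIN dim_product ON product_key GROUP BY category`; keeping descriptive attributes in dimensions is what lets the fact table stay narrow and fast to scan.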

Posted 3 weeks ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Pune, Chennai, Bengaluru

Hybrid

Educational qualifications: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Computer Applications, Master of Technology, Master of Engineering, Master of Computer Applications, Master of Science.
Service Line: Engineering Services.
Responsibilities: Master's degree in Computer Science, Statistics, Mathematics, or a related field. 3+ years of experience in data science and machine learning with a strong focus on model development and deployment. Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference. In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, and decision trees.
Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy and cross-industry thought leadership. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Knowledge of two or three industry domains. Understanding of the financial processes for various types of projects and the various pricing models available. Client-interfacing skills. Knowledge of SDLC and agile methodologies. Project and team management.
Technical and Professional: Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques. Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD). Experience with data warehousing and data lake solutions like BigQuery or Snowflake. Familiarity with real-time data processing and streaming analytics. Passion for learning and staying at the forefront of data science and machine learning advancements.
Preferred Skills: Technology - Analytics Techniques - Cluster Analysis, Decision Trees, Linear Regression; Technology - Machine Learning - Python.

Posted 3 weeks ago

Apply

6.0 - 9.0 years

9 - 15 Lacs

Bengaluru

Hybrid

Job Description: We are hiring a Java Developer with strong GCP experience, or a GCP Engineer proficient in Java. The candidate should be capable of developing scalable cloud-native applications using Google Cloud services. Key Skills: Java, Spring Boot, RESTful APIs; Google Cloud Platform (GCP); Cloud Functions, Pub/Sub, BigQuery (preferred); CI/CD, Docker, Kubernetes.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Key Responsibilities: • Design, develop, and optimize data models within the Celonis Execution Management System (EMS). • Extract, transform, and load (ETL) data from flat files and UDP into Celonis. • Work closely with business stakeholders and data analysts to understand data requirements and ensure accurate representation of business processes. • Develop and optimize PQL (Process Query Language) queries for process mining. • Collaborate with group data engineers, architects, and analysts to ensure high-quality data pipelines and scalable solutions. • Perform data validation, cleansing, and transformation to enhance data quality. • Monitor and troubleshoot data integration pipelines, ensuring performance and reliability. • Provide guidance and best practices for data modeling in Celonis. Qualifications & Skills: • 5+ years of experience in data engineering, data modeling, or related roles. • Proficiency in SQL, ETL processes, and database management (e.g., PostgreSQL, Snowflake, BigQuery, or similar). • Experience working with large-scale datasets and optimizing data models for performance. • Data management experience that spans the data lifecycle and critical functions (e.g., data profiling, data modeling, data engineering, data consumption products and services). • Strong problem-solving skills and ability to work in an agile, fast-paced environment. • Excellent communication skills and demonstrated hands-on experience communicating technical topics with non-technical audiences. • Ability to effectively collaborate and manage the timely completion of assigned activities while working in a highly virtual team environment. • Excellent collaboration skills to work with cross-functional teams.
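The flat-file ETL and data-validation duties described in this posting reduce to a load-validate-normalize step. A minimal sketch, not Celonis-specific and with hypothetical column names, might look like:

```python
import csv
import io

def extract_validate_transform(csv_text, required=("case_id", "activity", "timestamp")):
    """Load a flat file, drop rows missing required fields, and normalize values.

    Returns (clean_rows, rejected_count) so a pipeline monitor can track data quality.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    clean, rejected = [], 0
    for row in reader:
        # Validation: every required field must be present and non-blank.
        if any(not (row.get(f) or "").strip() for f in required):
            rejected += 1
            continue
        # Transformation: trim whitespace and normalize activity names for the data model.
        clean.append({
            "case_id": row["case_id"].strip(),
            "activity": row["activity"].strip().upper(),
            "timestamp": row["timestamp"].strip(),
        })
    return clean, rejected

# Hypothetical event-log extract; the second row fails validation (blank activity).
raw = ("case_id,activity,timestamp\n"
       "1,create order,2024-01-01\n"
       "2,,2024-01-02\n"
       "3,ship order,2024-01-03\n")
rows, bad = extract_validate_transform(raw)
```

Tracking the rejected count alongside the clean rows is what makes downstream "monitor and troubleshoot" work possible.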

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Develop reliable backend services and cloud-native applications. Operate Kubernetes with best practices in availability, monitoring, and cost-efficiency. Implement and manage CI/CD pipelines and infrastructure automation. Collaborate with frontend teams. Required Candidate profile: Kubernetes; a cloud platform (GCP, AWS, Azure, or OCI); backend programming (Python, Java, or Kotlin). Strong hands-on experience with Kubernetes, including at least 2 years in production environments.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Data Visualization Software Developer Engineer (5-8 Years Experience) Role Overview: We are looking for a skilled Data Visualization Software Developer Engineer with 6-8 years of experience in developing interactive dashboards and data-driven solutions using Looker and LookerML. The ideal candidate will have expertise in Google Cloud Platform (GCP) and BigQuery and a strong understanding of data visualization best practices. Experience in the media domain (OTT, DTH, Web) will be a plus. Key Responsibilities: Work with BigQuery to create efficient data models and queries for visualization. Develop LookML models, explores, and derived tables to support business intelligence needs. Optimize dashboard performance by implementing best practices in data aggregation and visualization. Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights. Implement security and governance policies within Looker to ensure data integrity and controlled access. Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions. Maintain documentation and provide training to stakeholders on using Looker dashboards effectively. Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints. 
Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs. Understand, audit, and enhance existing LookerML models to ensure data integrity and performance. Build new dashboards and data visualizations based on business requirements and stakeholder input. Collaborate with data engineers to define and validate data pipelines required for dashboard development and ensure the timely availability of clean, structured data. Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance. Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover. Required Skills & Experience: 6-8 years of experience in data visualization and business intelligence using Looker and LookerML. Strong proficiency in writing and optimizing SQL queries, especially for BigQuery. Experience in Google Cloud Platform (GCP), particularly with BigQuery and related data services. Solid understanding of data modeling, ETL processes, and database structures. Familiarity with data governance, security, and access controls in Looker. Strong analytical skills and the ability to translate business requirements into technical solutions. Excellent communication and collaboration skills. Expertise in Looker and LookerML, including Explore creation, Views, and derived tables. Strong SQL skills for data exploration, transformation, and validation. Experience in BI solution lifecycle management (build, test, deploy, maintain). Excellent documentation and stakeholder communication skills for handovers and ongoing alignment. Strong data visualization and storytelling abilities, focusing on user-centric design and clarity. Preferred Qualifications: Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets. Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus.
Experience with Python or scripting languages for automation and data processing. Understanding of machine learning or predictive analytics is an advantage
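The BigQuery/Looker modeling work this posting describes largely comes down to SQL aggregates (the shape of a derived table feeding a dashboard tile). As a rough stand-in, the same GROUP BY pattern can be shown with SQLite; the schema and playback-event data here are hypothetical, and BigQuery's Standard SQL is broadly similar for a query like this:

```python
import sqlite3

# In-memory stand-in for a warehouse table of media playback events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (platform TEXT, user_id INTEGER, watch_minutes REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("OTT", 1, 30.0), ("OTT", 2, 45.0), ("DTH", 3, 10.0), ("OTT", 1, 15.0)],
)

# Derived-table-style aggregate: one row per platform, ready for a dashboard tile.
rows = conn.execute(
    """
    SELECT platform,
           COUNT(DISTINCT user_id) AS users,
           SUM(watch_minutes)      AS total_minutes
    FROM events
    GROUP BY platform
    ORDER BY platform
    """
).fetchall()
```

Pre-aggregating like this, rather than scanning raw events on every dashboard load, is the usual lever for the "optimize dashboard performance" responsibility above.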

Posted 3 weeks ago

Apply

3.0 - 8.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Project Role : Application Tech Support Practitioner Project Role Description : Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge. Must have skills : Microsoft 365, Microsoft PowerShell, Microsoft 365 Security & Compliance Good to have skills : NA. Minimum 3 year(s) of experience is required. Educational Qualification : 15 years full time education Summary : As an Application Tech Support Practitioner, you will serve as the vital link between clients and the systems or applications they utilize. Your typical day will involve engaging with clients to understand their needs, addressing any issues they encounter, and ensuring that our high-quality systems operate seamlessly. You will leverage your exceptional communication skills to provide clarity and support, while also utilizing your in-depth product knowledge to design effective resolutions tailored to client requirements. Your commitment to quality will be evident in every interaction, as you strive to maintain the integrity and performance of our world-class systems. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Facilitate training sessions for team members to enhance their understanding of system functionalities. - Develop and maintain comprehensive documentation for troubleshooting processes and client interactions.
Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft 365, Microsoft PowerShell, Microsoft 365 Security & Compliance.- Strong understanding of cloud-based solutions and their implementation.- Experience with system integration and application support.- Ability to analyze and resolve technical issues efficiently.- Familiarity with security protocols and compliance standards related to Microsoft 365. Additional Information:- The candidate should have minimum 3 years of experience in Microsoft 365.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 3 weeks ago

Apply

3.0 - 8.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Project Role : Application Tech Support Practitioner Project Role Description : Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge. Must have skills : Microsoft 365, Microsoft PowerShell, Microsoft 365 Security & Compliance Good to have skills : NA. Minimum 3 year(s) of experience is required. Educational Qualification : 15 years full time education Summary : As an Application Tech Support Practitioner, you will serve as the vital link between clients and the systems or applications they utilize. Your typical day will involve engaging with clients to understand their needs, addressing their concerns, and ensuring that our world-class systems operate seamlessly. You will leverage your exceptional communication skills to provide clarity and support, while also utilizing your deep product knowledge to design effective resolutions for any issues that arise. Your commitment to quality will be evident in every interaction, as you strive to enhance the client experience and maintain system integrity. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Facilitate training sessions for team members to enhance their understanding of system functionalities. - Develop and maintain comprehensive documentation for processes and procedures to ensure consistency and quality.
Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft 365, Microsoft PowerShell, Microsoft 365 Security & Compliance.- Strong understanding of cloud-based solutions and their implementation.- Experience with troubleshooting and resolving technical issues related to Microsoft 365 applications.- Familiarity with security protocols and compliance measures within Microsoft 365 environments.- Ability to communicate technical information effectively to non-technical stakeholders. Additional Information:- The candidate should have minimum 3 years of experience in Microsoft 365.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 3 weeks ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Gurugram

Work from Office

Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Software License Management Good to have skills : NA. Minimum 3 year(s) of experience is required. Educational Qualification : 15 years full time education Summary : Competent on any 2 Tier 1 publishers (Microsoft, Oracle, IBM, VMware, SAP) and any 2 Tier 2 publishers (Salesforce, Adobe, Quest, Autodesk, Microfocus, Citrix, Veritas, Informatica). Hands-on experience on ServiceNow SAM Pro / Flexera / SNOW SLM. Good understanding of publisher contracts, license metrics, and product use rights. Experience in creation of entitlements, license overview reports, and contracts. Experience in handling software license requests and performing technical validation. Key Responsibilities: Maintain software publisher licensing information for the assigned publishers (i.e., both entitlements and deployments). Analyze software licensing agreements, create entitlement summaries, and summarize use-right information from software agreements. Import licenses and agreements into the SAM tool (SNOW SLM / SAM Pro, Flexera, or others). Update software entitlement and agreement information in the SAM tool. Maintain accurate records of software licenses and related assets, ensuring compliance with licensing agreements and regulations. Develop and implement software license management policies and procedures, ensuring adherence to industry best practices and standards. Maintain software installation records in the SAM tool and perform product normalization. Perform license reconciliation in the SAM tool. Work with internal stakeholders to ensure deployments of software applications are compliant and, if not, work with the stakeholders to remediate non-compliance. Respond to customer queries on software licensing.
Create customized reports and recommendations to report on SAM function activities. Identify cost savings and license re-harvesting opportunities. Drive periodic or ad-hoc stakeholder and project meetings. Technical Experience: Excellent command over software licensing and use-rights information of Tier 1 software publishers (i.e., Microsoft, Oracle, IBM, VMware, Adobe, Citrix, and SAP). Proficient in creating and delivering IBM Sub-Capacity Mainframe ELP reports. Proficient in creating Oracle DB server and Options ELP reports. Performing manual reconciliation and deployment validation as required. Experience working on at least one or more SAM tools (i.e., ServiceNow SAM Pro, Flexera, SNOW License Manager). Professional Attributes: Excellent communication skills. Expert knowledge in MS Office applications (Excel & PowerPoint). Ability to work in a team environment. Must have Skills: Software licensing & Software Asset Management tools. Good to Have Skills: Analytical and communication skills. Candidate should be flexible on doing shifts and coming to office. Educational Qualification: 15 years of full-time education. Desired Certifications: CSAM, CITAM, FlexNet Manager Implementation & Administration, Flexera Certified IT Asset Management Administrator. Qualification 15 years full time education
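License reconciliation, the core task in this posting, is at heart an entitlements-versus-deployments comparison. A SAM tool automates it at scale, but the underlying logic can be sketched as follows (product names and counts are hypothetical):

```python
from collections import Counter

def reconcile(entitlements, deployments):
    """Compare license entitlements against installed counts per product.

    Returns {product: surplus} where a negative surplus means non-compliance
    (more installs than licenses) and a positive surplus marks a
    re-harvesting candidate.
    """
    installed = Counter(deployments)  # count installs observed per product
    products = set(entitlements) | set(installed)
    return {p: entitlements.get(p, 0) - installed.get(p, 0) for p in sorted(products)}

# Hypothetical effective license position: 5 VisioPro and 2 SQLServer licenses owned.
entitlements = {"VisioPro": 5, "SQLServer": 2}
deployments = ["VisioPro", "VisioPro", "SQLServer", "SQLServer", "SQLServer"]
position = reconcile(entitlements, deployments)
```

Here SQLServer shows a shortfall of one license (a compliance gap to remediate), while three unused VisioPro licenses could be re-harvested.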

Posted 3 weeks ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Pune

Work from Office

Role Overview: As a Senior Principal Software Engineer, you will be a key technical leader responsible for shaping the design and development of scalable, reliable, and innovative AI/GenAI solutions. You will lead high-priority projects, set technical direction for teams, and ensure alignment with organizational goals. This role demands a high degree of technical expertise, strategic thinking, and the ability to collaborate effectively across diverse teams while mentoring and elevating others to meet a very high technical bar. Key Responsibilities: Strategic Technical Leadership: Define and drive the technical vision and roadmap for AI/GenAI systems, aligning with company objectives and future growth. Provide architectural leadership for complex, large-scale AI systems, ensuring scalability, performance, and maintainability. Act as a thought leader in AI technologies, influencing cross-functional technical decisions and long-term strategies. Advanced AI Product Development: Lead the development of state-of-the-art generative AI solutions, leveraging advanced techniques such as transformer models, diffusion models, and multi-modal architectures. Drive innovation by exploring and integrating emerging AI technologies and best practices. Mentorship & Team Growth: Mentor senior and junior engineers, fostering a culture of continuous learning and technical excellence. Elevate the team's capabilities through coaching, training, and providing guidance on best practices and complex problem-solving. End-to-End Ownership: Take full ownership of high-impact projects, from ideation and design to implementation, deployment, and monitoring in production. Ensure the successful delivery of projects with a focus on quality, timelines, and alignment with organizational goals. Collaboration & Influence: Collaborate with cross-functional teams, including product managers, data scientists, and engineering leadership, to deliver cohesive and impactful solutions.
Act as a trusted advisor to stakeholders, clearly articulating technical decisions and their business impact. Operational Excellence: Champion best practices for software development, CI/CD, and DevOps, ensuring robust and reliable systems. Monitor and improve the health of deployed services, conducting root cause analyses and driving preventive measures for long-term reliability. Innovation & Continuous Improvement: Advocate for and lead the adoption of new tools, frameworks, and methodologies to enhance team productivity and product capabilities. Stay at the forefront of AI/GenAI research, driving thought leadership and contributing to the AI community through publications or speaking engagements. Minimum Qualifications: Educational Background: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field; Ph.D. is preferred but not required. Experience: 10+ years of professional software development experience, including 5+ years in AI/ML or GenAI. Proven track record of designing and deploying scalable, production-grade AI solutions. Deep expertise in Python and frameworks such as TensorFlow, PyTorch, FastAPI, and LangChain. Advanced knowledge of AI/ML algorithms, generative models, and LLMs. Proficiency with cloud platforms (e.g., GCP, AWS, Azure) and modern DevOps practices. Strong understanding of distributed systems, microservices architecture, and database systems (SQL/NoSQL). Leadership Skills: Demonstrated ability to lead complex technical initiatives, influence cross-functional teams, and mentor engineers at all levels. Problem-Solving Skills: Exceptional analytical and problem-solving skills, with a proven ability to navigate ambiguity and deliver impactful solutions. Collaboration: Excellent communication and interpersonal skills, with the ability to engage and inspire both technical and non-technical stakeholders.
Preferred Qualifications: AI/ML Expertise: Experience with multi-modal models, reinforcement learning, and responsible AI principles. Cloud & Infrastructure: Advanced knowledge of GCP technologies such as Vertex AI, BigQuery, GKE, and Dataflow. Thought Leadership: Contributions to the AI/ML community through publications, open-source projects, or speaking engagements. Agile Experience: Familiarity with agile methodologies and working in a DevOps model. Disability Accommodation: UKGCareers@ukg.com.

Posted 3 weeks ago

Apply

9.0 - 11.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Comp. Applications, Master of Technology, Master of Engineering, Master of Comp. Applications, Master of Science. Service Line: Engineering Services. Responsibilities: Master's degree in Computer Science, Statistics, Mathematics, or a related field. 7+ years of experience in data science and machine learning with a strong focus on model development and deployment. Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference. In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, and decision trees. Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy, and cross-industry thought leadership. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Domain knowledge in two or three industries. Understanding of the financial processes for various types of projects and the various pricing models available. Client-interfacing skills. Knowledge of SDLC and agile methodologies. Project and team management. Technical and Professional: Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques. Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD). Experience with data warehousing and data lake solutions like BigQuery or Snowflake. Familiarity with real-time data processing and streaming analytics. Passion for learning and staying at the forefront of data science and machine learning advancements. Preferred Skills: Technology-Analytics - Techniques-Cluster Analysis; Technology-Analytics - Techniques-Decision Trees; Technology-Analytics - Techniques-Linear Regression; Technology-Machine Learning-Python.

Posted 3 weeks ago

Apply

9.0 - 11.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Comp. Applications, Master of Technology, Master of Engineering, Master of Comp. Applications, Master of Science. Service Line: Engineering Services. Responsibilities: Evaluate the performance of machine learning models and refine them to improve accuracy and generalizability. Communicate data insights to stakeholders in a clear and concise manner, using data visualization techniques and storytelling. Collaborate with data engineers, software developers, and business stakeholders to integrate data science solutions into products and services. Stay up-to-date with the latest trends and developments in data science, machine learning, and artificial intelligence. Additional Responsibilities: Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques. Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD). Experience with data warehousing and data lake solutions like BigQuery or Snowflake. Familiarity with real-time data processing and streaming analytics. Passion for learning and staying at the forefront of data science and machine learning advancements. Technical and Professional: Master's degree in Computer Science, Statistics, Mathematics, or a related field. 7+ years of experience in data science and machine learning with a strong focus on model development and deployment. Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference. In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, decision trees, random forests, xgboost, and ensemble learning. Strong programming skills in Python and proficiency in data science libraries like pandas, scikit-learn, numpy, PyTorch/Keras, and TensorFlow. Experience with cloud computing platforms, particularly Google Cloud Platform (GCP).
Excellent data visualization skills using tools like matplotlib, seaborn, or Tableau. Strong communication and presentation skills, both written and verbal. Preferred Skills: Technology-Analytics - Techniques-Cluster Analysis; Technology-Analytics - Techniques-Decision Trees; Technology-Analytics - Techniques-Linear Regression; Technology-Machine Learning-Python.
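Evaluating model performance, the first responsibility in this posting, typically starts with metrics such as accuracy, precision, and recall. A minimal pure-Python sketch on hypothetical binary predictions (a real workflow would use scikit-learn's metrics module):

```python
def classification_report(y_true, y_pred):
    """Compute accuracy, precision, and recall for binary classifier predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return {
        "accuracy": correct / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

# Hypothetical held-out labels and model predictions.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
report = classification_report(y_true, y_pred)
```

Tracking precision and recall separately, not just accuracy, is what reveals whether refinement is needed on false positives or false negatives.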

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA. Minimum 3 year(s) of experience is required. Educational Qualification : 15 years full time education Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Strong understanding of data modeling and database design principles. - Experience with SQL and data manipulation techniques. - Familiarity with application development frameworks and methodologies. - Ability to troubleshoot and optimize application performance. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Pune office. - A 15 years full time education is required. Qualification 15 years full time education

Posted 3 weeks ago

Apply