
810 BigQuery Jobs - Page 31

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7 - 12 years

9 - 14 Lacs

Bengaluru

Work from Office

Job Title: Data Scientist and Analytics, Level 9 - Consultant (Ind & Func AI Decision Science Consultant, S&C)
Management Level: 09 - Consultant
Location: Bangalore/Gurgaon/Hyderabad/Mumbai
Must-have skills: Technical (Python, SQL, ML, and AI); Functional (Data Science and B2B Analytics, preferably in the Telco and S&P industries)
Good-to-have skills: Gen AI, Agentic AI, cloud (AWS, Azure, GCP)

About Global Network Data & AI: Accenture Strategy & Consulting's Global Network Data & AI practice helps clients grow their business in entirely new ways. Analytics enables clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

About the Comms & Media practice: The Accenture Center for Data and Insights (CDI) team helps businesses integrate data and AI into their operations to drive innovation and business growth by designing and implementing data strategies, generating actionable insights from data, and enabling clients to make informed decisions. In CDI, we leverage AI (predictive and generative), analytics, and automation to build innovative, practical solutions, tools, and capabilities. The team is also building and socializing a Marketplace to democratize data and AI solutions within Accenture and for clients. Globally, the CDI practice works across industries to develop value-growth strategies for its clients and infuses AI and GenAI to help deliver on their top business imperatives, i.e., revenue growth and cost reduction. From multi-year Data & AI transformation projects to shorter, more agile engagements, we have a rapidly expanding portfolio of hyper-growth clients and an increasing footprint with next-gen solutions and industry practices.

Roles & Responsibilities:
- Experienced in analytics in the B2B domain; responsible for helping clients design and deliver AI/ML solutions. Should be strong in the Telco and S&P domains and in AI fundamentals, with good hands-on experience in: working with large data sets and presenting conclusions to key stakeholders; data management using SQL; data manipulation and aggregation using Python; propensity modeling using various ML algorithms; and text mining using NLP/AI techniques.
- Propose solutions to the client, based on gap analysis of existing Telco platforms, that can generate long-term, sustainable value.
- Gather business requirements from client stakeholders via interactions such as interviews and workshops.
- Track down and read all previous information on the problem or issue in question; explore obvious and known avenues thoroughly; ask a series of probing questions to get to the root of a problem.
- Understand the as-is process, identify issues that can be resolved through Data & AI or process solutions, and design the to-be state in detail.
- Understand customer needs and translate them into business requirements (business requirement definition), business process flows, and functional requirements, and inform the best approach to the problem.
- Adopt a clear and systematic approach to complex issues (i.e., A leads to B leads to C); analyze relationships between the parts of a problem or situation; anticipate obstacles and identify a critical path for a project.
- Independently deliver products and services that empower clients to implement effective solutions; make specific changes and improvements to processes or own work to achieve more.
- Work with other team members and make deliberate efforts to keep others up to date.
- Establish a consistent and collaborative presence with clients and act as the primary point of contact for assigned clients; escalate, track, and solve client issues.
- Partner with clients to understand end clients' business goals, marketing objectives, and competitive constraints.
- Storytelling: crunch the data and numbers to craft a story to present to senior client stakeholders.

Professional & Technical Skills:
- Overall 8 years of experience in Data Science; B.Tech in Engineering from a Tier 1 school or MSc in Statistics/Data Science from a Tier 1/Tier 2 school.
- Demonstrated experience solving real-world data problems through Data & AI.
- Direct onsite experience (i.e., client-facing work inside client offices in India or abroad) is mandatory; these are client-facing roles.
- Proficiency in data mining, mathematics, and statistical analysis.
- Advanced pattern recognition and predictive modeling experience; knowledge of advanced analytical fields such as text mining, image recognition, video analytics, and IoT.
- Execution-level understanding of econometric/statistical modeling packages: traditional techniques such as linear/logistic regression, multivariate statistical analysis, time-series techniques, and fixed/random effects modelling; machine learning techniques such as Random Forest, Gradient Boosting, XGBoost, decision trees, and clustering; deep learning techniques such as RNNs and CNNs.
- Experience with digital and statistical modeling software: Python (must), R, PySpark, SQL (must), BigQuery, Vertex AI.
- Proficient in Excel, Word, and PowerPoint, with corporate soft skills; knowledge of dashboarding platforms such as Excel, Tableau, and Power BI.
- Excellent written and oral communication skills, with the ability to clearly communicate ideas and results to non-technical stakeholders.
- Strong analytical and problem-solving skills; self-starter able to work independently across multiple projects and set priorities; strong team player; proactive and solution-oriented, able to guide junior team members.
- Execution knowledge of optimization techniques is a good-to-have: exact optimization (linear and non-linear techniques) and evolutionary optimization (both population- and search-based algorithms).
- Cloud platform certification and experience in computer vision are good-to-haves.

Qualifications: B.Tech in Engineering from a Tier 1 school, or MSc in Statistics/Data Science from a Tier 1/Tier 2 school.
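The propensity modeling mentioned above is usually done with a library such as scikit-learn; as a minimal, dependency-free sketch, logistic regression for conversion propensity can be fit by batch gradient descent (the features and data below are hypothetical):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by batch gradient descent."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            for j in range(n_features):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / len(X)
    return w, b

# Hypothetical B2B features: [recent_engagement, contract_months_left]
X = [[0.9, 1.0], [0.8, 2.0], [0.2, 10.0], [0.1, 12.0]]
y = [1, 1, 0, 0]  # 1 = customer converted on the upsell offer

w, b = fit_logistic(X, y)
score = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 1.5])) + b)
print(f"propensity score: {score:.2f}")
```

In practice the same model would be trained on data pulled via SQL/BigQuery and validated against a holdout set before scores drive any campaign.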

Posted 1 month ago


15 - 20 years

17 - 22 Lacs

Bengaluru

Work from Office

Project Role: Data Insights & Visualization Practitioner
Project Role Description: Create interactive interfaces that enable humans to understand, interpret, and communicate complex data and insights. Wrangle, analyze, and prepare data to ensure delivery of relevant, consistent, timely, and actionable insights. Leverage modern business intelligence, storytelling, and web-based visualization tools to create interactive dashboards, reports, and emerging VIS/BI artifacts. Use and customize (Gen)AI and AI-powered VIS/BI capabilities to enable a dialog with data.
Must-have skills: Data Analytics
Good-to-have skills: NA
Minimum experience: 15 years
Educational Qualification: 15 years full-time education

Summary: We are looking for an Analytics & Insights Lead to help lead and design innovative approaches to viewing and interacting with data for active analysis and reporting. This team member should bring wide experience with, and an understanding of the art of the possible in, analyzing and visualizing data in market-relevant technologies (Qlik, Power BI, BigQuery, PowerApps, Alteryx, etc.) and remain flexible to deploy designs quickly, iterating as required. There is also an opportunity to mentor and develop analytics team members from a people-developer perspective.

Roles & Responsibilities:
- Drive accountability by ensuring high-quality operational monitoring, analysis, and continuous improvement of enforcement and KPIs.
- Work with cross-functional stakeholders to establish shared goals and bring role/scope clarity in a fast-paced and ambiguous environment.
- Use contemporary tools and technology to provide data analytics and insights that increase revenue, grow profitability, and improve the user experience.
- Bring influencing and advisory skills; engage with multiple teams and contribute to key decisions; provide solutions to problems that apply across multiple teams.
- Develop innovative data visualization strategies.
- Collaborate with stakeholders to understand data requirements.
- Implement data visualization best practices.

Professional & Technical Skills:
- Proactive decision making, analytical thinking, and problem-solving skills.
- Strong interpersonal, collaboration, and communication skills; comfortable and effective in a distributed team and remote working environment.
- Must have: proficiency in Data Analytics; experience with data visualization tools such as Tableau, Qlik, Power BI, Alteryx, BigQuery, and PowerApps.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Data Analytics.
- Minimum 2 years of relevant design, development, and deployment experience with Qlik and Power BI (including dashboards, executive summaries, and front-end visualizations).
- Minimum 2 years of data technology experience, which may include architecture/database development and experience with business intelligence tools (such as GCP BigQuery, PowerApps, Alteryx), methodologies, and/or responsibilities.
- This position is based at our Bengaluru office. A 15-year full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago


3 - 8 years

5 - 10 Lacs

Pune

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Google BigQuery
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Perform independently and become an SME; participate actively in team discussions and contribute solutions to work-related problems.
- Collaborate with Integration Architects and Data Architects to design and implement data platform components.
- Ensure seamless integration between various systems and data models.
- Develop and maintain data platform blueprints.
- Implement data governance policies and procedures.
- Conduct performance tuning and optimization of data platform components.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: experience with Google BigQuery.
- Strong understanding of data platform architecture and design principles.
- Hands-on experience implementing data pipelines and ETL processes.
- Proficient in SQL and other query languages.
- Knowledge of cloud platforms such as AWS or Azure.

Additional Information: The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform. This position is based at our Pune office. A 15-year full-time education is required.

Qualifications: 15 years full-time education
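The ETL work this role describes would normally run on Databricks or BigQuery; the extract-transform-load shape itself can be sketched with Python's built-in sqlite3 (table and column names here are hypothetical):

```python
import sqlite3

def run_etl(conn):
    """Extract raw orders, transform (clean + aggregate), load a summary table."""
    cur = conn.cursor()
    # Extract: pull raw rows, applying a basic data-quality filter.
    rows = cur.execute(
        "SELECT customer_id, amount FROM raw_orders WHERE amount IS NOT NULL"
    ).fetchall()
    # Transform: aggregate spend per customer.
    totals = {}
    for customer_id, amount in rows:
        totals[customer_id] = totals.get(customer_id, 0.0) + amount
    # Load: write the aggregate into a reporting table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS customer_spend "
        "(customer_id TEXT PRIMARY KEY, total REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO customer_spend VALUES (?, ?)", totals.items()
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("c1", 10.0), ("c1", 5.0), ("c2", 7.5), ("c3", None)])
run_etl(conn)
print(conn.execute("SELECT * FROM customer_spend ORDER BY customer_id").fetchall())
```

On a real platform the filter, aggregation, and load would each be a pipeline stage with data-quality checks and monitoring around them.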

Posted 1 month ago


5 - 10 years

7 - 12 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language), Google BigQuery
Minimum experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, ensuring they are developed and implemented efficiently and effectively while meeting the needs of the organization. Your typical day will involve collaborating with the team, making team decisions, engaging with multiple teams, and providing solutions to problems for your immediate team and across multiple teams. You will also contribute to key decisions and provide expertise in application development.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team; be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements, ensuring they are developed and implemented efficiently and effectively.
- Contribute expertise in application development.

Professional & Technical Skills:
- Must have: proficiency in PySpark.
- Good to have: experience with Apache Spark, Python, and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 5 years of experience in PySpark. This position is based at our Chennai office. A 15-year full-time education is required.

Qualifications: 15 years full-time education
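The data-munging skills listed above (cleaning, transformation, normalization) boil down to a small, repeatable pattern; a minimal sketch in plain Python, with hypothetical field names:

```python
def clean_and_normalize(records, field):
    """Drop records with a missing value, then min-max normalize one field."""
    # Cleaning: discard rows where the field is absent or None.
    kept = [r for r in records if r.get(field) is not None]
    values = [r[field] for r in kept]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # guard against division by zero on constant columns
    # Normalization: rescale the field into [0, 1].
    for r in kept:
        r[field] = (r[field] - lo) / span
    return kept

rows = [{"id": 1, "score": 10.0}, {"id": 2, "score": None}, {"id": 3, "score": 30.0}]
print(clean_and_normalize(rows, "score"))
```

In PySpark the same steps map onto `dropna`-style filtering and column expressions applied across a distributed DataFrame rather than an in-memory list.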

Posted 1 month ago


7 - 12 years

9 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL), Google BigQuery
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team; be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Conduct code reviews and ensure code quality standards are met.
- Mentor junior team members to enhance their skills.

Professional & Technical Skills:
- Must have: proficiency in Apache Spark.
- Good to have: experience with Oracle PL/SQL and Google BigQuery.
- Strong understanding of distributed computing and parallel processing.
- Experience developing scalable, high-performance applications using Apache Spark.
- Knowledge of data processing frameworks and tools in the big data ecosystem.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Apache Spark. This position is based at our Chennai office. A 15-year full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago


12 - 17 years

14 - 19 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: PySpark, Google BigQuery
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, ensuring they are developed and implemented according to the specified requirements and standards. Your typical day will involve collaborating with the team to understand business needs, designing and developing applications using Apache Spark, configuring applications to meet the required functionality, and testing and debugging applications to ensure their quality and performance.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team; be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions; provide solutions to problems that apply across multiple teams.
- Design and build applications based on business process and application requirements, and configure them to meet the required functionality.
- Collaborate with the team to understand business needs and translate them into technical requirements.
- Test and debug applications to ensure their quality and performance.
- Provide technical guidance and support to the team.
- Stay updated with the latest technologies and trends in application development.
- Identify and resolve issues or challenges in the application development process.
- Ensure that applications are developed and implemented according to the specified requirements and standards.

Professional & Technical Skills:
- Must have: proficiency in Apache Spark, PySpark, and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 12 years of experience in Apache Spark. This position is based at our Gurugram office. A 15-year full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago


12 - 17 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 12 years
Educational Qualification: 15 years full-time education

We are seeking a highly motivated and experienced DevOps Infra Engineer to join our team and manage Kubernetes (K8s) infrastructure. You will be responsible for implementing and maintaining Infrastructure as Code (IaC) using Terraform or equivalent tooling, and for ensuring the smooth deployment and management of our Kubernetes stack in on-prem environments. You will also be instrumental in troubleshooting issues, optimizing infrastructure, and implementing and managing monitoring tools for observability.

Primary skills: Kubernetes, Kubegres, Kubekafka, Grafana, Redis, Prometheus
Secondary skills: Keycloak, MetalLB, Ingress, ElasticSearch, Superset, OpenEBS, Istio, Secrets, Helm, Nussknacker, Velero, Druid

Responsibilities:
- Containerization: working experience with Kubernetes and Docker for containerized application deployments on-prem (GKE/K8s); knowledge of Helm charts and their application in Kubernetes clusters.
- Collaboration and communication: work effectively in a collaborative team environment with developers, operations, and other stakeholders; communicate technical concepts clearly and concisely.
- CI/CD: design and implement CI/CD pipelines using Jenkins, including pipelines, stages, and jobs; utilize Jenkins Pipeline and Groovy scripting for advanced pipeline automation; integrate Terraform with Jenkins for IaC management and infrastructure provisioning.
- Infrastructure as Code (IaC): develop and manage infrastructure using Terraform, including writing tfvars and module code; set up IaC pipelines using Terraform, Jenkins, and cloud environments such as Azure and GCP; troubleshoot issues in Terraform code and ensure smooth infrastructure deployments.
- Cloud platforms: possess a deep understanding of both Google Cloud and Azure; experience managing and automating cloud resources in these environments.
- Monitoring and logging: configure and manage monitoring tools like Splunk, Grafana, and ELK for application and infrastructure health insights.
- GitOps: implement GitOps practices for application and infrastructure configuration management.
- Scripting and automation: proficient in scripting languages like Python and Bash for automating tasks; utilize Ansible or Chef for configuration management.

Qualifications:
- 4-9 years of experience as a Kubernetes & DevOps Engineer or in a similar role, with 12+ years of total experience in cloud and infrastructure managed services.
- Strong understanding of CI/CD principles and practices; proven experience with Jenkins or comparable CI/CD tooling, including pipelines, scripting, and plugins.
- Expertise in Terraform and IaC principles.
- Experience with Kubernetes management on on-prem platforms is preferred.
- Exposure to monitoring and logging tools like Splunk, Grafana, or ELK.
- Experience with GitOps practices.
- Proficiency in scripting languages like Python and Bash.
- Experience with configuration management tools like Ansible or Chef.
- Hands-on experience with Kubernetes and Docker; knowledge of Helm charts and their application in Kubernetes clusters.
- Must: flexible to cover part of US working hours (24/7 business requirement).
- Excellent communication and collaboration skills; fluent in English.
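The "scripting and automation" duty above often amounts to small health-check tools around cluster state; a minimal sketch that filters pod status from JSON shaped like `kubectl get pods -o json` output (the sample data is hypothetical):

```python
import json

def failing_pods(kubectl_json):
    """Return names of pods whose phase is neither Running nor Succeeded."""
    doc = json.loads(kubectl_json)
    return [
        item["metadata"]["name"]
        for item in doc.get("items", [])
        if item["status"]["phase"] not in ("Running", "Succeeded")
    ]

# Hypothetical snippet shaped like `kubectl get pods -o json` output.
sample = json.dumps({"items": [
    {"metadata": {"name": "api-0"}, "status": {"phase": "Running"}},
    {"metadata": {"name": "worker-1"}, "status": {"phase": "Failed"}},
]})
print(failing_pods(sample))
```

A production version would call the Kubernetes API directly and feed the result into an alerting channel rather than printing it.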

Posted 1 month ago


3 - 8 years

3 - 7 Lacs

Bengaluru

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Google Kubernetes Engine
Good-to-have skills: Kubernetes, Google Cloud Compute Services
Minimum experience: 3 years
Educational Qualification: 15 years full-time education

Job Summary: We are seeking a motivated and talented GCP & Kubernetes Engineer to join our growing cloud infrastructure team. This role will be a key contributor in building and maintaining our Kubernetes platform, working closely with architects to design, deploy, and manage cloud-native applications on Google Kubernetes Engine (GKE).

Responsibilities:
- Extensive hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations.
- Demonstrated expertise in operating and managing container orchestration engines such as Docker or Kubernetes.
- Knowledge of or experience with Kubernetes tools such as Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus.
- Proven track record in supporting and deploying various public cloud services.
- Experience building or managing self-service platforms to boost developer productivity.
- Proficiency with Infrastructure as Code (IaC) tools like Terraform.
- Skilled in diagnosing and resolving complex issues in automation and cloud environments.
- Advanced experience architecting and managing highly available, high-performance multi-zonal or multi-regional systems.
- Strong understanding of infrastructure CI/CD pipelines and associated tools.
- Collaborate with internal teams and stakeholders to understand user requirements and implement technical solutions.
- Experience working in GKE and Edge/GDCE environments.
- Assist development teams in building and deploying microservices-based applications in public cloud environments.

Technical Skillset:
- Minimum of 3 years of hands-on experience migrating or deploying GCP cloud-based solutions.
- At least 3 years of experience architecting, implementing, and supporting GCP infrastructure and topologies.
- Over 3 years of experience with GCP IaC, particularly Terraform, including writing and maintaining Terraform configurations and modules.
- Experience deploying container-based systems such as Docker or Kubernetes on both private and public clouds (GCP GKE).
- Familiarity with CI/CD tools (e.g., GitHub) and processes.

Certifications:
- GCP ACE certification is mandatory.
- CKA certification is highly desirable.
- HashiCorp Terraform certification is a significant plus.

Posted 1 month ago


7 - 12 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Kubernetes
Good-to-have skills: Google Kubernetes Engine, Google Cloud Compute Services
Minimum experience: 7.5 years
Educational Qualification: 15 years full-time education

About the Role: We are looking for an experienced Kubernetes Architect to join our growing cloud infrastructure team. This role will be responsible for architecting, designing, and implementing scalable, secure, and highly available cloud-native applications on Kubernetes. You will leverage Kubernetes along with associated technologies like Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build resilient systems that meet both business and technical needs. Google Kubernetes Engine (GKE) will be considered an additional skill. As a Kubernetes Architect, you will play a key role in defining best practices, optimizing the infrastructure, and providing architectural guidance to cross-functional teams.

Key Responsibilities:
- Architect Kubernetes solutions: design and implement scalable, secure, and high-performance Kubernetes clusters.
- Cloud-native application design: collaborate with development teams to design cloud-native applications, ensuring that microservices are properly architected and optimized for Kubernetes environments.
- Kafka management: architect and manage Apache Kafka clusters using Kubekafka, ensuring reliable, real-time data streaming and event-driven architectures.
- Database architecture: use Kubegres to manage high-availability PostgreSQL clusters in Kubernetes, ensuring data consistency, scaling, and automated failover.
- Helm chart development: create, maintain, and optimize Helm charts for consistent deployment and management of applications across Kubernetes environments.
- Ingress and networking: architect and configure Ingress controllers (e.g., NGINX, Traefik) for secure and efficient external access to Kubernetes services, including SSL termination, load balancing, and routing.
- Caching and performance optimization: leverage Redis to design efficient caching and session-management solutions, optimizing application performance.
- Monitoring and observability: lead the implementation of Prometheus for metrics collection and Grafana for real-time monitoring dashboards that visualize the health and performance of infrastructure and applications.
- CI/CD integration: design and implement continuous integration and continuous deployment (CI/CD) pipelines to streamline the deployment of Kubernetes-based applications.
- Security and compliance: ensure Kubernetes clusters follow security best practices, including RBAC, network policies, and proper configuration of secrets management.
- Automation and scripting: develop automation frameworks using tools like Terraform, Helm, and Ansible to ensure repeatable and scalable deployments.
- Capacity planning and cost optimization: optimize resource usage within Kubernetes clusters for both performance and cost-efficiency, utilizing cloud tools and services.
- Leadership and mentorship: provide technical leadership to development, operations, and DevOps teams, offering mentorship, architectural guidance, and best practices.
- Documentation and reporting: produce comprehensive architecture diagrams, design documents, and operational playbooks to ensure knowledge transfer across teams and maintain system reliability.

Required Skills & Experience:
- 10+ years of experience in cloud infrastructure engineering, with at least 5+ years of hands-on experience with Kubernetes.
- Strong expertise in Kubernetes for managing containerized applications in the cloud.
- Experience deploying and managing container-based systems on both private and public clouds (Google Kubernetes Engine).
- Proven experience with Kubekafka for managing Apache Kafka clusters in Kubernetes environments.
- Expertise in managing PostgreSQL clusters with Kubegres and implementing high-availability database solutions.
- In-depth knowledge of Helm for managing Kubernetes applications, including development of custom Helm charts.
- Experience with Ingress controllers (e.g., NGINX, Traefik) for managing external traffic in Kubernetes.
- Hands-on experience with Redis for caching, session management, and as a message broker in Kubernetes environments.
- Advanced knowledge of Prometheus for monitoring and Grafana for visualization and alerting in cloud-native environments.
- Experience with CI/CD pipelines for automated deployment and integration using tools like Jenkins, GitLab CI, or CircleCI.
- Solid understanding of networking, including load balancing, DNS, SSL/TLS, and ingress/egress configurations in Kubernetes.
- Familiarity with Terraform and Ansible for infrastructure automation.
- Deep understanding of security best practices in Kubernetes, such as RBAC, network policies, and secrets management.
- Knowledge of DevSecOps practices to ensure secure application delivery.

Certifications:
- Google Cloud Platform (GCP) certification is mandatory.
- Kubernetes certification (CKA or CKAD) is highly preferred.
- HashiCorp Terraform certification is a significant plus.
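The Redis-backed caching and session management this role calls for is normally done through a client library such as redis-py; the underlying cache-aside-with-TTL pattern can be sketched without any external service (class and key names are illustrative):

```python
import time

class TTLCache:
    """In-memory cache with expiry, mimicking Redis SETEX/GET semantics."""
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def setex(self, key, ttl_seconds, value):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on access
            return None
        return value

cache = TTLCache()

def get_profile(user_id, loader):
    """Cache-aside: try the cache, fall back to the loader, then populate."""
    cached = cache.get(f"profile:{user_id}")
    if cached is not None:
        return cached
    value = loader(user_id)
    cache.setex(f"profile:{user_id}", 60, value)
    return value

calls = []
profile = get_profile(42, lambda uid: calls.append(uid) or {"id": uid})
profile = get_profile(42, lambda uid: calls.append(uid) or {"id": uid})
print(len(calls))  # the loader ran only once; the second read hit the cache
```

With real Redis the same pattern also gives you cross-process sharing and configurable eviction policies, which an in-process dict cannot.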

Posted 1 month ago


3 - 8 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : GCP Dataflow Good to have skills : Google BigQuery Minimum 3 year(s) of experience is required Educational Qualification : Any Graduate Summary :As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems using GCP Dataflow. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing using GCP Dataflow. Create data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. Collaborate with cross-functional teams to identify and resolve data-related issues. Develop and maintain documentation related to data solutions and processes. Stay updated with the latest advancements in data engineering technologies and integrate innovative approaches for sustained competitive advantage. Professional & Technical Skills: Must To Have Skills:Experience in GCP Dataflow. Good To Have Skills:Experience in Google BigQuery. Strong understanding of ETL processes and data migration. Experience in data modeling and database design. Experience in data warehousing and data lake concepts. Experience in programming languages such as Python, Java, or Scala. Additional Information: The candidate should have a minimum of 3 years of experience in GCP Dataflow. 
The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Indore office. Qualification Any Graduate
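The extract-transform-load pattern this role centers on can be sketched without any GCP dependency. The following is a minimal, library-free Python illustration of the three stages (a real Dataflow job would express the same steps as an Apache Beam pipeline); the function and field names are hypothetical.

```python
def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: enforce a data-quality rule and normalise the schema."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # data-quality rule: amount is required
        yield {"id": rec["id"], "amount": round(float(rec["amount"]), 2)}

def load(records, sink):
    """Load: append cleaned records to the target store."""
    for rec in records:
        sink.append(rec)
    return sink

raw = [{"id": 1, "amount": "19.99"}, {"id": 2, "amount": None}, {"id": 3, "amount": "7"}]
warehouse = load(transform(extract(raw)), sink=[])
```

Chaining generators like this mirrors how a Dataflow/Beam pipeline streams elements from stage to stage rather than materialising intermediate datasets.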

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role : Integration Engineer Project Role Description : Provide consultative Business and System Integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, functional/application designs and ensure technology and business solutions represent business requirements. Must have skills : Microsoft 365 Good to have skills : Microsoft PowerShell, Microsoft 365 Security & Compliance Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Integration Engineer, you will provide consultative Business and System Integration services to help clients implement effective solutions. You will understand and translate customer needs into business and technology solutions, drive discussions, consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation/contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement integration solutions for clients. Collaborate with cross-functional teams to ensure successful project delivery. Professional & Technical Skills: Must Have Skills: Proficiency in Microsoft 365. Good To Have Skills: Experience with Microsoft PowerShell, Microsoft 365 Security & Compliance. Strong understanding of cloud-based integration technologies. Experience in designing and implementing scalable integration solutions. Knowledge of API integration and data mapping techniques. Additional Information: The candidate should have a minimum of 3 years of experience in Microsoft 365. This position is based at our Bengaluru office. A 15 years full time education is required.
Qualifications 15 years full time education

Posted 1 month ago

Apply

5 - 10 years

8 - 13 Lacs

Bengaluru

Work from Office

Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : Kubernetes Good to have skills : Google Kubernetes Engine, Google Cloud Compute Services Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Job Summary : We are looking for an experienced Kubernetes Specialist to join our cloud infrastructure team. You will work closely with architects and engineers to design, implement, and optimize cloud-native applications on Google Kubernetes Engine (GKE). This role will focus on providing expertise in Kubernetes, container orchestration, and cloud infrastructure management, ensuring the seamless operation of scalable, secure, and high-performance applications on GKE and other cloud environments. Responsibilities: Kubernetes Implementation: Design, implement, and manage Kubernetes clusters for containerized applications, ensuring high availability and scalability. Cloud-Native Application Design: Work with teams to deploy, scale, and maintain cloud-native applications on Google Kubernetes Engine (GKE). Kubernetes Tools Expertise: Utilize Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus to build and maintain resilient systems. Infrastructure Automation: Develop and implement automation frameworks using Terraform and other tools to streamline Kubernetes deployments and cloud infrastructure management. CI/CD Implementation: Design and maintain CI/CD pipelines to automate deployment and testing for Kubernetes-based applications. Kubernetes Networking & Security: Ensure secure and efficient Kubernetes cluster networking, including Ingress controllers (e.g., NGINX, Traefik), RBAC, and Secrets Management.
Monitoring & Observability: Lead the integration of monitoring solutions using Prometheus for metrics and Grafana for real-time dashboard visualization. Performance Optimization: Optimize resource utilization within GKE clusters, ensuring both performance and cost-efficiency. Collaboration: Collaborate with internal development, operations, and security teams to meet user requirements and implement Kubernetes solutions. Troubleshooting & Issue Resolution: Address complex issues related to containerized applications, Kubernetes clusters, and cloud infrastructure, troubleshooting and resolving them efficiently. Technical Skillset: GCP & Kubernetes Experience: Minimum of 3+ years of hands-on experience in Google Cloud Platform (GCP) and Kubernetes implementations, including GKE. Container Management: Proficiency with container orchestration engines such as Kubernetes and Docker. Kubernetes Tools Knowledge: Experience with Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus for managing Kubernetes-based applications. Infrastructure as Code (IaC): Strong experience with Terraform for automating infrastructure provisioning and management. CI/CD Pipelines: Hands-on experience in building and managing CI/CD pipelines for Kubernetes applications using tools like Jenkins, GitLab, or CircleCI. Security & Networking: Knowledge of Kubernetes networking (DNS, SSL/TLS), security best practices (RBAC, network policies, and Secrets Management), and the use of Ingress controllers (e.g., NGINX). Cloud & DevOps Tools: Familiarity with cloud services and DevOps tools such as GitHub, Jenkins, and Ansible. Monitoring Expertise: In-depth experience with Prometheus and Grafana for operational monitoring, alerting, and creating actionable insights. Certifications: Google Cloud Platform (GCP) Associate Cloud Engineer (ACE) certification is required. Certified Kubernetes Administrator (CKA) is highly preferred.
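One pattern that recurs across the troubleshooting and availability duties above is probing a service until it reports healthy, with exponential backoff between attempts (the same idea behind Kubernetes readiness probes). A minimal Python sketch, with the probe function and delays as illustrative assumptions:

```python
import time

def probe_until_ready(check, attempts=5, base_delay=0.01, sleep=time.sleep):
    """Poll `check()` with exponential backoff, readiness-probe style.

    Returns the 1-based attempt number on success, or raises RuntimeError
    after `attempts` failures. `sleep` is injectable so tests need not wait.
    """
    for attempt in range(1, attempts + 1):
        if check():
            return attempt
        if attempt < attempts:
            sleep(base_delay * (2 ** (attempt - 1)))  # 10 ms, 20 ms, 40 ms, ...
    raise RuntimeError("service never became ready")

# Example: a hypothetical service that comes up on the third probe.
state = {"calls": 0}
def flaky_check():
    state["calls"] += 1
    return state["calls"] >= 3

ready_on = probe_until_ready(flaky_check, sleep=lambda _: None)
```

In a real cluster the `check` would be an HTTP call to the pod's health endpoint, and the kubelet (not application code) drives the polling.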

Posted 1 month ago

Apply

3 - 5 years

5 - 7 Lacs

Jaipur

Work from Office

Skill required: Procure to Pay - Invoice Processing Designation: Procure to Pay Operations Analyst Qualifications: BCom/MCom Years of Experience: 3 to 5 years What would you do? You will be aligned with our Finance Operations vertical and will be helping us determine financial outcomes by collecting operational data/reports and conducting analysis and reconciling transactions; boosting vendor compliance, cutting savings erosion, improving discount capture using preferred suppliers, and confirming pricing and terms prior to payment. Responsible for accounting of goods and services through requisitioning, purchasing, and receiving. Also look after the order sequence of the procurement and financial process end to end. The Accounts Payable Processing team focuses on designing, implementing, managing and supporting accounts payable activities by applying the relevant processes, policies and applications. The team is responsible for timely and accurate billing and processing of invoices, managing purchase and non-purchase orders and two-way and three-way matching of invoices. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors. What are we looking for?
Good verbal and written communication skills. Well versed with the accounts payable cycle, sub-processes, and terminologies. Good understanding of PO vs non-PO invoices. Good understanding of withholding tax treatment in invoice processing. Good understanding of month-end/quarter-end/year-end closing along with accruals and reporting. Good understanding of accounting journal entries. Understanding of employee expense claim processing. Understanding of expense accruals. Vendor reconciliations. Reporting and audit assignments. Working knowledge of MS Office. Problem-solving attitude. Teamwork and coordination. Readiness to work in night shifts. Knowledge of invoice processing tools. Knowledge of current technologies in the PTP domain. Analytical skill. Understanding of RPAs. Roles and Responsibilities: In this role you are required to do analysis and solving of lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. In this role you may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work. Please note that this role may require you to work in rotational shifts. Qualifications: BCom, MCom
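The three-way matching mentioned above compares an invoice against its purchase order and the goods receipt before payment is released. A simplified Python sketch of that check (the field names and tolerance are illustrative assumptions, not any specific AP system's schema):

```python
def three_way_match(invoice, purchase_order, goods_receipt, price_tol=0.01):
    """Three-way match: invoice vs. purchase order vs. goods receipt.

    Returns a list of discrepancy strings; an empty list means the
    invoice can proceed to payment.
    """
    issues = []
    if invoice["po_number"] != purchase_order["po_number"]:
        issues.append("PO number mismatch")
    if invoice["qty"] > goods_receipt["qty_received"]:
        issues.append("billed quantity exceeds quantity received")
    if abs(invoice["unit_price"] - purchase_order["unit_price"]) > price_tol:
        issues.append("unit price differs from PO price")
    return issues

inv = {"po_number": "PO-1001", "qty": 10, "unit_price": 25.00}
po  = {"po_number": "PO-1001", "qty": 10, "unit_price": 25.00}
grn = {"po_number": "PO-1001", "qty_received": 8}
issues = three_way_match(inv, po, grn)
```

Two-way matching is the same idea with the goods-receipt comparison dropped, which is why it carries more payment risk.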

Posted 1 month ago

Apply

12 - 17 years

14 - 19 Lacs

Chennai

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : Apache Spark, Python (Programming Language), Google BigQuery Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are developed and implemented efficiently and effectively, while meeting the needs of the organization. Your typical day will involve collaborating with the team, making team decisions, and engaging with multiple teams to contribute to key decisions. You will also be expected to provide solutions to problems that apply across multiple teams, showcasing your expertise and problem-solving skills. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Ensure efficient and effective development and implementation of applications. Design and build applications to meet business process and application requirements. Contribute to the decision-making process and provide valuable insights. Professional & Technical Skills: Must Have Skills: Proficiency in PySpark. Good To Have Skills: Experience with Apache Spark, Python (Programming Language), Google BigQuery. Strong understanding of statistical analysis and machine learning algorithms. Experience with data visualization tools such as Tableau or Power BI. Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity. Additional Information: The candidate should have a minimum of 12 years of experience in PySpark. This position is based at our Bengaluru office. A 15 years full time education is required. Qualifications: 15 years full time education
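The data-munging skills listed above (cleaning, then normalization) can be illustrated in plain Python without Spark; in a PySpark job the same logic would run per-column over a DataFrame. A minimal sketch using min-max scaling, with the cleaning rules as illustrative assumptions:

```python
def clean_and_normalise(values):
    """Drop null/unparseable entries, then min-max scale to [0, 1].

    A common preprocessing step before the ML algorithms mentioned in
    the listing (regression, decision trees, clustering).
    """
    cleaned = []
    for v in values:
        try:
            cleaned.append(float(v))
        except (TypeError, ValueError):
            continue  # cleaning rule: skip nulls and junk strings
    lo, hi = min(cleaned), max(cleaned)
    if hi == lo:
        return [0.0 for _ in cleaned]  # constant column carries no signal
    return [(v - lo) / (hi - lo) for v in cleaned]

scaled = clean_and_normalise([10, None, "20", "n/a", 40])
```

Scaling matters most for distance-based methods like clustering, where an unscaled feature with a large range would dominate the distance metric.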

Posted 1 month ago

Apply

3 - 5 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Key Responsibilities : 1. Assists with the data platform blueprint and design, encompassing the relevant data platform components. 2. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. 3. Performs data engineering tasks such as data modeling, data pipeline builds, data lake builds, and work with scalable programming frameworks. Technical Experience : 1. Expert in Python and strong hands-on knowledge of SQL (both non-negotiable), including Python programming with Pandas and NumPy, a deep understanding of data structures (dictionary, array, list, tree, etc.), and experience with pytest and code coverage. 2. Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (non-negotiable). 3. Proficient with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence. Professional Attributes : 1. Good communication. 2. Good leadership and team-handling skills. 3. Analytical skills, presentation skills, and the ability to work under pressure. 4. Should be able to work in shifts whenever required. Qualification : 15 years full time education
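The Python data-structure fluency this listing insists on often boils down to expressing SQL-style operations in plain dictionaries and lists. As a sketch, here is a pure-Python equivalent of a BigQuery `SELECT key, SUM(measure) ... GROUP BY key` (the row schema is a hypothetical example); a pytest test would simply assert on the returned dict:

```python
from collections import defaultdict

def group_totals(rows, key, measure):
    """Pure-Python GROUP BY: sum `measure` per distinct value of `key`."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[measure]
    return dict(totals)

rows = [
    {"region": "EU", "sales": 100.0},
    {"region": "US", "sales": 250.0},
    {"region": "EU", "sales": 50.0},
]
totals = group_totals(rows, key="region", measure="sales")
```

Small, deterministic functions like this are also what makes the pytest and code-coverage requirements above practical to satisfy.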

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role : AI / ML Engineer Project Role Description : Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, natural language processing. Must have skills : Google Cloud Machine Learning Services Good to have skills : GCP Dataflow, Google Dataproc, Google Pub/Sub Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Key Responsibilities : A. Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. B. Collaborate with data scientists to deploy machine learning models. C. Ensure the scalability and efficiency of data processing pipelines. Technical Experience : A. Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. B. Hands-on experience with data engineering in a cloud environment. Professional Attributes : A. Strong problem-solving skills in optimizing data workflows. B. Effective collaboration with data science and engineering teams. Qualifications : 15 years full time education
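Pub/Sub, listed above, is topic-based fan-out: every subscription on a topic receives its own copy of each published message. A toy in-memory Python model of that semantics (this mirrors the idea only; it is not the `google-cloud-pubsub` client API, and real Pub/Sub adds acknowledgement, retention, and at-least-once delivery):

```python
from collections import defaultdict

class InMemoryPubSub:
    """Toy topic/subscription model illustrating Pub/Sub-style fan-out."""

    def __init__(self):
        self._subs = defaultdict(list)  # topic -> list of subscriber inboxes

    def subscribe(self, topic):
        """Create a subscription; returns the inbox messages will land in."""
        inbox = []
        self._subs[topic].append(inbox)
        return inbox

    def publish(self, topic, message):
        """Deliver a copy of the message to every subscription on the topic."""
        for inbox in self._subs[topic]:
            inbox.append(message)

bus = InMemoryPubSub()
a = bus.subscribe("clicks")
b = bus.subscribe("clicks")
bus.publish("clicks", {"user": 7})
```

In a Dataflow pipeline, a Pub/Sub subscription like these inboxes is typically the unbounded source feeding the streaming job.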

Posted 1 month ago

Apply

16 - 25 years

18 - 27 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Artificial Intelligence (AI) Designation: AI/ML Computational Science Sr Manager Qualifications: Any Graduation Years of Experience: 16 to 25 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Artificial Intelligence, you will be enhancing business results by using AI tools and techniques to perform tasks such as visual perception, speech recognition, decision-making, and translation between languages that ordinarily require human intelligence. What are we looking for?
Machine learning. Process orientation. Thought leadership. Commitment to quality. Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area(s) of responsibility. The individual should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting strategic direction to establish near-term goals for area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. Should have latitude in decision-making and determination of objectives and approaches to critical assignments. Their decisions have a lasting impact on the area of responsibility, with the potential to impact areas outside of their own responsibility. The individual manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualifications: Any Graduation

Posted 1 month ago

Apply

2 - 4 years

5 - 8 Lacs

Pune

Work from Office

We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines. Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues, and debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understanding of working in a DevOps model.
Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres. Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
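The testing and mocking skills this listing asks for (PyTest, mocking frameworks) usually come down to injecting dependencies so external calls can be faked. A minimal sketch using the stdlib's `unittest.mock`; the `summarise` function and its `llm_client` interface are hypothetical stand-ins, not any specific product's API:

```python
from unittest.mock import Mock

def summarise(document, llm_client):
    """Call an injected model client and post-process its completion.

    Passing `llm_client` in (rather than importing a global SDK) keeps
    the function unit-testable with no network access.
    """
    completion = llm_client.complete(prompt=f"Summarise: {document}")
    return completion.strip()

# In a test, the client is replaced by a Mock so no real model is hit.
fake = Mock()
fake.complete.return_value = "  A short summary.  "
result = summarise("long document text", llm_client=fake)
```

The same pattern scales to Celery tasks and FastAPI handlers: keep the I/O behind an injected object and the business logic stays cheap to test.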

Posted 1 month ago

Apply

12 - 17 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : Apache Spark, Python (Programming Language), Google BigQuery Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead the team in implementing PySpark solutions effectively. Conduct code reviews and ensure adherence to best practices. Provide technical guidance and mentorship to junior team members. Professional & Technical Skills: Must Have Skills: Proficiency in PySpark, Python (Programming Language), Apache Spark, Google BigQuery. Strong understanding of distributed computing and parallel processing. Experience in optimizing PySpark jobs for performance. Knowledge of data processing and transformation techniques. Familiarity with cloud platforms for deploying PySpark applications. Additional Information: The candidate should have a minimum of 12 years of experience in PySpark. This position is based at our Gurugram office. A 15 years full-time education is required. Qualifications: 15 years full time education

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Hyderabad

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Google BigQuery Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : BTech Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze requirements. - Design, develop, and test applications based on business needs. - Troubleshoot and debug applications to ensure optimal performance. - Implement security and data protection measures. - Document technical specifications and user guides. - Stay up-to-date with emerging technologies and industry trends. Professional & Technical Skills: - Must Have Skills: Proficiency in Google BigQuery. - Strong understanding of SQL and database concepts. - Experience with data modeling and schema design. - Knowledge of ETL processes and data integration techniques. - Familiarity with cloud platforms such as Google Cloud Platform. - Good To Have Skills: Experience with data visualization tools such as Tableau or Power BI. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Hyderabad office. - A BTech degree is required. Qualifications: BTech

Posted 1 month ago

Apply

4 - 9 years

16 - 31 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & responsibilities: Execute project-specific development activities in accordance with applicable standards and quality parameters. Develop and review code. Set up the right environment for projects. Ensure delivery within schedule by adhering to engineering and quality standards. Own and deliver end-to-end projects within GCP for the Payments Data Platform. Be available once a month, for a week, on the 24x7 on-call support rota for GCP. Basic knowledge of payments ISO standards, message types, etc. Able to work under pressure on deliverables, P1 violations, and incidents. Should be fluent and clear in communication, written and verbal. Should be able to follow Agile ways of working. Must have hands-on experience in Java and GCP; shell script and Python knowledge a plus. Have in-depth knowledge of Java and Spring Boot. Should have experience in GCP Dataflow, Bigtable, BigQuery, etc. Should have experience managing large databases. Should have worked on requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress). Preferred candidate profile
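A core concern in the event-driven, near-real-time payment patterns mentioned above is idempotency: streams redeliver messages, so each event must be applied at most once. A minimal Python sketch of an idempotent consumer that deduplicates by event id (illustrative only; field names are assumptions, and this is not ISO 20022 message handling itself):

```python
class IdempotentConsumer:
    """Event consumer that applies each event id at most once."""

    def __init__(self):
        self.seen = set()     # event ids already applied
        self.balance = 0.0    # toy aggregate state

    def handle(self, event):
        """Apply the event unless its id was seen before; report whether applied."""
        if event["id"] in self.seen:
            return False      # duplicate delivery: ignore
        self.seen.add(event["id"])
        self.balance += event["amount"]
        return True

consumer = IdempotentConsumer()
applied = [consumer.handle(e) for e in [
    {"id": "evt-1", "amount": 100.0},
    {"id": "evt-2", "amount": -40.0},
    {"id": "evt-1", "amount": 100.0},   # redelivered by the stream
]]
```

In production the `seen` set would live in durable storage (e.g., Bigtable keyed by event id) so deduplication survives restarts.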

Posted 1 month ago

Apply

7 - 12 years

13 - 17 Lacs

Gurugram

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and work directly with customers. Required education: Bachelor's Degree Preferred education: Master's Degree Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge by attending educational workshops and reviewing publications.

Posted 1 month ago

Apply

1 - 3 years

3 - 6 Lacs

Bengaluru

Work from Office

Skill required: Record To Report - Invoice Processing Designation: Record to Report Ops Associate Qualifications: BCom/MCom Years of Experience: 1 to 3 years Language Ability: English (Domestic) - Expert What would you do? You will be aligned with our Finance Operations vertical and will be helping us determine financial outcomes by collecting operational data/reports and conducting analysis and reconciling transactions: posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, and preparing reports and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors. What are we looking for? Google Cloud SQL. Adaptable and flexible. Ability to perform under pressure. Problem-solving skills. Agility for quick learning. Commitment to quality. Roles and Responsibilities: In this role you are required to solve routine problems, largely through precedent and referral to general guidelines. Your expected interactions are within your own team and with your direct supervisor. You will be provided a detailed to moderate level of instruction on daily work tasks and detailed instruction on new assignments. The decisions that you make impact your own work. You will be an individual contributor as part of a team, with a predetermined, focused scope of work. Please note that this role may require you to work in rotational shifts. Qualification: BCom, MCom

Posted 1 month ago

Apply

7 - 10 years

16 - 21 Lacs

Mumbai

Work from Office

Position Overview: The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms. Key Responsibilities: Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP. Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage. Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow). Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality. Define and enforce data engineering best practices including version control, testing, code reviews, and documentation. Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources. Implement and monitor data quality, lineage, and governance frameworks across the data platform. Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools. Mentor team members and contribute to the growth of technical capabilities across the organization. Qualifications: Education : Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field. Experience : 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams. Skills : Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. 
Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments. Certifications : GCP Professional Data Engineer certification.

Posted 1 month ago

Apply

4 - 9 years

10 - 14 Lacs

Pune

Hybrid

Job Description - Technical Skills: Top skills for this position are: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS). Data warehousing knowledge. Hands-on experience in the Python language and SQL databases. Analytical technical skills to be able to predict the consequences of configuration changes (impact analysis), to identify root causes that are not obvious, and to understand the business requirements. Excellent communication with different stakeholders (business, technical, project). Good understanding of the overall Big Data and Data Science ecosystem. Experience with building and deploying containers as services using Swarm/Kubernetes. Good understanding of container concepts like building lean and secure images. Understanding of modern DevOps pipelines. Experience with stream data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer certification. Roles and Responsibilities: Design, build & manage Big Data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies. Manage the development life-cycle for agile software development projects. Convert proofs of concept into industrialized machine learning models (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning, and understanding of dependencies. Develop & implement robust solutions in test & production environments.
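The stream-pipeline experience this role asks for usually involves windowed aggregation: assigning timestamped events to fixed buckets and aggregating per bucket. Here is the tumbling-window idea in plain Python (the bucketing arithmetic is the point; Beam/Dataflow and Spark Streaming each wrap the same idea in their own windowing APIs, which this sketch does not reproduce):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count timestamped events per fixed-size (tumbling) window.

    Each event is a (timestamp_seconds, payload) pair; the window key is
    the window's start time, computed by integer-dividing the timestamp.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // window_secs) * window_secs
        counts[window_start] += 1
    return dict(counts)

# Events at t=3s and t=59s share the first 60 s window; t=61s and t=125s
# fall into the second and third windows respectively.
events = [(3, "a"), (59, "b"), (61, "c"), (125, "d")]
counts = tumbling_window_counts(events, window_secs=60)
```

Real streaming engines add what this sketch omits: event-time vs. processing-time handling, watermarks for late data, and triggers for emitting partial results.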

Posted 1 month ago

Apply