
721 BigQuery Jobs - Page 9

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific
An expert on the principles and practices associated with data platform engineering, particularly within cloud environments, with demonstrated proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass:
Team Leadership and Management: Supervising a team of platform engineers, with a focus on team dynamics and the efficient delivery of cloud platform solutions.
Technical Guidance and Decision-Making: Providing technical leadership and making pivotal decisions concerning platform architecture, tools, and processes, balancing hands-on involvement with strategic oversight.
Mentorship and Skill Development: Guiding team members through mentorship, enhancing their technical proficiencies, and nurturing a culture of continual learning and innovation in platform engineering practices.
In-Depth Technical Proficiency: Possessing a comprehensive understanding of platform engineering principles and practices, and demonstrating expertise in crucial technical areas such as cloud services, automation, and system architecture.
Community Contribution: Making significant contributions to the development of the platform engineering community, staying informed about emerging trends, and applying this knowledge to drive enhancements in capability.

Posted 1 week ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Bengaluru

Work from Office


A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific
A strong grasp of the principles and practices associated with data platform engineering, particularly within cloud environments, with demonstrated proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass:
Community Engagement: Actively participating in the professional data platform engineering community, sharing insights, and staying up to date with the latest trends and best practices.
Project Contributions: Making substantial contributions to client delivery, particularly in the design, construction, and maintenance of cloud-based data platforms and infrastructure.
Technical Expertise: Demonstrating a sound understanding of data platform engineering principles and knowledge in areas such as cloud data storage solutions (e.g., AWS S3, Azure Data Lake), data processing frameworks (e.g., Apache Spark), and data orchestration tools.
Independent Work and Initiative: Taking ownership of independent tasks, displaying initiative and problem-solving skills when confronted with intricate data platform engineering challenges.
Emerging Leadership: Commencing leadership roles, which may encompass mentoring junior engineers, leading smaller project teams, or taking the lead on specific aspects of data platform projects.

Posted 1 week ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Guwahati

Work from Office


Job Summary: Exciting job opportunity as a Registered Nurse in Qatar (Homecare)

Key Responsibilities:
Develop and assess nursing care plans
Monitor vital signs and assess holistic patient needs
Collaborate with physicians, staff nurses, and healthcare team members
Administer oral and subcutaneous medications while ensuring safety
Document nursing care, medications, and procedures using the company's Nurses Buddy application
Conduct client assessment and reassessment using approved tools
Attend refresher training courses, seminars, and training

Timeline for Migration:
Application to selection: not more than 5 days
DataFlow & Prometric: 1 month
Visa processing: 1-2 months
Start working in Qatar within 3 months!

Requirements:
Educational qualification: Bachelor's Degree in Nursing or GNM
Experience: minimum 2 years' working experience as a nurse post registration
Citizenship: Indian
Age limit: 18 to 40 years
Certification: registration certificate from the Nursing Council
Language: basic English proficiency required
Technical skills: bedside nursing, patient care, patient assessment and monitoring

Benefits:
High salary & perks: earn 5,000 QAR/month (1,18,000 INR/month)
Tax benefit: no tax deduction on salary
Career growth: advance your nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialisation
Relocation support: visa process and flight sponsored; free accommodation and transportation provided
International work experience: boost your resume with international healthcare expertise
Comprehensive health insurance: medical coverage under Qatar's healthcare system
Safe and stable environment: Qatar is known for its low crime rate, political stability, and high quality of life; its strict laws make it one of the safest places to live
Faster visa processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times
Simplified licensing process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through the QCHP (Qatar Council for Healthcare Practitioners)
Direct hiring opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications

Limited slots available! Apply now to secure your place in the next batch of nurses migrating to Qatar!

Posted 1 week ago

Apply

8.0 - 13.0 years

27 - 42 Lacs

Kolkata, Hyderabad, Pune

Work from Office


About Client: Hiring for one of the most prestigious multinational corporations.

Job Title: Senior GCP Data Engineer
Experience: 8 to 13 years

Key Responsibilities:
Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP).
Develop ETL/ELT workflows using Cloud Dataflow, Apache Beam, Dataproc, BigQuery, and Cloud Composer (Airflow).
Optimize performance of data processing and storage solutions (e.g., BigQuery, Cloud Storage).
Collaborate with data analysts, data scientists, and business stakeholders to deliver data-driven insights.
Design and implement data lake and data warehouse solutions following best practices.
Ensure data quality, security, and governance across GCP environments.
Implement CI/CD pipelines for data engineering workflows using tools like Cloud Build, GitLab CI, or Jenkins.
Monitor and troubleshoot data jobs, ensuring reliability and timeliness of data delivery.
Mentor junior engineers and participate in architectural design discussions.

Technical Skills:
Strong experience with Google Cloud Platform (GCP) data services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Functions
Proficiency in Python and/or Java for data processing
Strong knowledge of SQL and performance tuning in large-scale environments
Hands-on experience with Apache Beam, Apache Spark, and Airflow
Solid understanding of data modeling, data warehousing, and streaming/batch processing
Experience with CI/CD, Git, and modern DevOps practices for data workflows
Familiarity with data security and compliance in cloud environments

Notice period: only immediate and 15-day joiners
Location: Pune, Chennai, Hyderabad, Kolkata
Mode of work: WFO (Work From Office)

Thanks & Regards,
SWETHA
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, INDIA
Contact number: 8067432433
rathy@blackwhite.in | www.blackwhite.in
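The ETL/ELT responsibilities above all follow the same extract → transform → load shape. As a rough, stdlib-only sketch of that shape (every name below is invented for illustration; a real pipeline for this role would express these stages as Apache Beam transforms on Dataflow writing to BigQuery):

```python
# Illustrative extract -> transform -> load sketch. Functions and field
# names are hypothetical; a production GCP pipeline would use Apache Beam
# on Dataflow with the BigQuery client instead of plain functions.

def extract(rows):
    """Stand-in for a source read (e.g. Pub/Sub or Cloud Storage)."""
    yield from rows

def transform(events):
    """Drop malformed records and normalise fields, like a Beam ParDo."""
    for event in events:
        if event.get("user_id") is None:
            continue  # data-quality rule: skip records with no user
        yield {"user_id": event["user_id"], "amount": float(event["amount"])}

def load(records, sink):
    """Stand-in for a warehouse write (e.g. a BigQuery load job)."""
    sink.extend(records)
    return len(sink)

raw = [{"user_id": 1, "amount": "19.99"}, {"user_id": None, "amount": "3"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)  # one valid record survives
```

Chaining generators like this mirrors how streaming pipelines process records one at a time rather than materialising intermediate datasets.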

Posted 1 week ago

Apply

6.0 - 10.0 years

10 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Job Description: GCP Data Engineer with BigQuery

Experience required: 5+ years
Location: PAN India (Hybrid)
Notice period: immediate
Mandatory skills (only 2 or 3): GCP BigQuery with strong SQL experience

Detailed job description - skill set:
Designing, developing, and maintaining data models and transformations using DBT (Data Build Tool) to ensure efficient and accurate data consumption for analytics and reporting
Managing, monitoring, and ensuring the security and privacy of data to satisfy business needs
Knowledge of PowerShell scripting
Creation of DAGs to migrate logic from GCP BigQuery to DBT
Designing, building, and optimizing data solutions on the GCP BigQuery platform, including creating ETL pipelines
Unit and integration testing activities
Strong hands-on exposure to GCP BigQuery
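For context on the "migrate logic from BigQuery to DBT" item: a DBT model is essentially a version-controlled SELECT statement that the tool materialises as a table or view in the warehouse. A toy sketch of that idea, with Python's built-in sqlite3 standing in for BigQuery and all table/column names invented:

```python
import sqlite3

# sqlite3 stands in for BigQuery here; in a real project the query below
# would live in a DBT model file and run against the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, status TEXT, total REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "paid", 20.0), (2, "cancelled", 5.0), (3, "paid", 15.0)],
)

# Hypothetical DBT model, e.g. models/paid_orders.sql
model_sql = """
    SELECT status, COUNT(*) AS n_orders, SUM(total) AS revenue
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY status
"""
row = conn.execute(model_sql).fetchone()
```

Migrating logic "from BigQuery to DBT" mostly means moving such queries out of ad hoc scheduled jobs and into DBT's dependency-managed, tested model files.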

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Remote


Canterr is looking for talented and passionate professionals for exciting opportunities with a US-based MNC product company! You will be employed permanently by Canterr and deployed to a top-tier global tech client.

Key Responsibilities:
Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data.
Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, CloudSQL, and Pub/Sub.
Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions.
Monitor and troubleshoot data pipeline issues and implement solutions to prevent future occurrences.

Required Skills and Experience (Google Cloud Platform is used for all software deployed at Wayfair):
Data storage and processing: BigQuery, CloudSQL, PostgreSQL, Dataproc, Pub/Sub
Data modeling: breaking business requirements (KPIs) down into data points; building scalable data models
ETL tools: DBT, SQL
Data orchestration and ETL: Dataflow, Cloud Composer
Infrastructure and deployment: Kubernetes, Helm
Data access and management: Looker, Terraform

Ideal business domain experience:
Supply chain or warehousing: the project is focused on building a normalized data layer which ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis.

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 50 Lacs

Bengaluru

Hybrid


Data Architect with cloud expertise across data architecture, data integration, and data engineering.
ETL/ELT: Talend, Informatica, Apache NiFi
Big data: Hadoop, Spark
Cloud platforms: AWS, Azure, GCP; warehouses: Redshift, BigQuery
Languages: Python, SQL, Scala
Compliance: GDPR, CCPA

Posted 1 week ago

Apply

5.0 - 9.0 years

14 - 17 Lacs

Pune

Work from Office


Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with DBT, GitHub, Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications:
B.E./B.Tech in Computer Science, IT, or related discipline
MCS/MCA or equivalent preferred

Key Responsibilities:
Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions
Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery
Define and implement best practices for data modeling, integration, governance, and security
Collaborate with engineering and analytics teams to ensure data solutions meet business needs
Lead development using tools such as DBT, Airflow, and GitHub for orchestration and version control
Troubleshoot data issues and ensure system performance, reliability, and scalability
Guide and mentor junior data engineers and developers
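Orchestration tools like Airflow, named above, ultimately run tasks in dependency order over a DAG. As a rough illustration of that scheduling core (task names are invented), the standard library's graphlib can compute a valid run order:

```python
from graphlib import TopologicalSorter

# A DAG maps each task to the set of tasks it depends on. Airflow-style
# orchestration executes tasks in a topological order of this graph.
# All task names here are hypothetical.
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_users"},
    "extract_orders": set(),
    "extract_users": set(),
}
run_order = list(TopologicalSorter(dag).static_order())
```

In `run_order`, both extract tasks come before "transform", which comes before "load_warehouse"; real orchestrators add scheduling, retries, and parallelism on top of exactly this ordering.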

Posted 1 week ago

Apply

9.0 - 14.0 years

25 - 40 Lacs

Noida

Hybrid


Experience Aplenty: 8+ years of hands-on experience in applicable software development environments, showcasing your prowess and ability to excel.
Educational Symphony: A Bachelor's degree is strongly preferred, demonstrating your commitment to continuous learning and growth.
Tech Savvy: Demonstrated experience in cloud environments such as AWS, GCP, or Azure. Comparable knowledge of tools like Azure Pipelines, BigQuery, MFT, Vault, SSIS, SSRS, SQL, and Google Dataflow. Workflow management and orchestration tools such as Airflow. Experience with object-oriented scripting languages including Java and Python. Working knowledge of Snowflake and Google Dataflow is a definite plus!
Business Acumen: Translate business needs into technical requirements with finesse, showcasing your ability to balance technical excellence with customer satisfaction.
Team Player: Collaborate seamlessly with the team, responding to requests in a timely manner, meeting individual commitments, and contributing to the collective success.
Mentor Extraordinaire: Leverage your coaching and teaching skills to guide and mentor your fellow team members, fostering an environment of continuous improvement.

Please also apply here: https://corelogic.wd5.myworkdayjobs.com/ACQ/job/Noida-Uttar-Pradesh/Lead-Data-Engineer_REQ15827

Posted 1 week ago

Apply

15.0 - 20.0 years

100 - 200 Lacs

Bengaluru

Hybrid


What You'll Do:
Play a key role in developing and driving a multi-year technology strategy for a complex platform
Directly and indirectly manage several senior software engineers (architects) and managers by providing coaching, guidance, and mentorship to grow the team as well as individuals
Lead multiple software development teams - architecting solutions at scale to empower the business, and owning all aspects of the SDLC: design, build, deliver, and maintain
Inspire, coach, mentor, and support your team members in their day-to-day work and their long-term professional growth
Attract, onboard, develop, and retain diverse top talent, while fostering an inclusive and collaborative team and culture (our latest DEI Report)
Lead your team and peers by example; as a senior member of the team, your methodologies, technical and operational excellence practices, and system designs will help to continuously improve our domain
Identify, propose, and drive initiatives to advance the technical skills, standards, practices, architecture, and documentation of our engineering teams
Facilitate technical debate and decision-making with an appreciation for trade-offs
Continuously rethink and push the status quo, even when it challenges your/our established ideas

Preferred candidate profile:
Results-oriented, collaborative, pragmatic, and continuous-improvement mindset
Hands-on experience driving software transformations within high-growth environments (think complex, cross-continentally owned products)
15+ years of experience in engineering, of which at least 10 years spent leading highly performant teams and their managers (please note that a minimum of 5 years leading fully fledged managers is required)
Experience making architectural and design-related decisions for large-scale platforms, understanding the trade-offs between time-to-market and flexibility
Significant experience and vocation in managing and enabling people's growth and performance
Experience designing and building high-scale generalizable products with outstanding user experience
Practical experience in hiring and developing engineering teams and culture, and leading interdisciplinary teams in a fast-paced agile environment
Capability to communicate and collaborate across the wider organization, influencing decisions with and without direct authority, always with inclusive, adaptable, and persuasive communication
Analytical and decision-making skills that integrate technical and business requirements

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


Job Summary: This position provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. He/she directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations, guides teams to ensure effective communication and achievement of objectives, researches and supports the integration of emerging technologies, and provides knowledge and support for applications development, integration, and maintenance. He/she leads junior team members in project-related activities and tasks, guides and influences department and project teams, and facilitates collaboration with stakeholders.

Responsibilities:
GCP services (BigQuery, GKE, Spanner, Cloud Run, Dataflow, etc.), Angular, Java (REST APIs), SQL, Python, Terraform, Azure DevOps CI/CD pipelines
Leads systems analysis and design.
Leads design and development of applications.
Develops and ensures creation of application documents.
Defines and produces integration builds.
Monitors emerging technology trends.
Leads maintenance and support.

Primary Skills:
Minimum 5 years Java-Spring Boot/J2EE (full stack developer)
Minimum 2 years on the GCP platform (Cloud Pub/Sub, GKE, BigQuery); experience with Bigtable and Spanner is a plus
Experience working in an Agile environment, with CI/CD

Qualifications:
Bachelor's degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred

Posted 1 week ago

Apply

3.0 - 6.0 years

0 - 0 Lacs

Chennai

Work from Office


Job Description: We are seeking a highly skilled Data Engineer with a minimum of 3 years of experience in analytical thinking and working with complex data sets. The ideal candidate will have a strong background in SQL, BigQuery, and Google Cloud Platform (GCP), with hands-on experience in developing reports and dashboards using Looker Studio, Looker Standard, and LookML. Excellent communication skills and the ability to work collaboratively with cross-functional teams are essential for success in this role.

Key Responsibilities:
Design, develop, and maintain dashboards and reports using Looker Studio and Looker Standard.
Develop and maintain LookML models, explores, and views to support business reporting requirements.
Optimize and write advanced SQL queries for data extraction, transformation, and analysis.
Work with BigQuery as the primary data warehouse for managing and analyzing large datasets.
Collaborate with business stakeholders to understand data requirements and translate them into scalable reporting solutions.
Implement data governance, access controls, and performance optimizations within the Looker environment.
Perform root-cause analysis and troubleshooting for reporting and data issues.
Maintain documentation for Looker projects, data models, and data dictionaries.
Stay updated with the latest Looker and GCP features and best practices.

Qualifications:
Minimum of 3 years of experience in data analysis, with a focus on analytical thinking and working with complex data sets.
Proven experience in creating data stories and understanding metric and channel trends.
Strong proficiency in SQL, BigQuery, and Google Cloud Platform (GCP).
Hands-on experience with Looker Studio, Looker Standard, and LookML.
Excellent communication skills and the ability to work collaboratively with cross-functional teams.
Strong problem-solving skills and attention to detail.
Ability to manage multiple tasks and projects simultaneously.

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office


Role: Site Reliability Engineer (SRE) for Google Cloud Platform (GCP) and Red Hat OpenShift administration.

Responsibilities:
System reliability: Ensure the reliability and uptime of critical services and infrastructure.
Google Cloud expertise: Design, implement, and manage cloud infrastructure using Google Cloud services.
Automation: Develop and maintain automation scripts and tools to improve system efficiency and reduce manual intervention.
Monitoring and incident response: Implement monitoring solutions and respond to incidents to minimize downtime and ensure quick recovery.
Collaboration: Work closely with development and operations teams to improve system reliability and performance.
Capacity planning: Conduct capacity planning and performance tuning to ensure systems can handle future growth.
Documentation: Create and maintain comprehensive documentation for system configurations, processes, and procedures.

Qualifications:
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 4+ years of experience in site reliability engineering or a similar role.
Skills:
Proficiency in Google Cloud services (Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, etc.)
Familiarity with Google BI and AI/ML tools (Looker, BigQuery ML, Vertex AI, etc.)
Experience with automation tools (Terraform, Ansible, Puppet)
Familiarity with CI/CD pipelines and tools (Azure Pipelines, Jenkins, GitLab CI, etc.)
Strong scripting skills (Python, Bash, etc.)
Knowledge of networking concepts and protocols
Experience with monitoring tools (Prometheus, Grafana, etc.)

Preferred Certifications:
Google Cloud Professional DevOps Engineer
Google Cloud Professional Cloud Architect
Red Hat Certified Engineer (RHCE) or similar Linux certification
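As a flavour of the automation scripting this role calls for, here is a small, self-contained Python sketch of retrying a flaky health check with exponential backoff; the failing "probe" and its failure mode are invented stand-ins for a real check against GCP or OpenShift.

```python
import time

# Retry a flaky operation with exponential backoff: a common SRE pattern
# for transient failures. The probe below is a hypothetical health check.
def retry(op, attempts=4, base_delay=0.01):
    delays = []
    for i in range(attempts):
        try:
            return op(), delays
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the failure
            delay = base_delay * (2 ** i)  # doubles each round
            delays.append(delay)
            time.sleep(delay)

calls = {"n": 0}
def probe():
    """Hypothetical health check that fails twice, then recovers."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("unhealthy")
    return "ok"

result, delays = retry(probe)
```

Production versions add jitter to the delay and cap it, so many retrying clients don't synchronise and hammer a recovering service at once.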

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad

Work from Office


At F5, we strive to bring a better digital world to life. Our teams empower organizations across the globe to create, secure, and run applications that enhance how we experience our evolving digital world. We are passionate about cybersecurity, from protecting consumers from fraud to enabling companies to focus on innovation.

Job Summary: We are seeking a skilled and driven Data Analyst - Business Intelligence to join our global Services organization, supporting Customer Success and Renewals. This role is essential to enabling data-driven decision-making across a worldwide team by transforming complex, multi-source datasets into strategic insights. The ideal candidate will bring 5+ years of experience in data analysis, reporting, and business intelligence, with a demonstrated ability to work with large, complex datasets from diverse repositories. This individual will proactively identify data gaps, propose and implement solutions, and synthesize improved data with industry knowledge to deliver high-impact recommendations to business leaders. Success in this role means accelerating decision-making, improving operational efficiency, and uncovering opportunities that drive customer satisfaction, revenue retention, and long-term growth.

Key Responsibilities:
Analyze global Services Renewals data to uncover trends, forecast performance, and support revenue optimization strategies.
Design, build, and maintain dashboards and reports that surface key performance indicators (KPIs) related to renewals, churn, upsell, and customer retention.
Collaborate cross-functionally with Renewals, Sales, Customer Success, and Finance teams to deliver insights that improve forecasting accuracy and operational execution.
Manage an intake queue for ad hoc and strategic data requests, partnering with business leaders to clarify needs, propose analytical approaches, and drive solutions through to delivery.
Support weekly and quarterly business reviews by delivering timely, accurate reporting and insight packages that inform executive decision-making.
Work with large, complex datasets from multiple systems, ensuring data integrity, consistency, and usability across platforms.
Proactively identify data gaps and quality issues, propose solutions, and lead remediation efforts to enhance analytical accuracy and business impact.
Continuously explore data to uncover new opportunities, develop hypotheses, and recommend strategies that improve customer retention and revenue performance.
Leverage BI tools (e.g., Power BI, Tableau, Looker) and SQL to automate reporting, streamline workflows, and scale analytics capabilities.
Contribute to the development and refinement of predictive models that assess customer renewal behavior and risk indicators.
Identify opportunities to apply Artificial Intelligence (AI) and machine learning tools to enhance forecasting, automate insights, and optimize customer success strategies.
Stay current on emerging AI technologies and proactively recommend innovative solutions that improve analytical efficiency, insight generation, and strategic decision-making.

Skills / Knowledge / Abilities:
Advanced proficiency in SQL and data visualization tools such as Power BI, Tableau, and Looker, with the ability to build scalable, user-friendly dashboards.
Proven experience extracting, transforming, and analyzing large, complex datasets from multiple systems, ensuring data quality and consistency.
Strong analytical thinking and problem-solving skills, with a proactive mindset for uncovering insights and driving business outcomes.
Demonstrated ability to build and apply predictive models to assess customer behavior, renewal likelihood, and churn risk, using statistical or machine learning techniques.
Ability to translate data into strategic recommendations, combining analytical rigor with business acumen and industry context.
Experience supporting Customer Success, Renewals, or subscription-based business models; familiarity with churn, retention, and upsell analytics is highly preferred.
Effective communicator with the ability to present insights clearly to both technical and non-technical stakeholders, including senior leadership.
Skilled in managing multiple priorities in a fast-paced, cross-functional environment, with a strong sense of ownership and accountability.
Familiarity with CRM and ERP systems such as Salesforce, Oracle, or SAP.
Working knowledge of data warehousing and cloud platforms (e.g., Snowflake, BigQuery, Azure).
Ability to identify and apply AI and machine learning tools to enhance forecasting, automate insights, and improve strategic decision-making.

Qualifications:
Bachelor's degree in Business, Data Analytics, Statistics, Computer Science, or a related field.
5+ years of relevant experience in data analytics, preferably in a services, subscription, or renewals-focused environment.

This description is intended to be a general representation of the responsibilities and requirements of the job. However, it may not be all-inclusive, and responsibilities and requirements are subject to change. Please note that F5 only contacts candidates through an F5 email address (ending with @f5.com) or auto email notifications from Workday (ending with f5.com or @myworkday.com).

Equal Employment Opportunity: It is the policy of F5 to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. F5 offers a variety of reasonable accommodations for candidates. Requesting an accommodation is completely voluntary. F5 will assess the need for accommodations in the application process separately from those that may be needed to perform the job. Request one by contacting accommodations@f5.com.
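Two of the core KPIs this F5 role reports on are the renewal rate and its complement, the churn rate. A toy, stdlib-only illustration with invented contract records (real inputs would come from CRM or warehouse queries):

```python
# Hypothetical contract records; fields are invented for illustration.
# Renewal rate = renewed contracts / contracts that were up for renewal.
contracts = [
    {"id": "c1", "up_for_renewal": True,  "renewed": True},
    {"id": "c2", "up_for_renewal": True,  "renewed": False},
    {"id": "c3", "up_for_renewal": False, "renewed": False},  # not yet due
    {"id": "c4", "up_for_renewal": True,  "renewed": True},
]

due = [c for c in contracts if c["up_for_renewal"]]  # denominator: 3 contracts
renewal_rate = sum(c["renewed"] for c in due) / len(due)
churn_rate = 1 - renewal_rate
```

Note the denominator deliberately excludes contracts not yet up for renewal; including them is a common way these metrics get silently inflated.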

Posted 1 week ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office


The Manager, Software Development Engineering leads a team of technical experts in successfully executing technology projects and solutions that align with the strategy and have broad business impact. The Manager, Software Development Engineering will work closely with development teams to identify and understand key features and their underlying functionality while also partnering closely with Product Management and UX Design. They may exercise influence and govern overall end-to-end software development life cycle related activities including management of support and maintenance releases, minor functional releases, and major projects. The Manager, Software Development Engineering will lead & provide technical guidance for process improvement programs while leveraging engineering best practices. In this people leadership role, Managers will recruit, train, motivate, coach, grow and develop Software Development Engineer team members at a variety of levels through their technical expertise and providing continuous feedback to ensure employee expectations, customer needs and product demands are met. About the Role: Lead and manage a team of engineers, providing mentorship and fostering a collaborative environment. Design, implement, and maintain scalable data pipelines and systems to support business analytics and data science initiatives. Collaborate with cross-functional teams to understand data requirements and ensure data solutions align with business goals. Ensure data quality, integrity, and security across all data processes and systems. Drive the adoption of best practices in data engineering, including coding standards, testing, and automation. Evaluate and integrate new technologies and tools to enhance data processing and analytics capabilities. Prepare and present reports on engineering activities, metrics, and project progress to stakeholders. About You: Proficiency in programming languages such as Python, Java, or Scala. 
Data Engineering with API & any programming language. Strong understanding of APIs and possess forward-looking knowledge of AI/ML tools or models and need to have some knowledge on software architecture. Experience with cloud platforms (e.g., AWS,Google Cloud) and big data technologies (e.g., Hadoop, Spark). Experience with Rest/Odata API's Strong problem-solving skills and the ability to work in a fast-paced environment. Excellent communication and interpersonal skills. Experience with data warehousing solutions such as BigQuery or snowflakes Familiarity with data visualization tools and techniques. Understanding of machine learning concepts and frameworks. #LI-AD2 Whats in it For You Hybrid Work Model Weve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected. Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance. Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrows challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future. Industry Competitive Benefits We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing. 
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.

Posted 1 week ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Noida

Work from Office

Naukri logo

Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them into the mainstream economy with the help of technology.
About the team: The SAP Support team provides support for the existing ERP landscape and implements new processes for all Paytm entities.
About the role:
1. Interacting with users on a day-to-day basis for timely closure of tickets.
2. Assisting users with period-end activities.
3. Preparing functional specifications and interacting with ABAP developers on development.
4. Attending and conducting weekly meetings with the SAP and core team members.
5. Proactively discussing critical issues with other consultants.
6. Engaging with various Business & Technology teams within Paytm to identify common bottlenecks, especially on the technology front.
7. Tracking program/project performance, specifically analyzing the successful completion of short- and long-term goals.
8. Enabling and encouraging the use of common services to increase the speed of development and execution.
9. Working on projects across SAP FICO sub-modules: General Ledger (FI-GL), Accounts Payable (FI-AP), Accounts Receivable (FI-AR), and Asset Accounting (FI-AA).
10. Integration of the FI-MM and FI-SD modules.
11. Resolving issues within the time limits specified in SLAs.
Expectations/Requirements:
- Minimum 8 years of experience in SAP FICO sub-modules.
- Program-manages centrally driven technology improvement initiatives.
- Familiar with building functional specifications and the testing process.
- Experience in at least one full-cycle implementation project; experience in the process design, configuration, support, and troubleshooting of S/4HANA across multiple industries.
- Awareness of the business functions of the GRDC, SAC, SD & MM modules and the development lifecycle for ABAP & Fiori developments with RICEFs.
- High level of drive, initiative, and self-motivation.
- Ability to take internal and external stakeholders along.
Education: CA, MBA in Finance, or B.Tech preferred; any graduate may apply.
Why join us: Because you get an opportunity to make a difference and have a great time doing that. You are challenged and encouraged here to do work that is meaningful for you and for those we serve. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.
Compensation: If you are the right fit, we believe in creating wealth for you. With 500 mn+ registered users, 21 mn+ merchants, and the depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants, and we are committed to it. India's largest digital lending story is brewing here. It's your opportunity to be a part of the story!

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Naukri logo

Job Summary: The UPS Enterprise Data Analytics team is looking for a talented and motivated Data Scientist to use statistical modelling and state-of-the-art AI tools and techniques to solve complex, large-scale business problems for UPS operations. This role also supports debugging and enhancing existing AI applications in close collaboration with the Machine Learning Operations team. The position works with multiple stakeholders across different levels of the organization to understand business problems and to develop and help implement robust, scalable solutions. You will be in a high-visibility position with the opportunity to interact with senior leadership to bring innovation to the operational space at UPS. Success in this role requires excellent communication skills to present your cutting-edge solutions to both technical and business leadership.
Responsibilities:
- Become a subject matter expert on UPS business processes and data to help define and solve business needs using data, advanced statistical methods, and AI.
- Be actively involved in understanding business use cases and converting them into technical requirements for modelling.
- Query, analyze, and extract insights from large-scale structured and unstructured data across different data sources, using platforms, methods, and tools such as BigQuery and Google Cloud Storage.
- Understand and apply appropriate methods for cleaning and transforming data, and engineer relevant features for modelling.
- Actively drive the modelling of business problems into ML/AI models, working closely with stakeholders on model evaluation and acceptance.
- Work closely with the MLOps team to productionize new models, support enhancements, and resolve issues in existing production AI applications.
- Prepare extensive technical documentation, dashboards, and presentations for technical and business stakeholders, including leadership teams.
Qualifications:
- Expertise in Python and SQL.
- Experience with data science packages such as scikit-learn, NumPy, pandas, TensorFlow, Keras, and statsmodels.
- Strong understanding of statistical concepts and methods (e.g., hypothesis testing, descriptive statistics) and of machine learning techniques for regression, classification, and clustering problems, including neural networks and deep learning.
- Proficiency with GCP tools such as Vertex AI, BigQuery, and GCS for model development and other activities in the ML lifecycle.
- Strong ownership and collaborative qualities in the relevant domain; takes initiative to identify and drive opportunities for improvement and process streamlining.
- Solid oral and written communication skills, especially around analytical concepts and methods; ability to use a story framework to convey data-driven results to technical and non-technical audiences.
- Master's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.
Bonus Qualifications:
- NLP, Gen AI, or LLM knowledge/experience.
- Knowledge of Operations Research methodologies and experience with packages like CPLEX and PuLP.
- Knowledge of and experience with MLOps principles and tools on GCP.
- Experience working in an Agile environment and an understanding of Lean Agile principles.
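The statistical grounding this role asks for (hypothesis testing, descriptive statistics) can be illustrated with a minimal, standard-library-only Python sketch. The sample values and the use of Welch's t-statistic here are purely illustrative, not UPS data or tooling:

```python
import statistics
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    # Standard error of the difference in means
    se = sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_a - mean_b) / se

# Two illustrative samples, e.g. delivery times under two routing policies
a = [12.1, 11.8, 12.4, 12.0, 11.9]
b = [11.2, 11.5, 11.1, 11.4, 11.3]
print(round(welch_t(a, b), 2))  # a large |t| suggests a real difference in means
```

In practice a package like statsmodels or SciPy would supply the full test, including degrees of freedom and a p-value; the point here is only the shape of the computation.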

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Naukri logo

Job Summary: This position provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. He/she directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations, and guides teams to ensure effective communication and achievement of objectives. He/she researches and supports the integration of emerging technologies, provides knowledge and support for applications development, integration, and maintenance, leads junior team members on project-related activities and tasks, guides and influences department and project teams, and facilitates collaboration with stakeholders.
Responsibilities: Leads systems analysis and design. Leads design and development of applications. Develops and ensures creation of application documents. Defines and produces integration builds. Monitors emerging technology trends. Leads maintenance and support.
Primary Skills: Minimum 5 years of Java (Spring Boot)/J2EE full-stack development. Minimum 2 years with Angular and the GCP platform (Cloud Pub/Sub, GKE, BigQuery). Experience working in an Agile environment with CI/CD.
Qualifications: Bachelor's degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred.

Posted 1 week ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

Naukri logo

Role Description: Our team is part of the Technology, Data, and Innovation (TDI) Private Bank area. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications integrate with it and communicate via ~2k interfaces. From a technical perspective, we focus on the mainframe but also build solutions on-premise and in the cloud, with RESTful services and an Angular frontend. Alongside maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.
Your key responsibilities:
- Implement the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
- Support the migration of current functionality to Google Cloud.
- Ensure the stability of the application landscape and support software releases.
- Support L3 topics and application governance.
- Code as part of an agile team in the CTM area (Java, Scala, Spring Boot).
Your skills and experience:
- Experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.)
and development, preferably with Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of Party data and its integration with Product data.
- Architectural skills for big data solutions, especially interface architecture, that allow a fast start.
- Experience with at least: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub, GitHub Actions, and Terraform scripting.
- Knowledge of customer reference data and customer opening processes, and preferably of regulatory topics around know-your-customer processes.
- Works well in teams as well as independently; constructive and target-oriented.
- Good English skills; able to communicate both professionally and informally in small talk with the team.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

Naukri logo

The Google Cloud Infrastructure Support Engineer will be responsible for ensuring the reliability, performance, and security of our Google Cloud Platform (GCP) infrastructure, working closely with cross-functional teams to troubleshoot issues, optimize infrastructure, and implement best practices for cloud architecture.
- Experience with Terraform for deploying and managing infrastructure templates.
- Administer BigQuery environments, including managing datasets and access controls and optimizing query performance.
- Familiarity with Vertex AI for monitoring and managing machine learning model deployments.
- Knowledge of GCP's Kubernetes Engine and its integration with the cloud ecosystem.
- Understanding of cloud security best practices and experience implementing security measures.
- Knowledge of setting up and managing data clean rooms within BigQuery.
- Understanding of the Analytics Hub platform and how it integrates with data clean rooms to facilitate sensitive data-sharing use cases.
- Knowledge of Dataplex and how it integrates with other Google Cloud services such as BigQuery, Dataproc Metastore, and Data Catalog.
Key Responsibilities:
- Provide technical support for our Google Cloud Platform infrastructure, including compute, storage, networking, and security services.
- Monitor system performance and proactively identify and resolve issues to ensure maximum uptime and reliability.
- Collaborate with cross-functional teams to design, implement, and optimize cloud infrastructure solutions.
- Automate repetitive tasks and develop scripts to streamline operations and improve efficiency.
- Document infrastructure configurations, processes, and procedures.
Qualifications:
Required:
- Strong understanding of GCP services, including Compute Engine, Kubernetes Engine, Cloud Storage, VPC networking, and IAM.
- Experience with BigQuery and Vertex AI.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Experience with infrastructure-as-code tools such as Terraform or Google Deployment Manager.
- Strong communication and collaboration skills.
- Bachelor's degree in Computer Science or a related discipline, or the equivalent in education and work experience.
Preferred:
- Google Cloud certification (e.g., Google Cloud Certified Professional Cloud Architect, Google Cloud Certified Professional Cloud DevOps Engineer).
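As a concrete (and deliberately toy) illustration of the "automate repetitive tasks" responsibility above, the sketch below flags instances missing a required label in a JSON inventory. The instance names and label scheme are invented for the example; a real housekeeping script would query the GCP APIs rather than parse a local JSON blob:

```python
import json

def find_unlabeled(instances, required_label):
    """Return the names of instances that lack a required label key."""
    return [inst["name"] for inst in instances
            if required_label not in inst.get("labels", {})]

# Hypothetical inventory export; in practice this would come from the GCP API
inventory = json.loads("""[
  {"name": "web-1",   "labels": {"env": "prod"}},
  {"name": "batch-2", "labels": {}},
  {"name": "db-3",    "labels": {"env": "dev"}}
]""")
print(find_unlabeled(inventory, "env"))
```

A script like this would typically run on a schedule and feed its findings into documentation or an alerting channel, which is the kind of operational streamlining the posting describes.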

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 10 Lacs

Pune

Work from Office

Naukri logo

This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to deliver good-quality, maintainable, scalable, and high-performing software applications to users in an Agile development environment. The candidate should come from a strong technological background, have good working experience in Python and Spark technology, be hands-on, and be able to work independently with minimal technical/tool guidance. They should also be able to technically guide and mentor junior resources in the team. As a developer you will bring extensive design and development skills to reinforce the group of developers within the team, and will extensively apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.
Your key responsibilities:
- Design and discuss your own solutions for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, and retrospectives.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables; take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities, and support the PO, ITAO, developers, and Scrum Master.
Your skills and experience:
- Engineer with good development experience on a Big Data platform for at least 5 years.
- Hands-on experience with Spark (Hive, Impala).
- Hands-on experience with the Python programming language.
- Preferably, experience with BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps; able to create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR.
- Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, and ServiceNow.
- Strong analytical skills and proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations; an excellent team player.
- Open-minded and willing to learn both business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge to upskill team members.

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid

Naukri logo

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers for our automotive client in the Chennai Sholinganallur location. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to abirami.rsk@getronics.com.
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.
- Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, and Bigtable.
- Build ETL pipelines to ingest data from heterogeneous sources into our systems.
- Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
- Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
- Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.
Skills Required:
- GCP Data Engineering, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 4+ years of professional experience in data engineering, data product development, and software product launches.
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
Education Required: Any bachelor's degree. Candidates should be willing to take a GCP assessment (a 1-hour online video test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Regards,
Abirami
Getronics Recruitment Team
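To make the batch-ETL requirement concrete, here is a minimal standard-library Python sketch of the extract-transform-load shape. The CSV columns and cleaning rules are hypothetical, and a production pipeline would read from GCS and load into BigQuery (e.g., via Dataflow) rather than operate on in-memory strings:

```python
import csv
import io
import json

def run_batch_etl(raw_csv):
    """Toy batch ETL: extract rows from CSV, drop invalid ones,
    standardize part codes, and serialize the result for a sink."""
    reader = csv.DictReader(io.StringIO(raw_csv))            # extract
    records = []
    for row in reader:
        qty = int(row["qty"])
        if qty <= 0:                                         # transform: filter bad rows
            continue
        records.append({"part": row["part"].strip().upper(), "qty": qty})
    return json.dumps(records)                               # load: serialize for the sink

# Hypothetical source extract
raw = "part,qty\n ax-10 ,5\nbz-77,0\ncd-03,2\n"
print(run_batch_etl(raw))
```

The same extract/transform/load split holds at scale; only the endpoints (Pub/Sub or GCS in, BigQuery or Bigtable out) and the execution engine change.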

Posted 1 week ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Coimbatore

Work from Office

Naukri logo

SUMMARY
Job Title: Technical Support Service Desk
Location: Coimbatore
Salary: Up to 9 LPA
Work Mode: Work from Office
Shift: Rotational (5 days working, 2 days off)
Key Responsibilities:
- Provide tech support via email, chat, and phone
- Resolve issues in web protocols, networking, system administration (Windows/Linux), APIs, SQL, and email delivery
- Analyze logs and use the CLI for troubleshooting
- Document cases accurately
- Handle escalations with internal teams
- Improve support processes and the knowledge base
- Mentor junior staff
- Participate in on-call support
Requirements:
- Familiarity with Google Workspace (GWS) and Google Cloud Platform (GCP)
- Experience with BigQuery and cloud migration tools/processes
- Exposure to scripting languages like Python, JavaScript, and HTML
- Relevant certifications are a plus: CompTIA Network+, Security+, Linux+; Microsoft Certified: Azure Administrator Associate; Google Cloud Certified Associate Cloud Engineer
Required Qualifications:
- Bachelor's degree in Computer Science / IT / Engineering
- 5-6 years of experience in technical customer support
- Strong analytical, troubleshooting, and communication skills.

Posted 1 week ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

SUMMARY
Job Title: Technical Support Service Desk
Location: Hyderabad
Salary: Up to 9 LPA
Work Mode: Work from Office
Shift: Rotational (5 days working, 2 days off)
Key Responsibilities:
- Provide tech support via email, chat, and phone
- Resolve issues in web protocols, networking, system administration (Windows/Linux), APIs, SQL, and email delivery
- Analyze logs and use the CLI for troubleshooting
- Document cases accurately
- Handle escalations with internal teams
- Improve support processes and the knowledge base
- Mentor junior staff
- Participate in on-call support
Requirements:
- Familiarity with Google Workspace (GWS) and Google Cloud Platform (GCP)
- Experience with BigQuery and cloud migration tools/processes
- Exposure to scripting languages like Python, JavaScript, and HTML
- Relevant certifications are a plus: CompTIA Network+, Security+, Linux+; Microsoft Certified: Azure Administrator Associate; Google Cloud Certified Associate Cloud Engineer
Required Qualifications:
- Bachelor's degree in Computer Science / IT / Engineering
- 5-6 years of experience in technical customer support
- Strong analytical, troubleshooting, and communication skills.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Naukri logo

We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.
Key Responsibilities:
- Design, implement, and optimize data pipelines using Google Cloud Platform (GCP) services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
- Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
- Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
- Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow.
- Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
- Ensure the scalability, reliability, and security of cloud-based data architectures.
- Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
- Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
- Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
- Provide mentorship and guidance to junior data engineers and collaborate with them to achieve team goals.
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
- Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Strong expertise in SQL for query optimization and performance tuning on large-scale datasets.
- Solid experience designing data schemas, data pipelines, and ETL processes.
- Strong understanding of data modeling techniques, with experience in schema design for both transactional and analytical systems.
- Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost optimization strategies.
- Experience managing and processing streaming data and batch data processing workflows.
- Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
- Familiarity with data security, governance, and compliance best practices on GCP.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
- Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.
Preferred Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field.
- Familiarity with infrastructure-as-code tools like Terraform or Cloud Deployment Manager.
- GCP certifications (e.g., Google Cloud Professional Data Engineer or Cloud Architect).
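The partitioning and clustering expertise called out above is usually expressed as BigQuery DDL. Since actually running such a statement requires a GCP project and the BigQuery client, the helper below only assembles the statement as a string; the table and column names are hypothetical:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery DDL string for a date-partitioned, clustered table.
    Partitioning prunes scanned data by date; clustering co-locates rows
    that share the cluster key, reducing bytes billed for selective queries."""
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        "  event_date DATE,\n"
        "  customer_id STRING,\n"
        "  amount NUMERIC\n"
        ")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl("analytics.orders", "event_date", ["customer_id"])
print(ddl)
```

Queries that filter on the partition column (e.g., `WHERE event_date >= '2024-01-01'`) then scan only the matching partitions, which is the core of BigQuery cost optimization.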

Posted 1 week ago

Apply