
25 GCS Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide design and develop innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients envision possibilities and accelerate their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductors. GlobalLogic is a Hitachi Group Company.

Requirements

Leadership & Strategy
As part of GlobalLogic, you will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience
With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential for this role.

Certifications (Preferred)
Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

Technical Excellence
You should have over 10 years of experience in designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Job Responsibilities

Technical Skills
Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, Cloud Functions, and others is required. Advanced knowledge of Docker, Kubernetes, and container orchestration patterns, along with experience in cloud security, infrastructure as code, and CI/CD practices, will be crucial for this role.

Cross-functional Collaboration
Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions, leading cross-functional project teams, presenting technical recommendations to executive leadership, and establishing relationships with GCP technical account managers are key aspects of this role.

What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology.

Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions. Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax solutions.

To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, Spring Boot, and TypeScript/JavaScript, as well as hands-on experience with cloud technologies such as GCP, AWS, or Azure, is essential. You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes. Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role. Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like Java/J2EE and Spring Boot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing.

If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

If you are seeking an opportunity in the service sales field for Measurement System & Solution products, Emerson has an exciting role for you! As a Service Sales Engineer, your responsibilities will include generating spares opportunities, framing annual maintenance contracts, and managing post-sales activities such as shutdown jobs and offshore contracts for field products. You will be accountable for the business growth of MSOL LCS in South India.

Your key responsibilities will involve assisting customers in selecting spares and services based on their installed base and budget requirements, proactively following up on quotations, responding to customer inquiries promptly, and ensuring purchase orders align with the proposed solutions. Additionally, you will maintain accurate records in CRM, conduct site walk activities for lead generation, and deliver presentations to showcase Emerson's service strengths.

To excel in this role, you must possess good technical knowledge of level, flow, wireless, and corrosion technologies. Previous experience with products like radar level, Coriolis flow meters, and flow computers is essential. Strong presentation and communication skills are also required. Ideally, you should hold a Bachelor's degree in Electronics or Instrumentation Engineering and have 4-6 years of sales experience in related fields. Experience with analytical systems for the power and oil & gas industries, as well as familiarity with Emerson field instruments, will be advantageous.

At Emerson, you will have the opportunity to contribute meaningfully through your work. Our compensation and benefits packages are competitive, and we provide comprehensive medical and insurance coverage. We are dedicated to fostering a diverse and inclusive workplace and offer work authorization sponsorship for foreign nationals. We prioritize the development and well-being of our employees, promoting a hybrid work setup for eligible roles to support work-life balance. Safety is a top priority, and we are committed to providing a safe working environment globally. Join us at Emerson and be part of an organization that values its people and their growth, creating a workplace where everyone can thrive and succeed.

Posted 5 days ago

Apply

4.0 - 8.0 years

8 - 24 Lacs

Bengaluru, Karnataka, India

On-site

Experience: 4-8 years. Notice period: Immediate to 15 days. Location: Pune/Bangalore.
GCP core services: IAM, VPC, GCE (Google Compute Engine), GCS (Google Cloud Storage), Cloud SQL, MySQL, CI/CD tooling (Cloud Build / GitHub Actions).
Other tools: GitHub, Terraform, shell scripting, Ansible.
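For illustration only (not part of the original posting): a minimal Python sketch of the kind of resource inventory work a role covering GCE and GCS might involve. The project ID, zone, and credentials setup are hypothetical assumptions; in practice such automation is often done with Terraform or shell scripts, as the posting notes.

```python
# Minimal sketch: list GCS buckets and GCE instances for one project/zone.
# Assumes google-cloud-storage and google-cloud-compute are installed and
# application-default credentials are configured. Names are placeholders.
from google.cloud import compute_v1, storage

PROJECT_ID = "example-project"   # hypothetical project
ZONE = "asia-south1-a"           # hypothetical zone


def list_buckets(project_id: str) -> None:
    """Print the GCS buckets visible to the caller's credentials."""
    client = storage.Client(project=project_id)
    for bucket in client.list_buckets():
        print(f"bucket: {bucket.name}")


def list_instances(project_id: str, zone: str) -> None:
    """Print the GCE instances in one zone, with their current status."""
    client = compute_v1.InstancesClient()
    for instance in client.list(project=project_id, zone=zone):
        print(f"instance: {instance.name} ({instance.status})")


if __name__ == "__main__":
    list_buckets(PROJECT_ID)
    list_instances(PROJECT_ID, ZONE)
```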

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Cloud Engineering Team Leader at GlobalLogic, you will be responsible for providing technical guidance and career development support to a team of cloud engineers. You will define cloud architecture standards and best practices across the organization, collaborating with senior leadership to develop a cloud strategy aligned with business objectives. Your role will involve driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

With a background in technical leadership roles managing engineering teams, you will have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential. Preferred certifications include Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

You will leverage your 10+ years of experience in designing and implementing enterprise-scale cloud solutions using GCP services to architect sophisticated cloud solutions using Python and advanced GCP services. Leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services will be part of your responsibilities. Ensuring optimal performance and scalability of complex integrations with multiple data sources and systems, implementing security best practices and compliance frameworks, and troubleshooting and resolving technical issues will be key aspects of your role.

Your technical skills will include expert-level proficiency in Python with experience in additional languages; deep expertise with GCP services such as Dataflow, Compute Engine, BigQuery, Cloud Functions, and others; advanced knowledge of Docker, Kubernetes, and container orchestration patterns; extensive experience in cloud security; proficiency in Infrastructure as Code tools like Terraform and Cloud Deployment Manager; and CI/CD experience with advanced deployment pipelines and GitOps practices.

As part of the GlobalLogic team, you will benefit from a culture of caring, continuous learning and development opportunities, interesting and meaningful work, balance and flexibility in work arrangements, and being part of a high-trust organization. You will have the chance to work on impactful projects, engage with collaborative teammates and supportive leaders, and contribute to shaping cutting-edge solutions in the digital engineering domain.

Posted 1 week ago

Apply

4.0 - 6.0 years

8 - 14 Lacs

Chennai

Work from Office

• 3+ years of experience in Python software development
• 3+ years of experience in cloud technologies and services, preferably GCP
• 3+ years of experience practicing statistical methods and applying them accurately, e.g. ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multivariate analysis, neural networks, causal inference, Gaussian regression, etc.
• 3+ years of experience with Python, SQL, BQ
• Experience with SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Vertex AI, Airflow, TensorFlow, etc.
• Experience training, building, and deploying ML and DL models
• Ability to understand the technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end
• Ability to adapt quickly to open-source products and tools to integrate with ML platforms
• Building and deploying models (scikit-learn, DataRobot, TensorFlow, PyTorch, etc.)
• Developing and deploying in on-prem and cloud environments
• Kubernetes, Tekton, OpenShift, Terraform, Vertex AI

Preferred candidate profile: ML, Python, SQL, BQ, Tekton, Terraform, GCS, GCP Looker, Vertex AI, Airflow, TensorFlow. Chennai location only. 4 years.
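For illustration only (not part of the posting): a minimal sketch combining two of the listed skills, pulling a feature table from BigQuery into pandas and fitting a k-means model with scikit-learn. The project, dataset/table name, and feature columns are hypothetical.

```python
# Minimal sketch: BigQuery -> pandas -> k-means clustering.
# Assumes google-cloud-bigquery, db-dtypes, and scikit-learn are installed;
# the dataset, table, and columns below are placeholders.
import pandas as pd
from google.cloud import bigquery
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler


def cluster_customers(project_id: str, n_clusters: int = 5) -> pd.DataFrame:
    client = bigquery.Client(project=project_id)
    query = """
        SELECT customer_id, total_spend, order_count, days_since_last_order
        FROM `example_dataset.customer_features`  -- hypothetical table
    """
    df = client.query(query).to_dataframe()

    # Scale features before clustering so no single column dominates.
    features = df[["total_spend", "order_count", "days_since_last_order"]]
    scaled = StandardScaler().fit_transform(features)

    df["cluster"] = KMeans(n_clusters=n_clusters, random_state=42).fit_predict(scaled)
    return df
```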

Posted 1 week ago

Apply

8.0 - 12.0 years

18 - 22 Lacs

Navi Mumbai

Work from Office

Job Title: Senior Engineer (HW_GCS_2)
Department: R&D - Mech
Location: Navi Mumbai, India
Job Type: Full-time | On-site
Seniority Level: Mid-Senior
Years of Experience: 8 - 12 years
Minimum Qualification: Bachelor's degree

Job Description: As a Mechatronics/Electronics Engineer, you will be crucial in developing and implementing electro-mechanical systems for avionics and other electronic systems in our aerospace projects. You will work closely with cross-functional teams to ensure the successful integration of electronic components with mechanical systems into our Unmanned Aerial Systems (UAS). This role offers a unique opportunity to work on challenging projects at the forefront of aerospace technology.

Key Responsibilities: You will be responsible for spearheading complex electronics system design, analysis, and integration. As a deep technical expert, this role requires a profound understanding of the core principles of electronics engineering, with proficiency in mechanical engineering and an emphasis on designing state-of-the-art solutions for challenging applications.

Design & Development: Responsible for the design and development of high-precision electro-mechanical systems. Defining selection criteria for key electronic and mechanical components, and testing and validating them for use in different sub-systems. Coordinate with multidisciplinary teams to seamlessly integrate embedded systems with mechanical systems, ensuring alignment in design parameters and tolerance considerations. Develop and implement comprehensive testing and verification strategies to ensure the robustness and integrity of embedded software and mechanical systems throughout the development lifecycle. Identify and mitigate risks associated with embedded software development and mechanical systems, proactively addressing issues to ensure project success. Experience in designing electro-mechanical/robotic systems through the use of very strong technical fundamentals. Proficiency in designing electronic circuits and embedded system circuits using microcontrollers (e.g., STM32) with strong fundamentals in electronics. Proficiency in reviewing and modifying circuits, wiring, and PCB layouts. Knowledge of Embedded C/C++ and familiarity with embedded software development.

Developing Best Practices: Work with world-class safety standards to implement best practices within the team. Work from first principles to achieve the highest robustness and value addition in engineering solutions. Establish practices to deliver designs that are highest performance, reliable, scalable to manufacture, easy to maintain, and reusable.

Skills & Qualifications: Bachelor's or Master's degree in Electronics, Mechatronics, Robotics, or Aeronautical (Avionics) engineering. Strong electro-mechanical design instincts and a thorough understanding of dynamics and control. Hands-on expertise in housing electronic and circuit assembly. Working or basic knowledge of any parametric modelling CAD software. Basic knowledge of GD&T and tolerance stack-up. A deep appreciation for technology evolution and modern engineering practices, deployed in world-class product development processes. Experience in an Indian or global company delivering high-quality systems that integrate mechanical and electronic hardware, and/or study at a reputed academic institution while demonstrating the initiative and rigour to learn and create innovative engineering solutions.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

18 - 25 Lacs

Navi Mumbai

Work from Office

Design and development of high-precision electro-mechanical systems. Develop and implement comprehensive testing and verification strategies. Identify and mitigate risks associated with embedded software development and mechanical systems. Required candidate profile: Bachelor's or Master's degree in Electronics, Mechatronics, Robotics, or Aeronautical (Avionics) engineering; 8-12 years of experience; strong electro-mechanical design instincts; basic knowledge of GD&T and tolerance stack-up.

Posted 2 weeks ago

Apply

3.0 - 4.0 years

6 - 7 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider with operations in India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Consultant
Location: Chennai
Work Type: Hybrid

Position Description: Software development using a React/Angular full stack; work with tech anchors, product managers, and the team, internally and across other teams. Ability to understand the technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end. Software development using a TDD approach. Experience using GCP products and services. Ability to adapt quickly to open-source products and tools to integrate with ML platforms.

Skills Required: 3+ years of experience in React/Angular full stack software development; 3+ years of experience in cloud technologies and services, preferably GCP; experience with SonarQube, CI/CD, Tekton, Terraform, GCS, GCP, etc.; ability to understand the technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end; Kubernetes, Tekton, OpenShift, Terraform, Vertex AI.

Skills Preferred: Good communication, presentation, and collaboration skills.
Experience Required: 2 to 5 years.
Experience Preferred: API development and GCP deployment.
Education Required: BE, BTech, MCA, M.Sc, ME.

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Major skillset: GCP, PySpark, SQL, Python, cloud architecture, ETL, automation. 4+ years of experience in data engineering and data management with a strong focus on Spark for building production-ready data pipelines. Experienced in analyzing large data sets from multiple data sources and building automated testing and validations. Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, and Sqoop. Strong Python experience. Hands-on SQL and HQL skills to write optimized queries. Strong hands-on experience with GCP: BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam. Ability to work in a fast-paced, collaborative environment and work with various stakeholders to define strategic optimization initiatives. Deep understanding of distributed computing, memory tuning, and Spark optimization. Familiar with CI/CD workflows and Git. Experience in designing modular, automated, and secure ETL frameworks.
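For illustration only (not part of the posting): a minimal PySpark job of the kind described above, reading raw files from GCS and writing a cleaned table to BigQuery via the spark-bigquery connector (assumed to be available on the cluster, e.g. on Dataproc). All bucket, dataset, and column names are hypothetical.

```python
# Minimal sketch: GCS parquet -> de-duplicate/filter -> BigQuery table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read one day's partition of raw files from a hypothetical GCS bucket.
raw = spark.read.parquet("gs://example-raw-bucket/orders/dt=2024-01-01/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("load_ts", F.current_timestamp())
)

# Write to a hypothetical BigQuery table using the spark-bigquery connector.
(cleaned.write.format("bigquery")
    .option("table", "example_dataset.orders_cleaned")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save())
```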

Posted 3 weeks ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities: Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP). Developing data pipelines to ensure efficient data flow and transformation. Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions. Implementing data security and compliance measures in accordance with best practices. Monitoring and optimizing data storage and processing performance. Troubleshooting and resolving data-related issues in a timely manner.

Skills and Qualifications: 3-7 years of experience in data engineering or a related field. Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub. Strong experience with SQL and NoSQL databases. Familiarity with data modeling and ETL processes. Knowledge of programming languages such as Python or Java. Understanding of data warehousing concepts and best practices. Experience with CI/CD tools and practices. Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
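For illustration only (not part of the posting): a minimal Apache Beam pipeline of the kind this role describes, reading JSON events from Pub/Sub and writing rows to BigQuery. The topic, table, and field names are hypothetical; running it on Dataflow would additionally need DataflowRunner options.

```python
# Minimal sketch: streaming Beam pipeline, Pub/Sub -> parse -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Turn a raw Pub/Sub message into a BigQuery-ready row dict."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "user_id": event["user"], "ts": event["ts"]}


options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")  # hypothetical topic
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",               # hypothetical table
            schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```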

Posted 3 weeks ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities: Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP). Developing data pipelines to ensure efficient data flow and transformation. Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions. Implementing data security and compliance measures in accordance with best practices. Monitoring and optimizing data storage and processing performance. Troubleshooting and resolving data-related issues in a timely manner.

Skills and Qualifications: 3-7 years of experience in data engineering or a related field. Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub. Strong experience with SQL and NoSQL databases. Familiarity with data modeling and ETL processes. Knowledge of programming languages such as Python or Java. Understanding of data warehousing concepts and best practices. Experience with CI/CD tools and practices. Ability to work collaboratively in a team environment and communicate effectively with stakeholders.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Mumbai, Maharashtra, India

On-site

We are seeking a skilled GCP Data Engineer to join our team in India. The ideal candidate will have a strong background in designing and implementing data processing systems using Google Cloud Platform, with a focus on building scalable and efficient data pipelines.

Responsibilities: Designing, building, and maintaining scalable data processing systems on Google Cloud Platform (GCP). Developing data pipelines to ensure efficient data flow and transformation. Collaborating with data scientists and analysts to understand data requirements and deliver appropriate solutions. Implementing data security and compliance measures in accordance with best practices. Monitoring and optimizing data storage and processing performance. Troubleshooting and resolving data-related issues in a timely manner.

Skills and Qualifications: 3-7 years of experience in data engineering or a related field. Proficiency in GCP services such as BigQuery, Dataflow, Cloud Storage, and Pub/Sub. Strong experience with SQL and NoSQL databases. Familiarity with data modeling and ETL processes. Knowledge of programming languages such as Python or Java. Understanding of data warehousing concepts and best practices. Experience with CI/CD tools and practices. Ability to work collaboratively in a team environment and communicate effectively with stakeholders.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity? Must-have skills: Machine Learning, ML architectures and lifecycle, Airflow, Kubeflow, MLflow, Spark, Kubernetes, Docker, Python, SQL, machine learning platforms, BigQuery, GCS, Dataproc, AI Platform, search ranking, deep learning, deep learning frameworks, PyTorch, TensorFlow.

About the job: Candidates for this position are preferred to be based in Bangalore, India and will be expected to comply with their team's hybrid work schedule requirements.

Who We Are: Wayfair's Advertising business is rapidly expanding, adding hundreds of millions of dollars in profits to Wayfair. We are building Sponsored Products, Display, and Video Ad offerings that cater to a variety of advertiser goals while showing highly relevant and engaging ads to millions of customers. We are evolving our Ads Platform to empower advertisers across all sophistication levels to grow their business on Wayfair at a strong, positive ROI, and we are leveraging state-of-the-art machine learning techniques.

What you'll do: Provide technical leadership in the development of an automated and intelligent advertising system by advancing the state of the art in machine learning techniques to support recommendations for ad campaigns and other optimizations. Design, build, deploy, and refine extensible, reusable, large-scale, real-world platforms that optimize our ads experience. Work cross-functionally with commercial stakeholders to understand business problems or opportunities and develop appropriately scoped machine learning solutions. Collaborate closely with various engineering, infrastructure, and machine learning platform teams to ensure adoption of best practices in how we build and deploy scalable machine learning services. Identify new opportunities and insights from the data (where can the models be improved? What is the projected ROI of a proposed modification?). Research new developments in advertising, sort, and recommendations research and open-source packages, and incorporate them into our internal packages and systems. Be obsessed with the customer and maintain a customer-centric lens in how we frame, approach, and ultimately solve every problem we work on.

We Are a Match Because You Have: Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field. 6-9 years of industry experience in advanced machine learning and statistical modeling, including hands-on experience designing and building production models at scale. Strong theoretical understanding of statistical models such as regression and clustering, and machine learning algorithms such as decision trees, neural networks, etc. Familiarity with machine learning model development frameworks and machine learning orchestration and pipelines, with experience in Airflow, Kubeflow, or MLflow, as well as Spark, Kubernetes, Docker, Python, and SQL. Proficiency in Python or one other high-level programming language. Solid hands-on expertise deploying machine learning solutions into production. Strong written and verbal communication skills, the ability to synthesize conclusions for non-experts, and an overall bias towards simplicity.

Nice to have: Familiarity with machine learning platforms offered by Google Cloud and how to implement them at large scale (e.g. BigQuery, GCS, Dataproc, AI Notebooks). Experience in computational advertising, bidding algorithms, or search ranking. Experience with deep learning frameworks like PyTorch, TensorFlow, etc.
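For illustration only (not part of the posting): a minimal sketch of MLflow experiment tracking, one of the listed tools, wrapped around a simple scikit-learn classifier as a stand-in for an ads model. The experiment name and the synthetic data are hypothetical.

```python
# Minimal sketch: train a model and log params/metrics/artifacts with MLflow.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("ads-ctr-model")  # hypothetical experiment name

# Synthetic stand-in data; a real job would read features from BigQuery/GCS.
X = np.random.rand(1000, 8)
y = (X[:, 0] + 0.3 * X[:, 1] > 0.7).astype(int)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0)

with mlflow.start_run():
    model = GradientBoostingClassifier(n_estimators=200, max_depth=3)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("val_auc", auc)
    mlflow.sklearn.log_model(model, "model")
```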

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.

Relevant Experience: 5 - 15 years
Location: Pan India

Job Description: Minimum 2 years of hands-on experience in GCP development (data engineering).
Position: Developer / Tech Lead / Architect

Interested candidates can share their resume to sankarspstaffings@gmail.com with the following details inline: overall experience, relevant experience, current CTC, expected CTC, and notice period.

Posted 1 month ago

Apply

7.0 - 12.0 years

7 - 12 Lacs

Bengaluru

Work from Office

About the Role: We are looking for a seasoned Engineering Manager well-versed in emerging technologies to join our team. As an Engineering Manager, you will ensure consistency and quality by shaping the right strategies. You will keep an eye on all engineering projects and ensure all duties are fulfilled. You will analyse other employees' tasks and carry on collaborations effectively. You will also transform newcomers into experts and build reports on the progress of all projects.

What you will do: Design tasks for other engineers as per Meesho's guidelines. Perform regular performance evaluations and share and seek feedback. Keep a close eye on various projects and monitor their progress. Carry on smooth collaborations with the sales and design teams to innovate on new products. Manage engineers and take ownership of projects while ensuring product scalability. Conduct regular meetings to plan and develop reports on the progress of projects.

What you will need: Bachelor's/Master's in computer science. At least 7+ years of professional experience. At least 2 years of experience in managing software development teams. Able to drive sprints and OKRs. Deep understanding of transactional and NoSQL databases. Deep understanding of messaging systems such as Kafka. Good experience with cloud infrastructure (AWS/GCS). Good to have: data pipelines, ES. Exceptional team management skills; experience in building large-scale distributed systems. Experience with scalable systems. Expertise in Java/Python and multithreading.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Job Summary: The UPS Enterprise Data Analytics team is looking for a talented and motivated Data Scientist to use statistical modelling and state-of-the-art AI tools and techniques to solve complex and large-scale business problems for UPS operations. This role would also support debugging and enhancing existing AI applications in close collaboration with the Machine Learning Operations team. This position will work with multiple stakeholders across different levels of the organization to understand the business problem and develop and help implement robust and scalable solutions. You will be in a high-visibility position with the opportunity to interact with senior leadership to bring forth innovation within the operational space for UPS. Success in this role requires excellent communication to be able to present your cutting-edge solutions to both technical and business leadership.

Responsibilities: Become a subject matter expert on UPS business processes and data to help define and solve business needs using data, advanced statistical methods, and AI. Be actively involved in understanding and converting business use cases into technical requirements for modelling. Query, analyze, and extract insights from large-scale structured and unstructured data from different data sources, utilizing different platforms, methods, and tools like BigQuery, Google Cloud Storage, etc. Understand and apply appropriate methods for cleaning and transforming data and engineering relevant features to be used for modelling. Actively drive the modelling of business problems into ML/AI models, and work closely with stakeholders on model evaluation and acceptance. Work closely with the MLOps team to productionize new models, support enhancements, and resolve any issues within existing production AI applications. Prepare extensive technical documentation, dashboards, and presentations for technical and business stakeholders, including leadership teams.

Qualifications: Expertise in Python and SQL. Experienced in using data-science packages like scikit-learn, NumPy, pandas, TensorFlow, Keras, statsmodels, etc. Strong understanding of statistical concepts and methods (like hypothesis testing, descriptive statistics, etc.) and machine learning techniques for regression, classification, and clustering problems, including neural networks and deep learning. Proficient in using GCP tools like Vertex AI, BigQuery, GCS, etc. for model development and other activities in the ML lifecycle. Strong ownership and collaborative qualities in the relevant domain; takes initiative to identify and drive opportunities for improvement and process streamlining. Solid oral and written communication skills, especially around analytical concepts and methods; ability to communicate data through a story framework to convey data-driven results to technical and non-technical audiences. Master's degree in a quantitative field such as mathematics, computer science, physics, economics, engineering, or statistics (operations research, quantitative social science, etc.), an international equivalent, or equivalent job experience.

Bonus Qualifications: NLP, GenAI, and LLM knowledge/experience. Knowledge of operations research methodologies and experience with packages like CPLEX, PuLP, etc. Knowledge and experience of MLOps principles and tools in GCP. Experience working in an Agile environment and an understanding of Lean Agile principles.
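For illustration only (not part of the posting): a small sketch of the kind of statistical check the role mentions, a two-sample hypothesis test plus a simple regression with statsmodels. The delivery-time data here is synthetic and the column names are hypothetical.

```python
# Minimal sketch: t-test and OLS regression on synthetic delivery data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "route_density": rng.uniform(1, 10, 500),
    "num_stops": rng.integers(20, 120, 500),
})
df["delivery_hours"] = (
    2 + 0.05 * df["num_stops"] - 0.1 * df["route_density"]
    + rng.normal(0, 0.5, 500)
)

# Hypothesis test: do dense routes finish in fewer hours than sparse ones?
dense = df.loc[df["route_density"] > 5, "delivery_hours"]
sparse = df.loc[df["route_density"] <= 5, "delivery_hours"]
t_stat, p_value = stats.ttest_ind(dense, sparse, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")

# Regression: quantify the effect of stops and density on delivery time.
X = sm.add_constant(df[["num_stops", "route_density"]])
model = sm.OLS(df["delivery_hours"], X).fit()
print(model.summary())
```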

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages like Python, shell script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow). Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java.

Posted 1 month ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary: As a Data Engineer III at Walmart, you will design, build, and maintain scalable data systems and architectures to enable advanced business intelligence and analytics. Your role will be pivotal in ensuring data quality, integrity, and security to support Walmart's data-driven decision-making and operational efficiency. You will collaborate closely with cross-functional teams to implement robust data solutions that drive business growth.

About the Team: The Walmart Last Mile Data team operates within the Data and Customer Analytics organization, focusing on optimizing Walmart's last mile delivery operations. This dynamic group uses cutting-edge data engineering and analytics technologies to improve delivery routing, reduce costs, and enhance customer satisfaction.

What You'll Do: Design, develop, and deploy data pipelines and integration solutions using technologies such as Spark, Scala, Python, Airflow, and Google Cloud Platform (GCP). Build scalable, efficient data processing systems while optimizing workflows to ensure high data quality and availability. Monitor, troubleshoot, and tune data pipelines to maximize reliability and performance. Partner with executive leadership, product, data, and design teams to address data infrastructure needs and technical challenges. Provide data scientists and analysts with well-structured data sets and tools to facilitate analytics and reporting. Develop analytics tools that contribute to Walmart's position as an industry leader. Stay current with data engineering trends and technologies, incorporating best practices to enhance data infrastructure.

What You'll Bring: Minimum 5 years of proven experience as a Data Engineer. Strong programming skills in Scala and experience with Spark for large-scale data processing and analytics. Expertise with Google Cloud Platform services such as BigQuery, Google Cloud Storage (GCS), and Dataproc. Experience building near real-time data ingestion pipelines using Kafka and Spark Structured Streaming. Solid knowledge of data modeling, data warehousing concepts, and ETL processes. Proficiency in SQL and NoSQL databases. Experience with version control systems, preferably Git. Familiarity with distributed computing frameworks and working with large-scale data sets. Understanding of CI/CD pipelines and tools such as Jenkins or GitLab CI. Experience with workflow schedulers like Airflow. Strong analytical and problem-solving abilities. Familiarity with BI and visualization tools such as Tableau or Looker. Experience or interest in Generative AI is a plus but not required.

About Walmart Global Tech: At Walmart Global Tech, your work impacts millions worldwide by simplifying complex challenges through technology. Join a diverse team of innovators shaping the future of retail, where people and technology come together to drive lasting impact. We support career growth with continuous learning and flexible hybrid work models.

Minimum Qualifications: Bachelor's degree in Computer Science or a related field with at least 2 years of software engineering experience; OR 4 years of relevant software engineering or data engineering experience without a degree; OR Master's degree in Computer Science.
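For illustration only (not part of the posting): a minimal sketch of a near-real-time ingestion job like the one described, Kafka in, Spark Structured Streaming, GCS out. Shown in PySpark for brevity (the team lists Scala); the brokers, topic, schema, and bucket paths are hypothetical, and the Kafka connector package is assumed to be on the cluster.

```python
# Minimal sketch: Kafka topic -> Spark Structured Streaming -> parquet on GCS.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("lastmile-delivery-events").getOrCreate()

schema = StructType([
    StructField("delivery_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")    # hypothetical broker
    .option("subscribe", "delivery-status-events")        # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://example-lake/delivery_events/")               # hypothetical
    .option("checkpointLocation", "gs://example-lake/checkpoints/delivery_events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```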

Posted 1 month ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages like Python, shell script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java; scripting languages like Python, shell script, SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow). Mandatory key skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, data processing, Java.

Posted 1 month ago

Apply

7.0 - 10.0 years

13 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

We are seeking a skilled Salesforce Data Cloud Developer to join our dynamic IT team. This role involves developing and maintaining Salesforce Data Cloud solutions to enhance our mobile services and customer experience. The ideal candidate will have a strong background in Salesforce development, data integration, and cloud technologies.

• Clear understanding of legacy systems and databases and of integrating them with the Salesforce ecosystem; streamline data ingestion processes between Salesforce Data Cloud and our databases to ensure seamless data flow and accuracy.
• Utilize Salesforce Data Cloud to gain deeper insights into customer behavior and preferences, driving personalized mobile experiences.
• Implement AI-driven solutions and automation within Salesforce Data Cloud to optimize mobile service delivery and customer engagement.
• Enhance mobile data strategies to support innovative mobile solutions and improve overall user experience.
• Knowledge of AWS and GCS to support integration with Salesforce; thorough knowledge of the connectors available for Data Cloud.
• Hands-on work with DLOs, DMOs, SQL, segmentation, and activation.
• Clear understanding of the integration of Data Cloud with Marketing Cloud.
• Experience with available Data Cloud connectors such as BigQuery and GCS.

Development: Design, develop, and implement custom Salesforce Data Cloud applications and enhancements tailored to organizational requirements.
Integration: Perform data integration and migration tasks, ensuring data accuracy and integrity across Salesforce and other systems.
Collaboration: Work closely with cross-functional teams, including marketing and technical teams, to align Salesforce solutions with business objectives.
Documentation: Create and maintain comprehensive documentation on processes, policies, application configurations, and user guides.
Testing: Conduct thorough testing and debugging of Salesforce applications to ensure high performance and reliability.

Bachelor's degree in Computer Science, Information Technology, Business, Engineering, or a related field. Minimum of 4-5 years of experience in the Salesforce ecosystem, with at least 2-3 years of hands-on experience with Salesforce Data Cloud. Strong ability to manage and communicate with both technical and non-technical stakeholders. Solid understanding of software development, systems integration, and cloud technologies. Strong strategic thinking and planning skills. Ability to work in a fast-paced, dynamic environment and manage multiple priorities. Experience with Agile methodologies and version control systems like Git. MBA or advanced degree in a related field. Salesforce Data Cloud certification (e.g., Salesforce Data Cloud Consultant). Knowledge of cloud environments like AWS, Azure, or Google Cloud Platform. Experience in API integration with legacy systems.

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Req ID: 316017. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Zabbix Administrator to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Zabbix Administration and Support roles and responsibilities: In-depth knowledge of enterprise monitoring tool architecture, administration, and configuration. Technically manage the design and implementation of the Zabbix tool. Hands-on experience with end-to-end deployment. In-depth knowledge of systems management, monitoring tools, ITIL processes, integrations with different tools, and scripting. Good understanding of automation and enterprise-wide monitoring tooling solutions. Hands-on experience in integrating enterprise monitoring tools with ITSM platforms. Minimum five (5) years of hands-on experience administering and configuring enterprise monitoring tools at an L3 level. Knowledge of IT infrastructure programming/scripting (shell, JSON, MySQL, Python, Perl). Good understanding of operating systems (Windows and Unix). Must have good knowledge of public cloud platforms (Azure, AWS, GCS). Install and configure software and hardware. Apply Zabbix patches and upgrades once available to upgrade the environment. Lead troubleshooting of issues and outages. Provide technical support as requested, for internal and external customers, primarily for Zabbix. Undertake individual assignments or work on a project as part of a larger team, analyzing customer requirements, gathering and analyzing data, and recommending solutions. Ensure assignments are undertaken consistently and with quality. Produce and update assignment documentation as required. Experienced in customer interaction. Good communication skills (verbal/written). Experienced in dealing with internal and external stakeholders independently during transitions and project-driven activities. Willing to work in a 24x7 environment.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its application process accessible to any and all users. If you would like to contact us regarding accessibility or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.

Posted 1 month ago

Apply

2 - 4 years

5 - 8 Lacs

Pune

Work from Office

We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI-based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.

Responsibilities:
Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities.
Design and Architecture: Participate in design reviews with peers and stakeholders.
Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
Debugging and Troubleshooting: Triage defects or customer-reported issues, and debug and resolve them in a timely and efficient manner.
Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
DevOps Model: Understanding of working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.

Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery, and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.

Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres. Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
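For illustration only (not part of the posting): a tiny sketch of the Python/FastAPI/PyTest stack listed above, a single endpoint plus a pytest-style test. The endpoint and the summarize() stub are hypothetical; a real GenAI service would call a model instead of truncating text.

```python
# Minimal sketch: FastAPI endpoint with an in-process pytest test.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class SummarizeRequest(BaseModel):
    text: str
    max_words: int = 50


def summarize(text: str, max_words: int) -> str:
    """Stand-in for a GenAI call; here it just truncates to max_words."""
    return " ".join(text.split()[:max_words])


@app.post("/summarize")
def summarize_endpoint(req: SummarizeRequest) -> dict:
    return {"summary": summarize(req.text, req.max_words)}


# Pytest-style test (would normally live in tests/test_api.py).
client = TestClient(app)


def test_summarize_truncates():
    resp = client.post("/summarize", json={"text": "a b c d", "max_words": 2})
    assert resp.status_code == 200
    assert resp.json()["summary"] == "a b"
```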

Posted 2 months ago

Apply

4 - 9 years

10 - 14 Lacs

Pune

Hybrid

Job Description / Technical Skills: The top skills for this position are: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS); data warehousing knowledge; hands-on experience with the Python language and SQL databases; the analytical skills to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements; excellent communication with different stakeholders (business, technical, project); a good understanding of the overall Big Data and Data Science ecosystem; experience with building and deploying containers as services using Swarm/Kubernetes; a good understanding of container concepts like building lean and secure images; an understanding of modern DevOps pipelines; experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer certification.

Roles and Responsibilities: Design, build, and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies. Manage the development lifecycle for agile software development projects. Convert proofs of concept into industrialized Machine Learning models (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning, and understanding of dependencies. Develop and implement robust solutions in test and production environments.
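For illustration only (not part of the posting): a minimal Cloud Composer (Airflow) DAG of the kind described above, loading a daily file from GCS into BigQuery and then running a transformation query. The bucket, dataset, table names, and schedule are hypothetical; the Google provider package is assumed to be installed.

```python
# Minimal sketch: GCS -> BigQuery load, then a BigQuery transformation, daily.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-raw-bucket",                           # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_dataset.orders_raw",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    transform = BigQueryInsertJobOperator(
        task_id="build_orders_clean",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE example_dataset.orders_clean AS
                    SELECT * FROM example_dataset.orders_raw WHERE amount > 0
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```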

Posted 2 months ago

Apply