3.0 - 8.0 years
9 - 19 Lacs
Hyderabad
Work from Office
Advantum Health Pvt. Ltd, a US healthcare MNC, is looking for a Senior AI/ML Engineer. Advantum Health Private Limited is a leading RCM and Medical Coding company, operating since 2013. Our head office is in Hyderabad, with branch operations in Chennai and Noida. We are proud to be a Great Place to Work certified organization and a recipient of the Telangana Best Employer Award. Our office spans 35,000 sq. ft. in Cyber Gateway, Hitech City, Hyderabad.
Job Title: Senior AI/ML Engineer
Location: Hitech City, Hyderabad, India (work from office)
Ph: 9177078628, 7382307530, 9059683624
Address: Advantum Health Private Limited, Cyber Gateway, Block C, 4th Floor, Hitech City, Hyderabad.
Map: https://www.google.com/maps/place/Advantum+Health+India/@17.4469674,78.3747158,289m/data=!3m2!1e3!5s0x3bcb93e01f1bbe71:0x694a7f60f2062a1!4m6!3m5!1s0x3bcb930059ea66d1:0x5f2dcd85862cf8be!8m2!3d17.4467126!4d78.3767566!16s%2Fg%2F11whflplxg?entry=ttu&g_ep=EgoyMDI1MDMxNi4wIKXMDSoASAFQAw%3D%3D
Job Summary: We are seeking a highly skilled and motivated Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support analytics, machine learning, and business intelligence initiatives. You will work closely with data analysts, scientists, and engineers to ensure data availability, reliability, and quality across the organization.
Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines for ingesting and transforming large volumes of structured and unstructured data
- Build and optimize data infrastructure for scalability, performance, and reliability
- Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
- Implement data quality checks, monitoring, and alerting mechanisms
- Manage and optimize data storage solutions (data warehouses, data lakes, databases)
- Ensure data security, compliance, and governance across all platforms
- Automate data workflows and optimize data delivery for real-time and batch processing
- Participate in code reviews and contribute to best practices for data engineering
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- 3+ years of experience in data engineering or related roles
- Strong programming skills in Python, Java, or Scala
- Proficiency with SQL and relational databases (e.g., PostgreSQL, MySQL)
- Experience with data pipeline and workflow orchestration tools (e.g., Airflow, Prefect, Luigi)
- Hands-on experience with cloud platforms (AWS, GCP, or Azure) and cloud data services (e.g., Redshift, BigQuery, Snowflake)
- Familiarity with distributed data processing tools (e.g., Spark, Kafka, Hadoop)
- Solid understanding of data modeling, warehousing concepts, and data governance
Preferred Qualifications:
- Experience with CI/CD and DevOps practices for data engineering
- Knowledge of data privacy regulations such as GDPR, HIPAA, etc.
- Experience with version control systems like Git
- Familiarity with containerization (Docker, Kubernetes)
Follow us on LinkedIn, Facebook, Instagram, YouTube and Threads for all updates:
Advantum Health LinkedIn page: https://www.linkedin.com/showcase/advantum-health-india/
Advantum Health Facebook page: https://www.facebook.com/profile.php?id=61564435551477
Advantum Health Instagram page: https://www.instagram.com/reel/DCXISlIO2os/?igsh=dHd3czVtc3Fyb2hk
Advantum Health India YouTube link: https://youtube.com/@advantumhealthindia-rcmandcodi?si=265M1T2IF0gF-oF1
Advantum Health Threads link: https://www.threads.net/@advantum.health.india
HR Dept, Advantum Health Pvt Ltd, Cyber Gateway, Block C, Hitech City, Hyderabad. Ph: 9177078628, 7382307530, 9059683624
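The posting above asks for data quality checks with monitoring and alerting. As a minimal illustrative sketch only (the field names and rules are invented for the example, not taken from any Advantum Health system), such a check might partition incoming records into clean rows and rejects:

```python
# Minimal sketch of row-level data quality checks. Field names and
# validation rules here are illustrative assumptions for the example.

def check_row(row, required=("patient_id", "charge_amount")):
    """Return a list of human-readable issues found in one record."""
    issues = []
    for field in required:
        if row.get(field) in (None, ""):
            issues.append(f"missing {field}")
    amount = row.get("charge_amount")
    if isinstance(amount, (int, float)) and amount < 0:
        issues.append("negative charge_amount")
    return issues

def run_quality_checks(rows):
    """Partition rows into clean records and (row, issues) rejects."""
    clean, rejects = [], []
    for row in rows:
        issues = check_row(row)
        if issues:
            rejects.append((row, issues))
        else:
            clean.append(row)
    return clean, rejects
```

In a real pipeline the reject list would typically feed a monitoring metric or alert rather than being silently dropped.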
Posted 2 weeks ago
2.0 - 5.0 years
8 - 15 Lacs
Pune
Work from Office
Data Analyst required: 2 years' experience in Google BigQuery and the GCP environment, plus Hadoop/Hive/PySpark. Experience using data models and data dictionaries in a Banking and Financial Markets context. Key skills: GCP environment and tools, Hadoop/Hive/PySpark.
Posted 2 weeks ago
7.0 - 12.0 years
15 - 25 Lacs
Chennai
Hybrid
Key Skills: DAO - AI/ML, Automation, Python, BigQuery, Project Management, Agile, SDLC, Java, SQL
Role & responsibilities:
- 8+ years of experience in Data Science, preferably in the Automobile Engineering domain
Professional skills:
- Business Analysis, Analytical Thinking, Problem Solving, Decision Making, Leadership, Managerial, Time Management, Domain Knowledge
- Work simplification: methods that maximize output while minimizing expenditure and cost
- Analytics with data: interprets data and turns it into information that can offer ways to improve a business
- Communication: good verbal communication and interpersonal skills are essential for collaborating with customers
- Rigour: the ability to analyse qualitative data quickly and rigorously
- Adaptability: being able to adapt to changing environments and work processes
Technical skills: Python/NumPy, Seaborn, Pandas, Selenium, Beautiful Soup (basic), Spotfire, ML libraries, RPA, R, IronPython, HTML, CSS, JavaScript, SQL, HQL, Git/GitLab, Spark, Scala, web services, Spotfire/Tableau, JIRA
Tool skills: project management tools, documentation tools, modeling [wireframe] tools
Database skills: MsSQL, Postgres, MsAccess, MongoDB
Posted 2 weeks ago
4.0 - 9.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Exp: 5 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases.
- Troubleshoot issues related to data processing workflows and provide timely resolutions.
Desired Candidate Profile:
- 5-9 years of experience in data engineering, with expertise in GCP and BigQuery.
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
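To illustrate the "complex SQL queries to extract insights" duty above, here is a self-contained sketch of a per-user aggregation. It uses SQLite from the Python standard library purely as a stand-in (real BigQuery work would go through the google-cloud-bigquery client, and the table and columns here are invented for the example):

```python
# Illustrative only: a BigQuery-style aggregation run on in-memory SQLite
# so the sketch is runnable without cloud credentials. The `events` table
# and its columns are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("u1", "purchase", 30.0),
    ("u1", "purchase", 20.0),
    ("u2", "purchase", 10.0),
])

# Aggregate spend per user, largest first -- the kind of query a pipeline
# might materialise into a reporting table.
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total "
    "FROM events GROUP BY user_id ORDER BY total DESC"
).fetchall()
print(rows)  # [('u1', 50.0), ('u2', 10.0)]
```

The same GROUP BY / ORDER BY shape carries over to BigQuery Standard SQL almost unchanged; only the client library and table references differ.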
Posted 2 weeks ago
3.0 - 8.0 years
14 - 24 Lacs
Chennai
Hybrid
Greetings! We have permanent opportunities for GCP Data Engineers in Chennai.
Experience required: 3 years and above
Location: Chennai (ELCOT - Sholinganallur)
Work mode: Hybrid
Skills required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript
If interested, kindly share the below details along with your updated CV to Narmadha.baskar@getronics.com
Regards,
Narmadha
Getronics Recruitment team
Posted 2 weeks ago
6.0 - 11.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.
Job Description:
Exp: 6-12 yrs
Location: Hyderabad/Bangalore/Pune/Delhi
Skill: GCP Data Engineer
- Proficiency in programming languages: Python
- Expertise in data processing frameworks: Apache Beam (Dataflow), Kafka
- Hands-on experience with GCP services: BigQuery, Dataflow, Composer, Spanner
- Knowledge of data modeling and database design
- Experience in ETL (Extract, Transform, Load) processes
- Familiarity with cloud storage solutions
- Strong problem-solving abilities in data engineering challenges
- Understanding of data security and scalability
- Proficiency in relevant tools like Apache Airflow
Interested candidates can share your resume to sangeetha.spstaffing@gmail.com with the below inline details:
Full Name as per PAN:
Mobile No:
Alt No/WhatsApp No:
Total Exp:
Relevant Exp in GCP:
Rel Exp in BigQuery:
Rel Exp in Composer:
Rel Exp in Python:
Current CTC:
Expected CTC:
Notice Period (Official):
Notice Period (Negotiable)/Reason:
Date of Birth:
PAN number:
Reason for Job Change:
Offer in Pipeline (Current Status):
Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention time):
Current Res Location:
Preferred Job Location:
Whether educational % in 10th std, 12th std, UG is all above 50%?
Do you have any gaps in between your education or career? If so, please mention the duration in months/years:
Posted 2 weeks ago
10.0 - 20.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description:
Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.
Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.
DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.
Security & Compliance
- Ensure compliance with security standards and best practices.
Migration & Optimization
- Support cloud migration projects from on-premise or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.
Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.
Required Skills and Qualifications:
- Google Cloud certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
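The IaC duty above rests on the desired-state idea that tools like Terraform implement: declare what should exist, diff it against what does exist, and apply only the difference. A toy Python sketch of that plan step (resource names and shapes are invented; this is not how Terraform is implemented, just the concept):

```python
# Toy illustration of the desired-state reconciliation behind IaC tools:
# compare declared resources with current ones and emit a "plan" of actions.
# All resource names and specs below are invented for the sketch.

def plan(desired, current):
    """Return the create/update/delete actions needed to reach `desired`."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name))
        elif current[name] != spec:
            actions.append(("update", name))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))
    return actions
```

Because the plan is computed rather than hand-written, applying it repeatedly is idempotent: once reality matches the declaration, the plan is empty.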
Posted 2 weeks ago
8.0 - 12.0 years
12 - 18 Lacs
Noida, Pune, Bengaluru
Work from Office
Responsibilities:
- Lead the technical discovery process, assess customer requirements, and design scalable solutions leveraging a comprehensive suite of Data & AI services, including BigQuery, Dataflow, Generative AI solutions, and advanced AI/ML services such as Vertex AI, Gemini, and Agent Builder.
- Architect and demonstrate solutions leveraging generative AI, large language models (LLMs), AI agents, and agentic AI patterns to automate workflows, enhance decision-making, and create intelligent applications.
- Develop and deliver compelling product demonstrations, proofs-of-concept (POCs), and technical workshops that showcase the value and capabilities of Google Cloud.
- Collaborate with sales to build strong client relationships, articulate the business value of Google Cloud solutions, and drive adoption.
- Lead and contribute technical content and architectural designs for RFI/RFP responses and technical proposals leveraging Google Cloud services.
- Stay informed of industry trends, competitive offerings, and new Google Cloud product releases, particularly in the infrastructure and data/AI domains.
Requirements:
- Extensive experience in architecting and designing solutions on Google Cloud Platform, with a strong focus on Data & AI services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI (MLOps, custom models, pre-trained APIs), and Generative AI (e.g., Gemini).
- Strong understanding of data warehousing, data lakes, streaming analytics, and machine learning pipelines.
- Strong understanding of cloud architecture patterns, DevOps practices, and modern software development methodologies.
- Ability to work effectively in a cross-functional team environment with sales, product, and engineering teams.
- 5+ years of experience in pre-sales or solutions architecture, focused on cloud Data & AI platforms.
- Skilled in client engagements, technical presentations, and proposal development.
- Excellent written and verbal communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Location: Noida, Pune, Bengaluru, Hyderabad, Chennai
Posted 2 weeks ago
5.0 - 10.0 years
25 - 35 Lacs
Bengaluru
Remote
Responsibilities: The Senior Google Cloud Platform (GCP) Administrator will be responsible for deploying, automating, and maintaining cloud infrastructure, as well as testing and supporting a diverse range of environments. In addition, this person is responsible for the entire functioning of our GCP environment.
Experience required:
- 5+ years of experience in GCP and BigQuery administration.
- Must have experience with various GCP cloud services (Dataflow, Composer, Cloud Functions, BigQuery, App Engine, load balancing, and GKE) and with AWS.
- Experience in cloud architecture and design, on both GCP and AWS.
- Experience in Tableau and Power BI preferred.
- Collaborate with stakeholders to define cloud strategies that align with business goals.
- Develop and implement automated deployment pipelines for multi-cloud environments.
- Automate provisioning using Terraform (IaC), as well as scaling and monitoring processes.
- Implement DevOps best practices, such as Continuous Integration/Continuous Deployment (CI/CD), version control, and automated testing, using tools like Jenkins and GitHub Actions.
- Implement security best practices for multi-cloud environments, including identity and access management (IAM), encryption, and compliance.
- Manage enterprise and open-source Kafka clusters; automate configuration tasks such as patching and setup of new Kafka clusters using Ansible (roles, modules, Jinja templates).
- Automate various manual activities using Python.
- Experience in Docker and Kubernetes (K8s).
- Experience in Informatica.
- Very detail-oriented in planning, implementation, documentation, and follow-up.
- Excellent verbal and written communication skills; team-player attitude with experience working in a collaborative environment; able to work independently or with a team.
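One recurring pattern in the "automate manual activities using Python" duty above is wrapping flaky operational calls (API requests during patching, cluster setup steps) in retries with backoff. A minimal sketch, where the task function is a stand-in rather than any real GCP or Kafka API:

```python
# Sketch of a common ops-automation helper: retry a flaky task with
# exponential backoff. `task` is a stand-in callable, not a real cloud API.
import time

def with_retries(task, attempts=3, base_delay=0.01):
    """Run `task`, retrying failures with exponentially growing delays."""
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the last error
            time.sleep(base_delay * (2 ** attempt))
```

In practice one would narrow the caught exception type and log each retry so the monitoring stack can alert on persistent failures.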
Posted 2 weeks ago
4.0 - 6.0 years
20 - 22 Lacs
Bengaluru
Work from Office
Role Overview: We are looking for an ML/AI/GenAI expert with 4-7 years of experience to execute AI/GenAI use cases as POCs. Media domain experience (OTT, DTH, Web) is a plus.
Key Responsibilities:
- Identify, define, and deliver AI/ML and GenAI use cases in collaboration with business and technical stakeholders.
- Design, develop, and deploy models (ML and GenAI) using Google Cloud's Vertex AI platform.
- Fine-tune and evaluate LLMs for domain-specific applications, ensuring responsible AI practices.
- Collaborate with data engineers and architects to ensure robust, scalable, and secure data pipelines feeding ML models.
- Document solutions, workflows, and experiments to support reproducibility, transparency, and handover readiness.
Core Skills:
- Strong foundation in machine learning and deep learning, including supervised, unsupervised, and reinforcement learning.
- Hands-on experience with Vertex AI, including AutoML, Pipelines, Model Registry, and Generative AI Studio.
- Experience with LLMs and GenAI workflows, including prompt engineering, tuning, and evaluation.
- Proficiency in Python and ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn, Hugging Face Transformers).
- Strong collaboration and communication skills to work cross-functionally with data, product, and business teams.
Technical Skills:
- Vertex AI on Google Cloud: model training, deployment, endpoint management, and MLOps tooling.
- GenAI tools and APIs: hands-on with PaLM, Gemini, or other large language models via Vertex AI or open source.
- Python: proficient in scripting ML pipelines, data preprocessing, and model evaluation.
- ML/GenAI libraries: scikit-learn, TensorFlow, PyTorch, Hugging Face, LangChain.
- Cloud & DevOps: experience with GCP services (BigQuery, Cloud Functions, Cloud Storage), CI/CD for ML, and containerization (Docker/Kubernetes).
- Experience in the media domain (OTT, DTH, Web) and handling large-scale media datasets.
Immediate joiners preferred.
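The LLM evaluation duty above often starts with simple offline metrics before reaching for platform tooling. As an illustrative sketch only (Vertex AI has its own evaluation services; this stand-in just shows the shape of a normalised exact-match check over model outputs):

```python
# Minimal offline GenAI evaluation sketch: score model answers against
# references with case/whitespace-normalised exact match. Illustrative
# stand-in, not Vertex AI's evaluation API.

def normalise(text):
    """Lowercase and collapse whitespace so trivial differences don't count."""
    return " ".join(text.lower().split())

def exact_match_rate(predictions, references):
    """Fraction of predictions matching their reference after normalising."""
    hits = sum(normalise(p) == normalise(r)
               for p, r in zip(predictions, references))
    return hits / len(references)
```

Exact match is a deliberately strict baseline; for free-form generation it is usually supplemented with fuzzier measures (ROUGE, embedding similarity, or model-graded rubrics).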
Posted 2 weeks ago
10.0 - 18.0 years
0 - 0 Lacs
Chennai
Work from Office
We are searching for a talented Salesforce Engineer IV to join our fun and committed Enterprise Applications team and get plugged into our quickly growing company. You'll get the chance to lead teams, build multiple enterprise custom Salesforce applications, and guide the formation of a high-performing Salesforce organization. Your commitment to removing organizational inefficiencies and maintaining a healthy application environment will play a key role in enabling future growth. You will be integral to building highly scalable business application solutions and developing innovative solutions for business process automation. You will also impact people in a meaningful way and work with a highly collaborative, passionate group of Product and Technology professionals.
What you will do:
- Work with domain teams to scope and estimate work
- Provide technical leadership and guidance to the broader team about platform fit and approach
- Develop and utilize coding best practices, standards, and frameworks for implementation by engineers on the team
- Design and implement systems in Salesforce to support automation of our CRM, SaaS operations, and customer support processes
- Participate in storyboard and solution design sessions: recommend alternative approaches and best practices, define technical impact, and provide sizing estimates
- Follow an iterative software development methodology and contribute to all phases of the software development lifecycle and support processes
- Build integration components to transition data to and from various systems
- Manage Salesforce integration with existing systems and third-party providers
- Assist in the hiring and development of a high-performing development team
What you should have:
- 7-10 years of experience as an SFDC developer with Sales Cloud and Service, including the use of data tools (e.g., Data Loader, BigQuery)
- Knowledge of web services and REST APIs
- Experience with VS Code, Apex, Lightning Web Components, and Flows
- Experience with frameworks and best practices in the Salesforce ecosystem
- Highly organized, success-driven individual with a "can do" attitude
- Ability to organize and communicate technical requirements and ideas with team members of varying technical experience
- Interest and ability in taking a leadership role within the Enterprise Apps team
Preferred:
- SFDC Platform Developer I, SFDC Platform Developer II, and JavaScript Developer I certifications
- BS/MS in Computer Science or a similar engineering-intensive program
- Experience with FSL, CPQ, Territory Management, SF Maps, CI/CD (AutoRABIT)
- Experience in a technology company a plus
- Experience with event-based architecture and design
- Experience interfacing with business and technical teams
Suggested certifications: Certified Administrator I, Platform App Builder, Platform Developer I/II, Integration Architecture Designer, Identity and Access Management Designer, Development Lifecycle and Deployment Designer
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Role Definition: Plan and execute the removal, modification, rework, and installation of package controls and package systems to upgrade industrial gas turbine packages to customer specifications and schedule requirements. Apply knowledge in related turbo-machinery fields, and conform to all EHS (Environment, Health, Safety & Security), quality, electrical, and Solar standards during the performance of duties.
Responsibilities:
- Use work permit program understanding and compliance to execute job responsibilities.
- Participate in general safety meetings/briefings and submit safety suggestions as appropriate.
- Plan, execute, and/or assist in the removal of obsolete material, including control systems, interconnect wiring, package system components, cold loop checks prior to demobilizing, and conduit/cable tray & tubing per project specifications.
- Install and/or assist in the placement of new control consoles.
- Plan, develop, and execute the layout for all new package system components, and the replacement and/or modification of all conduit/cable tray & tubing necessary to accommodate new controls, components, and package systems per design specifications.
- Rewire package junction box(es), new components, and package interconnect wiring per engineering specifications.
- Provide leadership and customer support on projects of lower complexity, and support the technical and administrative development of less experienced field technicians.
Skill Descriptors
Service Excellence: Knowledge of customer service concepts and techniques; ability to meet or exceed customer needs and expectations and provide excellent service directly or indirectly.
- Provides a quality of service described by customers as excellent.
- Resolves common customer problems.
- Responds to unexpected customer requests with a sense of urgency and positive action.
- Provides direct service to internal or external customers.
- Documents customer complaints in a timely manner.
Initiative: Being proactive and committing to action on self-identified job responsibilities and challenges; ability to seek out work and the drive to accomplish goals.
- Identifies and exploits own strengths; minimizes limitations.
- Provides appropriate degrees of attention to both personal and professional priorities.
- Explains how own motivation relates to the workplace.
- Utilizes available tools or approaches to increase knowledge of self-motivation.
- Learns and uses resources the organization has to assess and enhance team motivation.
Problem Solving: Knowledge of approaches, tools, and techniques for recognizing, anticipating, and resolving organizational, operational, or process problems; ability to apply problem-solving knowledge appropriately to diverse situations.
- Identifies and documents specific problems and resolution alternatives.
- Examines a specific problem and understands the perspective of each involved stakeholder.
- Develops alternative techniques for assessing accuracy and relevance of information.
- Helps to analyze risks and benefits of alternative approaches and obtain decisions on resolution.
- Uses fact-finding techniques and diagnostic tools to identify problems.
Technical Excellence: Knowledge of a given technology and various application methods; ability to develop and provide solutions to significant technical challenges.
- Provides effective technical solutions to routine functional challenges via sound technical competence, effectively examining implications of events and issues.
- Effectively performs the technical aspects of the job, continuously building knowledge and keeping up to date on technical and procedural job components.
- Applies technical operating and project standards based on achieving excellence in delivered products, technologies, and services.
- Applies current procedures and technologies to help resolve technical issues in one's general area of technical competence.
- Helps others solve technical or procedural problems or issues.
Power Generation: Knowledge of working principles, methods, equipment, and processes of power generation; ability to apply the knowledge appropriately within the power supply sector.
- Explains the roles and responsibilities of power generation within the electric power industry.
- Identifies the features and properties of the power generation sector.
- Describes the working principles of turbines and power generators.
- Documents relevant laws and regulations within the power generation sector.
Safety (Oil and Gas): Knowledge of procedures, practices, considerations, and regulatory requirements for the safety and protection of workers, community, environment, and company assets; ability to identify and respond accordingly to work-related hazards.
- Describes own experience working with safety practices and equipment.
- Discusses procedures for identifying and reporting safety violations and accidents.
- Relates incidents with product-specific hazards and associated first aid response.
- Identifies training and documentation on safety and injury prevention procedures.
- Identifies personal protective equipment required or recommended for manufacturing staff.
Oil and Gas Equipment: Knowledge of various types of equipment used in the oil and gas industry and the systems and processes involved in the exploration, production, and refining of oil and gas; ability to operate, maintain, troubleshoot, and repair equipment used in the oil and gas industry.
- Demonstrates an understanding of basic principles of pumps, compressors, and other equipment used in the oil and gas industry.
- Understands the purpose and function of common oil and gas equipment, such as separators, heat exchangers, and valves.
- Describes common types of oil and gas equipment and explains their basic operation.
- Explains basic principles of hydraulic and pneumatic systems used in oil and gas equipment.
Troubleshooting Technical Problems: Knowledge of troubleshooting approaches, tools, and techniques; ability to anticipate, detect, and resolve technical problems effectively.
Posted 2 weeks ago
8.0 - 13.0 years
19 - 25 Lacs
Bengaluru
Work from Office
In this role, you will play a key part in designing, building, and optimizing scalable data products within the Telecom Analytics domain. You will collaborate with cross-functional teams to implement AI-driven analytics, autonomous operations, and programmable data solutions. This position offers the opportunity to work with cutting-edge Big Data and Cloud technologies, enhance your data engineering expertise, and contribute to advancing Nokia's data-driven telecom strategies. If you are passionate about creating innovative data solutions, mastering cloud and big data platforms, and working in a fast-paced, collaborative environment, this role is for you!
You have:
- Bachelor's or master's degree in computer science, Data Engineering, or a related field, with 8+ years of experience in data engineering focused on Big Data, Cloud, and Telecom Analytics.
- Hands-on expertise in Ab Initio for data cataloguing, metadata management, and lineage.
- Skills in data warehousing, OLAP, and modelling using BigQuery, ClickHouse, and SQL.
- Experience with data persistence technologies like S3, HDFS, and Iceberg.
- Hands-on experience with Python and scripting languages.
It would be nice if you also had:
- Experience with data exploration and visualization using Superset or BI tools.
- Knowledge of ETL processes and streaming tools such as Kafka.
- Background in building data products for the telecom domain and an understanding of AI and machine learning pipeline integration.
Responsibilities:
- Data Governance: Manage source data within the Metadata Hub and Data Catalog.
- ETL Development: Develop and execute data processing graphs using Express It and the Co-Operating System.
- ETL Optimization: Debug and optimize data processing graphs using the Graphical Development Environment (GDE).
- API Integration: Leverage Ab Initio APIs for metadata and graph artifact management.
- CI/CD Implementation: Implement and maintain CI/CD pipelines for metadata and graph deployments.
- Team Leadership & Mentorship: Mentor team members and foster best practices in Ab Initio development and deployment.
Posted 2 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational qualifications: Master of Engineering, MCA, MTech, BTech, BE, BCA, Bachelor of Engineering, Bachelor of Science, Master of Science
Service Line: Application Development and Maintenance
Responsibilities:
- Design and implement cloud-native solutions on Google Cloud Platform
- Deploy and manage infrastructure using Terraform, Cloud Deployment Manager, or similar IaC tools
- Manage GCP services such as Compute Engine, GKE (Kubernetes), Cloud Storage, Pub/Sub, Cloud Functions, BigQuery, etc.
- Optimize cloud performance, cost, and scalability
- Ensure security best practices and compliance across the GCP environment
- Monitor and troubleshoot issues using Stackdriver/Cloud Monitoring
- Collaborate with development, DevOps, and security teams
- Automate workflows and CI/CD pipelines using tools like Jenkins, GitLab CI, or Cloud Build
Additional Responsibilities:
- GCP professional certification (e.g., Professional Cloud Architect, Cloud Engineer)
- Experience with hybrid cloud or multi-cloud architecture
- Exposure to other cloud platforms (AWS/Azure) is a plus
- Strong communication and teamwork skills
Technical and Professional Requirements:
- 3-5 years of hands-on experience with GCP
- Strong expertise in Terraform, GCP networking, and cloud security
- Proficient in container orchestration using Kubernetes (GKE)
- Experience with CI/CD, DevOps practices, and shell scripting or Python
- Good understanding of IAM, VPC, firewall rules, and service accounts
- Familiarity with monitoring/logging tools like Stackdriver or Prometheus
- Strong problem-solving and troubleshooting skills
Preferred Skills: .Net, Java, Python, Java-Springboot, Cloud Platform - Google Cloud Platform Developer - GCP/Google Cloud
Posted 2 weeks ago
5.0 - 7.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational qualifications: Bachelor of Engineering, BTech, BSc, BCom, MTech, MSc
Service Line: Cloud & Infrastructure Services
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
1. Big data, RDBMS, Google BigQuery administration
2. PostgreSQL DBA
Preferred Skills:
- Technology-Big Data-Oracle BigData Appliance
- Technology-Database-Database - RDBMS Others
- Technology-Database Administration-PostgreSQL
- Technology-Cloud Platform-GCP Database-Google BigQuery
Posted 2 weeks ago
5.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Educational Bachelor of Engineering,BTech,Bachelor Of Technology,BCA,BSc,MTech,MCA Service Line Application Development and Maintenance Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with organization’s financial guidelines Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred LocationsBangalore, Hyderabad, Chennai, Pune Experience Required3 to 5 years of experiencePure hands on and expertise on the skill, able to deliver without any support Experience Required5 - 9 years of experienceDesign knowledge, estimation technique, leading and guiding the team on technical solution Experience Required9 - 13 years of experienceArchitecture, Solutioning, (Optional) proposal Containerization, micro service development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. 
Implementing automated testing platforms and unit tests. Strong experience in building and developing applications using technologies like Python. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices. Exposure to cloud compute services like VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP. Good understanding of application development design patterns.
Technical and Professional: Primary Skill: Python; Secondary Skills: AWS/Azure/GCP
Preferred Skills: Technology-Machine Learning-Python
Generic Skills: Technology-Cloud Platform-AWS App Development; Technology-Cloud Platform-Azure Development & Solution Architecting; Technology-Cloud Platform-GCP DevOps
Posted 2 weeks ago
5.0 - 9.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering, BSc, BCA, MCA, MSc, MTech
Service Line: Data & Analytics Unit
Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! 
Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.
Technical and Professional: Technology-Cloud Platform-GCP Data Analytics-Looker; Technology-Cloud Platform-GCP Database-Google BigQuery
Preferred Skills: Technology-Cloud Platform-Google Big Data; Technology-Cloud Platform-GCP Data Analytics
Posted 2 weeks ago
12.0 - 19.0 years
30 - 40 Lacs
Pune, Chennai, Bengaluru
Work from Office
Strong understanding of data warehousing and data modeling. Proficient understanding of distributed computing principles: Hadoop v2, MapReduce, HDFS. Strong data engineering skills on GCP cloud platforms: Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, BigQuery. Experience with building stream-processing systems using solutions such as Storm or Spark Streaming. Good knowledge of Big Data querying tools such as Pig, Hive, and Impala. Experience with Spark and SQL. Knowledge of various ETL techniques and frameworks, such as Flume or Apache NiFi. Experience with messaging systems such as Kafka. Good understanding of Lambda Architecture, along with its advantages and drawbacks.
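For candidates new to the terminology above: the speed layer of a Lambda Architecture ultimately performs windowed aggregation over an event stream. A minimal plain-Python sketch of the micro-batching idea behind Spark Streaming (the event tuples and window size below are hypothetical, chosen only to illustrate the mechanism):

```python
from collections import defaultdict

def micro_batch_counts(events, window_size):
    """Group a stream of (timestamp, key) events into fixed windows
    and count occurrences per key -- the core of a micro-batch job."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_size)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical event stream: (timestamp_seconds, event_name)
events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
result = micro_batch_counts(events, window_size=10)
# Window starting at 0 holds three events; window starting at 10 holds one.
```

A real Spark Streaming job would express the same idea with `reduceByKeyAndWindow` over an input DStream rather than an in-memory list.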
Posted 2 weeks ago
5.0 - 7.0 years
19 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.
Qualifications:
Should have 4+ years of experience in ETL & Data Warehousing
Should have excellent leadership and communication skills
Should have experience in developing Data Engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc.
Should have built solution automations in any of the above ETL tools
Should have executed at least 2 GCP Cloud Data Warehousing projects
Should have worked on at least 2 projects using Agile/SAFe methodology
Should have mid-level experience in PySpark and Teradata
Should have working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc.
Should have experience with semi-structured data formats like JSON, Parquet, and/or XML files
Should have written complex SQL queries for data analysis and extraction
Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping
Education: B.Tech./B.E. in Computer Science or related field.
Certifications: Google Cloud Professional Data Engineer Certification. 
Roles and Responsibilities
Analyze the different source systems, profile data, and understand, document, and fix Data Quality issues
Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users
Write complex SQL queries to extract and format source data for the ETL/data pipeline
Create design documents, Source-to-Target Mapping documents, and any supporting documents needed for deployment/migration
Design, develop, and test ETL/data pipelines
Design and build metadata-based frameworks needed for data pipelines
Write unit test cases, execute unit testing, and document unit test results
Deploy ETL/data pipelines
Use DevOps tools to version, push/pull code, and deploy across environments
Support the team during troubleshooting and debugging of defects, bug fixes, business requests, environment migrations, and other ad hoc requests
Provide production support, enhancements, and bug fixes
Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
Leverage ITIL concepts to prevent incidents, manage problems, and document knowledge
Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources
Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem
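As a rough illustration of the "write complex SQL to extract and format source data" responsibility, here is a Source-to-Target aggregation sketched against an in-memory SQLite table. Table and column names are invented for the example; a real pipeline would run equivalent SQL against BigQuery or Teradata:

```python
import sqlite3

# Hypothetical source table: raw orders, to be shaped into
# daily revenue per customer for the target warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INT, customer TEXT,
                             order_date TEXT, amount REAL);
    INSERT INTO src_orders VALUES
        (1, 'acme',  '2024-01-05', 100.0),
        (2, 'acme',  '2024-01-05',  50.0),
        (3, 'bravo', '2024-01-06',  75.0);
""")
rows = conn.execute("""
    SELECT order_date, customer, SUM(amount) AS daily_revenue
    FROM src_orders
    GROUP BY order_date, customer
    ORDER BY order_date, customer
""").fetchall()
# One aggregated row per (date, customer) pair.
```

The same shape of query, with joins and filters layered on, is what a Source-to-Target Mapping document typically specifies column by column.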
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Why this job matters: We are searching for a proficient AI/ML engineer who can help us extract value from our data. The resource will be responsible for E2E processes including data collection, cleaning and pre-processing, training of the models, and deployment in all production and non-production environments.
What you'll be doing:
Understanding business objectives and developing models that help to achieve them, along with metrics to track their progress.
Analysing the ML algorithms that could be used to solve a given problem and ranking them by their success probability.
Verifying data quality, and/or ensuring it via data cleaning.
Supervising the data acquisition process if more data is needed.
Defining validation strategies.
Defining the pre-processing or feature engineering to be done on given data.
Defining data augmentation pipelines.
Training models and tuning their hyperparameters.
Analysing the errors of the model and designing strategies to overcome them.
Performing statistical analysis and fine-tuning using test results.
Training and retraining systems when necessary.
Strong knowledge of model deployment pipelines (MLOps) and of AWS/GCP deployment.
Skills Required:
Proven experience (4 or more years) as a Machine Learning Engineer/Artificial Intelligence Engineer or similar role.
Solving business problems using Machine Learning algorithms, Deep Learning/Neural Network algorithms, sequential model development, and time series data modelling.
Experience with Computer Vision techniques, Convolutional Neural Networks (CNN), Generative AI, and Large Language Models (LLMs).
Experience with deploying models using MLOps pipelines.
Proficiency in handling both structured and unstructured data, including SQL, BigQuery, and Dataproc.
Hands-on experience with API development using frameworks like Flask, Django, and FastAPI. 
Automating business and functional operations using AIOps. Experience with cloud platforms such as GCP and AWS, and tools like Qlik (added advantage). Understanding of data structures, data modelling, and software architecture. Expertise in visualizing and manipulating big datasets. Deep knowledge of math, probability, statistics, and algorithms. Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas. Knowledge of R or Java is a plus. Proficiency in TensorFlow or Keras and OpenCV is a plus. Excellent communication skills. Team player. Outstanding analytical and problem-solving skills. Familiarity with the Linux environment. Low to medium familiarity with JIRA, Git, Nexus, Jenkins, etc. is a plus. Minimum educational qualification: BE/B.Tech or similar degree in a relevant field.
The skills you'll need: Troubleshooting; Agile Development; Database Design/Development; Debugging; Programming/Scripting; Microservices/Service-Oriented Architecture; Version Control; IT Security; Cloud Computing; Continuous Integration/Continuous Deployment; Automation & Orchestration; Software Testing; Application Development; Algorithm Design; Software Development Lifecycle; Decision Making; Growth Mindset; Inclusive Leadership
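The "training models and tuning their hyperparameters" step in the list above can be sketched, under heavy simplification, as a grid search over a held-out validation split. The scores, labels, and grid here are invented for illustration; real work would use scikit-learn's model-selection utilities (e.g. `GridSearchCV`) rather than a one-parameter toy model:

```python
def accuracy(threshold, data):
    """Classify score >= threshold as positive; return accuracy
    against (score, label) pairs."""
    return sum((score >= threshold) == label for score, label in data) / len(data)

# Hypothetical (score, label) pairs split into training and validation sets.
train = [(0.1, False), (0.4, False), (0.6, True), (0.9, True)]
val = [(0.4, False), (0.6, True)]

# Grid search: the decision threshold is the single "hyperparameter";
# pick the value that maximises accuracy on the validation split.
grid = [0.3, 0.5, 0.7]
best = max(grid, key=lambda t: accuracy(t, val))
```

The same pattern scales up unchanged: replace the threshold with a learning rate or tree depth, and the accuracy call with a full train-and-evaluate cycle.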
Posted 2 weeks ago
3.0 - 5.0 years
13 - 17 Lacs
Hyderabad, Chennai
Work from Office
Role & responsibilities
Job Description: A detail-oriented and technically proficient Business Intelligence (BI) Engineer with strong Tableau expertise to support data analytics, dashboard development, and reporting initiatives. The ideal candidate has a solid background in SQL, data modeling, and visualization, with experience transforming raw data into actionable insights for business stakeholders.
Key Responsibilities:
• Design, build, and maintain Tableau dashboards and visualizations that communicate key business metrics.
• Collaborate with business analysts, data engineers, and stakeholders to gather requirements and transform them into technical solutions.
• Write and optimize SQL queries to extract, transform, and load data from various sources.
• Support data quality, validation, and integrity across reports and dashboards.
• Develop and maintain data models and ETL pipelines for BI use cases.
• Perform ad hoc analyses and provide insights to business teams across departments (e.g., Marketing, Finance, Sales).
• Assist in user training and documentation of BI solutions.
• Participate in code reviews, version control, and agile sprint ceremonies (if applicable).
Required Qualifications:
• 3-5 years of experience in BI engineering or data analytics roles.
• Proficiency in Tableau (Desktop and Server): creating interactive dashboards, storyboards, and advanced charts.
• Strong knowledge of SQL (PostgreSQL, MySQL, SQL Server, etc.).
• Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
• Familiarity with ETL tools (e.g., Talend, Informatica, Apache Airflow, dbt) is a plus.
• Understanding of data governance and security best practices.
• Ability to translate business needs into scalable BI solutions.
Nice to Have:
• Exposure to cloud platforms like AWS, Azure, or GCP.
• Knowledge of Agile/Scrum methodology.
• Experience in performance tuning of dashboards and SQL queries.
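The SQL side of dashboard work often comes down to window functions feeding trend charts. An illustrative running-total query against an in-memory SQLite table (table and column names are invented; requires SQLite 3.25+ for window-function support, and a warehouse like Snowflake or BigQuery accepts the same SQL shape):

```python
import sqlite3

# Hypothetical monthly sales table behind a Tableau trend chart.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (month TEXT, revenue REAL);
    INSERT INTO sales VALUES ('2024-01', 100), ('2024-02', 150), ('2024-03', 50);
""")
rows = conn.execute("""
    SELECT month, revenue,
           SUM(revenue) OVER (ORDER BY month) AS running_total
    FROM sales
    ORDER BY month
""").fetchall()
# Each row carries the month's revenue plus the cumulative total to date.
```

Pushing the cumulative calculation into SQL like this, rather than into a Tableau table calculation, is one common dashboard performance-tuning move.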
Posted 2 weeks ago
3.0 - 5.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design, build, and maintain scalable and robust data pipelines and ETL workflows using GCP services.
- Work extensively with BigQuery, Cloud Storage, Cloud Dataflow, and other GCP components to ingest, process, and transform large datasets.
- Leverage big data frameworks such as Apache Spark and Hadoop to process structured and unstructured data efficiently.
- Develop and optimize SQL queries and Python scripts for data transformation and automation.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Implement best practices for data quality, monitoring, and alerting for data workflows.
- Ensure compliance with data governance policies, including data privacy, security, and regulatory standards.
- Continuously improve system performance and reliability by identifying and resolving bottlenecks or inefficiencies.
- Participate in code reviews, architecture discussions, and technical planning.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related technical field.
- 3+ years of hands-on experience in data engineering, with a focus on large-scale data processing and cloud technologies.
- Strong expertise in Google Cloud Platform (GCP), particularly BigQuery, Cloud Composer, Cloud Storage, Dataflow, and Pub/Sub.
- Solid knowledge of SQL, Python, and scripting for automation and data manipulation.
- Practical experience with Apache Spark, Hadoop, or similar distributed computing frameworks.
- Familiarity with data modeling, warehousing concepts, and data pipeline orchestration.
- Understanding of data privacy, security, and governance in cloud environments.
- Excellent problem-solving skills and ability to work in a collaborative team environment. 
Preferred Skills (Good to Have) : - GCP Certification (e.g., Professional Data Engineer) - Experience with CI/CD pipelines and infrastructure-as-code tools (e.g., Terraform) - Exposure to Airflow or Cloud Composer for orchestration - Experience working in Agile development environments
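The orchestration tools named above (Airflow, Cloud Composer) resolve pipeline tasks as a DAG and run them in dependency order. A minimal sketch of that ordering using Python's standard-library `graphlib` (the task names are hypothetical; a real Airflow DAG would declare the same edges with operators and `>>`):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks
# that must complete before it can start.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}
# static_order() yields tasks so every dependency precedes its dependents,
# which is exactly the scheduling guarantee an orchestrator provides.
order = list(TopologicalSorter(dag).static_order())
```

`graphlib` is available from Python 3.9 onward; for branching DAGs the order among independent tasks is arbitrary, just as Airflow may run them in parallel.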
Posted 2 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Description: Work on data analytics to triage and investigate data quality and data pipeline exceptions and reporting issues.
Requirements: This role will primarily support Data Operations and Reporting related projects, but will also help with other projects as needed. In this role, you will leverage your strong analytical skills to triage and investigate data quality and data pipeline exceptions and reporting issues. The ideal candidate should be able to work independently and actively engage other functional teams as needed. This role requires researching transactions and events using large amounts of data.
Technical Experience/Qualifications:
• At least 5 years of experience in software development
• At least 5 years of SQL experience in any RDBMS
• Minimum 5 years of experience in Python
• Strong analytical and problem-solving skills
• Strong communication skills
• Strong experience with data modeling
• Strong experience in data analysis and reporting
• Experience with version control tools such as GitHub
• Experience with shell scripting and Linux
• Knowledge of agile and scrum methodologies
• Preferred: experience in Hive SQL or related technologies such as BigQuery
• Preferred: experience in Big Data technologies like Hadoop, AWS/GCP, S3, Hive, Impala, HDFS, Spark, MapReduce
• Preferred: experience in reporting tools such as Looker or Tableau 
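A minimal sketch of the data-quality triage this role describes: scan a batch of records for missing required fields and emit alerts for investigation. The record shapes and field names are invented for illustration; production checks would run inside the pipeline (e.g. as SQL assertions or a framework like Great Expectations):

```python
def run_quality_checks(rows, required_fields):
    """Return (row_index, problem) alerts for rows whose required
    fields are missing or empty."""
    alerts = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                alerts.append((i, f"missing {field}"))
    return alerts

# Hypothetical batch of transaction records to validate.
batch = [
    {"txn_id": "t1", "amount": 10.0},
    {"txn_id": "",   "amount": 5.0},
    {"txn_id": "t3", "amount": None},
]
alerts = run_quality_checks(batch, ["txn_id", "amount"])
# Rows 1 and 2 each trigger one alert.
```

Each alert then becomes the starting point for the triage described above: trace the offending rows back through the pipeline to the source system.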
• Preferred: experience in finance and accounting, but not required
Job Responsibilities:
• Develop SQL queries as per technical requirements
• Investigate and fix day-to-day data-related issues
• Develop test plans and execute test scripts
• Perform data validation and analysis
• Develop new reports/dashboards as per technical requirements
• Modify existing reports/dashboards for bug fixes and enhancements
• Develop new ETL scripts and modify existing ones for bug fixes and enhancements
• Monitor ETL processes and fix issues in case of failure
• Monitor scheduled jobs and fix issues in case of failure
• Monitor data quality alerts and act on them
What We Offer:
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. 
Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can drink coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!
Posted 2 weeks ago
5.0 - 8.0 years
11 - 14 Lacs
Gurugram
Work from Office
Senior Web Analyst
- 5 years of GCP experience, ideally in e-commerce
- Deep understanding of BigQuery
- Experience working with large data volumes (100bn-row tables)
- Working experience with GA4 event tables
- Performed A/B testing and insight generation
- Excellent communication skills
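The A/B testing bullet above typically reduces to a two-proportion significance test over conversion counts aggregated from GA4 event tables in BigQuery. A plain-Python sketch of that test (the counts below are made up; real analyses often use scipy or statsmodels instead):

```python
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion pooled z-test; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 2.0% vs 2.6% conversion over 10k users per arm.
z, p = ab_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
```

In practice the four inputs come from a single `GROUP BY experiment_variant` query over the GA4 export tables, and the test turns that query result into an insight about whether the lift is real.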
Posted 2 weeks ago
8.0 - 12.0 years
15 - 25 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Must-Have Skills:
1. 12+ years of experience in marketing technology, data integration, and privacy management within Adobe's suite of tools.
2. Experience in server-side implementation on the Adobe or Google tech stack.
3. Proficiency in Adobe Experience Platform (AEP), Adobe Real-Time CDP, and Adobe Analytics.
4. Strong understanding of the Adobe Experience Data Model (XDM) and Event Schemas.
5. Knowledge of identity resolution techniques and technologies.
6. Deep understanding of data privacy laws, including GDPR, and of PII handling as it applies within Adobe systems.
7. Strong API integration experience, specifically with Adobe RTCDP and Workfront.
8. Strong understanding of data pipelines, data modelling, and data governance best practices.
9. Hands-on experience with AEP SDK implementation and integration, ensuring optimal tracking and data collection for mobile and web applications.
10. Familiarity with the broader marketing technology landscape, including integration with various marketing, analytics, and personalization platforms.
11. Effective communication skills, adept at presenting complex insights to stakeholders in a clear and impactful manner.
Posted 2 weeks ago