Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
6.0 years
2 - 2 Lacs
India
On-site
Job Title : HR Recruiter (Female) – Overseas Healthcare Hiring Location : Hegde Nagar, Bangalore Industry : Overseas Recruitment (Healthcare Sector) Target Country : Saudi Arabia (Primary), UAE, and GCC Working Days : Monday to Saturday Salary : ₹18,000 – ₹21,000 (Based on experience) Job Overview We are seeking a passionate and dynamic HR Recruiter to join our team specializing in healthcare recruitment for overseas placements, primarily for Saudi Arabia. The ideal candidate will have hands-on experience in sourcing, screening, and onboarding nurses and medical professionals for international clients. Key Responsibilities Source and screen qualified candidates (BSc/GNM Nurses, Technicians, etc.) through job portals, social media, and internal databases Coordinate interviews and follow up with candidates and clients Manage documentation for visa, DataFlow, and Prometric processes Maintain candidate database and regularly update application statuses Collaborate with clients and handle end-to-end recruitment for healthcare projects Ensure timely onboarding and deployment of selected candidates Maintain professional communication with candidates throughout the recruitment cycle Requirements Bachelor’s degree (Any discipline; HR/Management preferred) Minimum 2–6 years of experience in healthcare or Gulf hiring Strong understanding of overseas recruitment processes Familiarity with medical licensing procedures like Prometric, DataFlow, MOH/HAAD/DHA is a plus Excellent communication & interpersonal skills Ability to handle multiple openings and meet tight deadlines Knowledge of recruitment tools like Naukri, Indeed, LinkedIn, etc. What We Offer Fixed Salary: ₹18,000 to ₹20,000 (based on profile) Performance incentives for each successful hire Opportunity to grow within the international recruitment sector Professional and supportive work environment To Apply, send your resume to: blr@newcalicuttravels.com Contact : +91 96064 59670 | +91 95135 44441 | +91 84343 44444 Job Type: Full-time Pay: ₹18,775.64 - ₹20,518.27 per month Benefits: Health insurance Experience: Recruiting: 1 year (Preferred) Language: Malayalam (Required) Work Location: In person
Posted 2 weeks ago
4.0 years
4 - 6 Lacs
Noida
On-site
4+ years of industry experience in the field of Data Engineering. Proficient in Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Cloud Storage and Pub/Sub. Strong understanding of data pipeline architectures and ETL processes. Experience with the Python programming language for data processing. Knowledge of SQL and experience with relational databases. Familiarity with version control systems like Git. Ability to analyze, troubleshoot, and resolve complex data pipeline issues. Software engineering experience in optimizing data pipelines to improve performance and reliability. Continuously optimize data pipeline efficiency and reduce operational costs and the number of issues/failures. Automate repetitive tasks in data processing and management. Experience in monitoring and alerting for data pipelines. Continuously improve data pipeline reliability through analysis and testing. Perform SLA-oriented monitoring for critical pipelines and provide suggestions, as well as implement them post business approval, for SLA adherence if needed. Monitor performance and reliability of GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs. Maintain infrastructure reliability for GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs. Conduct post-incident reviews and implement improvements for data pipelines. Develop and maintain documentation for data pipeline systems and processes. Excellent communication and documentation skills. Strong problem-solving and analytical skills. Open to work in a 24x7 shift.
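By way of illustration of the Dataflow pipeline work this role describes, below is a minimal batch pipeline sketch using the Apache Beam Python SDK. It is indicative only: the project, bucket, dataset, and schema names are hypothetical, and a production job would add validation, dead-letter handling, and monitoring.

```python
# Illustrative sketch only - project, bucket, and table names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_order(line: str) -> dict:
    """Convert one CSV line into a BigQuery-ready row."""
    order_id, amount, country = line.split(",")
    return {"order_id": order_id, "amount": float(amount), "country": country}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",            # "DirectRunner" for local testing
        project="my-gcp-project",           # hypothetical project id
        region="asia-south1",
        temp_location="gs://my-temp-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-raw-bucket/orders/*.csv")
            | "ParseRows" >> beam.Map(parse_order)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.orders",
                schema="order_id:STRING,amount:FLOAT,country:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```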
Posted 2 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Us : At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service - all backed by TELUS, our multi-billion dollar telecommunications parent. Required Skills: 4+ years of industry experience in the field of Data Engineering. Proficient in Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Cloud Storage and Pub/Sub. Strong understanding of data pipeline architectures and ETL processes. Experience with the Python programming language for data processing. Knowledge of SQL and experience with relational databases. Familiarity with version control systems like Git. Ability to analyze, troubleshoot, and resolve complex data pipeline issues. Software engineering experience in optimizing data pipelines to improve performance and reliability. Continuously optimize data pipeline efficiency and reduce operational costs and the number of issues/failures. Automate repetitive tasks in data processing and management. Experience in monitoring and alerting for data pipelines. Continuously improve data pipeline reliability through analysis and testing. Perform SLA-oriented monitoring for critical pipelines and provide suggestions, as well as implement them post business approval, for SLA adherence if needed. Monitor performance and reliability of GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs. Maintain infrastructure reliability for GCP data pipelines, Informatica ETL workflows, MDM and Control-M jobs. Conduct post-incident reviews and implement improvements for data pipelines. Develop and maintain documentation for data pipeline systems and processes. Excellent communication and documentation skills. Strong problem-solving and analytical skills. Open to work in a 24x7 shift.
Posted 2 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In Your Role, You Will Be Responsible For Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations Ability to analyse data for functional business requirements and interface directly with customers Preferred Education Master's Degree Required Technical And Professional Expertise 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementation that the candidate has done; should understand the purpose/KPIs for which the data transformation was done. Preferred Technical And Professional Experience Experience with AEM Core Technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization Familiarity with build tools such as Jenkins and Maven, knowledge of version control tools, especially Git, knowledge of patterns and good practices to design and develop quality and clean code, knowledge of HTML, CSS, JavaScript and jQuery Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
DevLabs Technology is looking to fill the below position on a contractual basis and seeks your support. Role : DevOps Senior Engineer. Location : Pune Required Past Experience: ● 4 to 8 years of demonstrated relevant experience deploying, configuring and supporting public cloud infrastructure (GCP as primary), IaaS and PaaS. ● Experience in configuring and managing the GCP infrastructure environment components. ● Foundation components: Networking (VPC, VPN, Interconnect, Firewall and Routes), IAM, Folder Structure, Organization Policy, VPC Service Control, Security Command Centre, etc. ● Application Components: BigQuery, Cloud Composer, Cloud Storage, Google Kubernetes Engine (GKE), Compute Engine, Cloud SQL, Cloud Monitoring, Dataproc, Data Fusion, Bigtable, Dataflow, etc. ● Operational Components: Audit Logs, Cloud Monitoring, Alerts, Billing Exports, etc. ● Security Components: KMS, Secret Manager, etc. ● Experience with infrastructure automation using Terraform. ● Experience in designing and implementing CI/CD pipelines with Cloud Build, Jenkins, GitLab, Bitbucket Pipelines, etc., and source code management tools like Git. ● Experience with scripting: Shell scripting and Python. Required Skills and Abilities: ● Mandatory Skills: GCP Networking & IAM, Terraform, Shell Scripting/Python Scripting, CI/CD Pipelines ● Secondary Skills: Composer, BigQuery, GKE, Dataproc, GCP Networking ● Good To Have – Certifications in any of the following: Cloud DevOps Engineer, Cloud Security Engineer, Cloud Network Engineer ● Good verbal and written communication skills. ● Strong team player.
Posted 2 weeks ago
6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results. How You Will Contribute You will: Work in close partnership with the business leadership team to execute the analytics agenda Identify and incubate best-in-class external partners to drive delivery on strategic projects Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data analysis to make recommendations to senior leaders Technical experience in roles in best-in-class analytics practices Experience deploying new analytical approaches in a complex and highly matrixed organization Savvy in usage of the analytics techniques to create business impacts As part of the Global MSC (Mondelez Supply Chain) Data & Analytics team, you will support our business to uncover trends that can drive long-term business results. In this role, you will be a key technical leader in developing our cutting-edge Supply Chain Data Product ecosystem. You'll have the opportunity to design, build, and automate data ingestion, harmonization, and transformation processes, driving advanced analytics, reporting, and insights to optimize Supply Chain performance across the organization. You will play an instrumental part in engineering robust and scalable data solutions, acting as a hands-on expert for Supply Chain data, and contributing to how these data products are visualized and interacted with. What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: SAP Data Expertise: Deep hands-on experience in extracting, transforming, and modeling data from SAP ECC/S4HANA (modules like MM, SD, PP, QM, FI/CO) and SAP BW/HANA. Proven ability to understand SAP data structures and business processes within Supply Chain. Cloud Data Engineering (GCP Focused): Strong proficiency and hands-on experience in data warehousing solutions and data engineering services within the Google Cloud Platform (GCP) ecosystem (e.g., BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub). Data Pipeline Development: Design, build, and maintain robust and efficient ETL/ELT processes for data integration, ensuring data accuracy, integrity, and timeliness. BI & Analytics Enablement: Collaborate with data scientists, analysts, and business users to provide high-quality, reliable data for their analyses and models. Support the development of data consumption layers, including dashboards (e.g., Tableau, Power BI). Hands-on experience with Databricks (desirable): ideally deployed on GCP or with GCP integration for large-scale data processing, Spark-based transformations, and advanced analytics. System Monitoring & Optimization (desirable): Monitor data processing systems and pipelines to ensure efficiency, reliability, performance, and uptime; proactively identify and resolve bottlenecks. Industry Knowledge: Solid understanding of the consumer goods industry, particularly Supply Chain processes and relevant key performance indicators (KPIs). 
What extra ingredients you will bring: Excellent communication and collaboration skills to facilitate effective teamwork and Supply Chain stakeholders’ engagement. Ability to explain complex data concepts to both technical and non-technical individuals. Experience delegating work and assignments to team members and guiding them through technical issues and challenges. Ability to thrive in an entrepreneurial, fast-paced setting, managing complex data challenges with a solutions-oriented approach. Strong problem-solving skills and business acumen, particularly within the Supply Chain domain. Experience working in Agile development environments with a Product mindset is a plus. Education / Certifications: Bachelor's degree in Information Systems/Technology, Computer Science, Analytics, Engineering, or a related field. 6+ years of hands-on experience in data engineering, data warehousing, or a similar technical role, preferably in CPG or manufacturing with a strong focus on Supply Chain data. Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally including many household names such as Oreo , belVita and LU biscuits; Cadbury Dairy Milk , Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Analytics & Modelling Analytics & Data Science
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About the Role Knowledge of at least one ETL tool is mandatory, only from among Talend, DataStage, Informatica ISS (Cloud version), Data Fusion, Dataflow, and Dataproc. Responsibilities Mandatory SQL and PL/SQL scripting experience Expert in Python, Dataflow, Pub/Sub, BigQuery, CI/CD Must have good experience/knowledge of GCP components like GCS, BigQuery, Airflow, Cloud SQL, Pub/Sub/Kafka, Dataflow and the Google Cloud SDK Should have experience with any of the RDBMSs GCP Data Engineer certification is an added advantage Hadoop knowledge, NiFi/Kafka experience Strong scheduler knowledge - Control-M preferred; at least one of UC4 Atomic, Airflow Composer, or Control-M is mandatory
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python. Work extensively on Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing; Cloud Functions for lightweight serverless compute; BigQuery for data warehousing and analytics; Cloud Composer for orchestration of data workflows (based on Apache Airflow); Google Cloud Storage (GCS) for managing data at scale; IAM for access control and security; and Cloud Run for containerized applications. Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. Implement and enforce data quality checks, validation rules, and monitoring. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. Document pipeline designs, data flow diagrams, and operational support procedures. Required Skills: 4–6 years of hands-on experience in Python for backend or data engineering projects. Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). Solid understanding of data pipeline architecture, data integration, and transformation techniques. Experience in working with version control systems like GitHub and knowledge of CI/CD practices. Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.). Good to Have (Optional Skills): Experience working with the Snowflake cloud data platform. Hands-on knowledge of Databricks for big data processing and analytics. Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
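As a hedged sketch of the Cloud Composer orchestration mentioned in this posting, here is a minimal Airflow 2.x DAG wiring an extract-transform-load sequence with PythonOperator tasks; the DAG id, schedule, and callables are hypothetical and the task bodies are stubs.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch; names and callables are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("extract: pull files from GCS or query a source database")


def transform(**context):
    print("transform: apply cleansing and validation rules")


def load(**context):
    print("load: write curated data into BigQuery")


default_args = {"owner": "data-eng", "retries": 1, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="daily_orders_etl",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```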
Posted 2 weeks ago
12.0 years
0 Lacs
Madurai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect Location: Madurai Experience: 12+ Years Notice Period: Immediate About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project. Role Summary As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals. Key Responsibilities: Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP) Define data strategy, standards, and best practices for cloud data engineering and analytics Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery) Architect data lakes, warehouses, and real-time data platforms Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP) Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards Provide technical leadership in architectural decisions and future-proofing the data ecosystem Required Skills & Qualifications: 10+ years of experience in data architecture, data engineering, or enterprise data platforms Minimum 3–5 years of hands-on experience in GCP Data Service Proficient in:BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner Python / Java / SQL Data modeling (OLTP, OLAP, Star/Snowflake schema) Experience with real-time data processing, streaming architectures, and batch ETL pipelines Good understanding of IAM, networking, security models, and cost optimization on GCP Prior experience in leading cloud data transformation projects Excellent communication and stakeholder management skills Preferred Qualifications: GCP Professional Data Engineer / Architect Certification Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics Exposure to AI/ML use cases and MLOps on GCP Experience working in agile environments and client-facing roles What We Offer: Opportunity to work on large-scale data modernization projects with global clients A fast-growing company with a strong tech and people culture Competitive salary, benefits, and flexibility Collaborative environment that values innovation and leadership
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Engineer – GCP (5+ Years Experience) Location: Hyderabad / Bangalore Experience: 5+ Years Job Overview: We are seeking a highly skilled and motivated Data Engineer with a strong background in Google Cloud Platform (GCP) and data processing frameworks. The ideal candidate will have hands-on experience in building and optimizing data pipelines, architectures, and data sets using GCP services like BigQuery, Dataflow, Pub/Sub, GCS, and Cloud Composer. Key Responsibilities: Design, build, and maintain scalable and efficient data pipelines on GCP. Implement data ingestion, transformation, and orchestration using GCP services: BigQuery, Dataflow, Pub/Sub, GCS, and Cloud Composer . Write complex and optimized SQL queries for data transformation and analytics. Develop Python scripts for custom transformations and pipeline logic. Orchestrate workflows using Apache Airflow (Cloud Composer). Collaborate with DevOps to implement CI/CD pipelines using Jenkins , GitLab , and Terraform . Ensure data quality, governance, and reconciliation across systems. Required Skills: GCP Expertise : BigQuery, Dataflow, Pub/Sub, GCS, Cloud Composer Languages : Advanced SQL, Python (strong scripting and data transformation experience) DevOps & IaC : Basic experience with Terraform, Jenkins, and GitLab Data Orchestration : Strong experience with Apache Airflow Nice-to-Have Skills: Containerization & Cluster Management : GKE (Google Kubernetes Engine) Big Data Ecosystem : Bigtable, Kafka, Hadoop CDC/Data Sync : Oracle GoldenGate Distributed Processing : PySpark Data Auditing : Data Reconciliation frameworks or strategies
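To illustrate the kind of optimized SQL transformation this role describes, here is a small sketch using the google-cloud-bigquery Python client to run a parameterized aggregation into a destination table; the project, dataset, and column names are assumptions, not a real schema.

```python
# Sketch of a parameterized BigQuery transformation; dataset/table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

query = """
    SELECT country, DATE(order_ts) AS order_date, SUM(amount) AS daily_revenue
    FROM `my-gcp-project.analytics.orders`
    WHERE DATE(order_ts) = @run_date
    GROUP BY country, order_date
"""

job_config = bigquery.QueryJobConfig(
    destination="my-gcp-project.analytics.daily_revenue",
    write_disposition="WRITE_TRUNCATE",
    query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")],
)

job = client.query(query, job_config=job_config)
job.result()  # block until the transformation finishes
print(f"Loaded {job.destination} using {job.total_bytes_processed} bytes scanned")
```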
Posted 2 weeks ago
0 years
12 - 15 Lacs
Hyderabad, Telangana, India
On-site
About The Opportunity Join a dynamic leader in the Cloud Computing and Data Engineering sector, specializing in advanced, on-premise to cloud migration and big data solutions. Our fast-paced organization leverages cutting-edge Google Cloud Platform (GCP) technologies to architect, implement, and optimize robust data pipelines. We are seeking a passionate and skilled GCP Data Engineer to drive innovative data solutions onsite in India. Role & Responsibilities Design and develop end-to-end ETL pipelines using GCP services like Dataflow, BigQuery, and Cloud Storage. Collaborate with cross-functional teams to architect scalable, efficient data solutions and integrate them with downstream analytics and business intelligence tools. Implement data quality, transformation, and governance frameworks to ensure reliability across data workflows. Optimize performance and monitor production pipelines, troubleshooting issues on the GCP infrastructure. Utilize Infrastructure as Code techniques (e.g., Terraform) to automate resource provisioning and environment setups. Maintain documentation and best practices to ensure continuous improvements and operational excellence. Skills & Qualifications Must-Have Proven expertise with the Google Cloud Platform including core services (Dataflow, BigQuery, Cloud Storage). Strong programming skills in Python with a solid grounding in SQL and experience in building ETL pipelines. Hands-on experience with Infrastructure as Code tools such as Terraform for managing cloud resources. Preferred Familiarity with data orchestration tools like Apache Beam or Airflow for streamlined pipeline management. Experience in containerization and orchestration using Docker and Kubernetes. Understanding of cloud security best practices and compliance requirements. Benefits & Culture Highlights Be part of an innovative, collaborative, and technically advanced team that values continuous learning. Enjoy a supportive, results-driven work culture with opportunities for growth and on-site engagement in a vibrant professional environment. Skills: gcp,python,sql,terraform,data,data engineer,google cloud platform,data flow,airflow,cloud sql
Posted 2 weeks ago
0 years
12 - 15 Lacs
Pune, Maharashtra, India
On-site
About The Opportunity Join a dynamic leader in the Cloud Computing and Data Engineering sector, specializing in advanced, on-premise to cloud migration and big data solutions. Our fast-paced organization leverages cutting-edge Google Cloud Platform (GCP) technologies to architect, implement, and optimize robust data pipelines. We are seeking a passionate and skilled GCP Data Engineer to drive innovative data solutions onsite in India. Role & Responsibilities Design and develop end-to-end ETL pipelines using GCP services like Dataflow, BigQuery, and Cloud Storage. Collaborate with cross-functional teams to architect scalable, efficient data solutions and integrate them with downstream analytics and business intelligence tools. Implement data quality, transformation, and governance frameworks to ensure reliability across data workflows. Optimize performance and monitor production pipelines, troubleshooting issues on the GCP infrastructure. Utilize Infrastructure as Code techniques (e.g., Terraform) to automate resource provisioning and environment setups. Maintain documentation and best practices to ensure continuous improvements and operational excellence. Skills & Qualifications Must-Have Proven expertise with the Google Cloud Platform including core services (Dataflow, BigQuery, Cloud Storage). Strong programming skills in Python with a solid grounding in SQL and experience in building ETL pipelines. Hands-on experience with Infrastructure as Code tools such as Terraform for managing cloud resources. Preferred Familiarity with data orchestration tools like Apache Beam or Airflow for streamlined pipeline management. Experience in containerization and orchestration using Docker and Kubernetes. Understanding of cloud security best practices and compliance requirements. Benefits & Culture Highlights Be part of an innovative, collaborative, and technically advanced team that values continuous learning. Enjoy a supportive, results-driven work culture with opportunities for growth and on-site engagement in a vibrant professional environment. Skills: gcp,python,sql,terraform,data,data engineer,google cloud platform,data flow,airflow,cloud sql
Posted 2 weeks ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Company : They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society. About Client: Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries. Job Title: Java Developer Location: Bengaluru, Pune Experience: 7+ Years Job Type: Contract to hire. Notice Period: Immediate joiners. Detailed JD: Bachelor's in Computer Science, Engineering, or equivalent experience ● 7+ years of experience in core Java and the Spring Framework (Required) ● 2 years of cloud experience (GCP, AWS, or Azure; GCP preferred) (Required) ● Experience in big data processing on a distributed system. (Required) ● Experience with databases: RDBMS, NoSQL, and cloud-native databases. (Required) ● Experience in handling various data formats like flat file, JSON, Avro, XML, etc., including defining the schemas and the contracts. (Required) ● Experience in implementing data pipelines (ETL) using Dataflow (Apache Beam) ● Experience in microservices and integration patterns of APIs with data processing. ● Experience with data structures and defining and designing data models
Posted 2 weeks ago
0.0 - 3.0 years
25 - 35 Lacs
Madurai, Tamil Nadu
On-site
Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ Or Email: kanthasanmugam.m@techmango.net Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. It holds a primary objective of delivering strategic solutions towards the goal of its business partners in terms of technology. We are a full-scale leading Software and Mobile App Development Company. Techmango is driven by the mantra “Clients Vision is our Mission”. We have a tendency to stick on to the current statement. To be the technologically advanced & most loved organization providing prime quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA - Chicago, Atlanta, Dubai - UAE, in India - Bangalore, Chennai, Madurai, Trichy. Techmangohttps://www.techmango.net/ Job Title: GCP Data Architect Location: Madurai Experience: 12+ Years Notice Period: Immediate About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project. Role Summary As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals. Key Responsibilities: Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP) Define data strategy, standards, and best practices for cloud data engineering and analytics Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery) Architect data lakes, warehouses, and real-time data platforms Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP) Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards Provide technical leadership in architectural decisions and future-proofing the data ecosystem Required Skills & Qualifications: 10+ years of experience in data architecture, data engineering, or enterprise data platforms. Minimum 3–5 years of hands-on experience in GCP Data Service. Proficient in:BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner. Python / Java / SQL Data modeling (OLTP, OLAP, Star/Snowflake schema). Experience with real-time data processing, streaming architectures, and batch ETL pipelines. Good understanding of IAM, networking, security models, and cost optimization on GCP. Prior experience in leading cloud data transformation projects. Excellent communication and stakeholder management skills. Preferred Qualifications: GCP Professional Data Engineer / Architect Certification. 
Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics. Exposure to AI/ML use cases and MLOps on GCP. Experience working in agile environments and client-facing roles. What We Offer: Opportunity to work on large-scale data modernization projects with global clients. A fast-growing company with a strong tech and people culture. Competitive salary, benefits, and flexibility. Collaborative environment that values innovation and leadership. Job Type: Full-time Pay: ₹2,500,000.00 - ₹3,500,000.00 per year Application Question(s): Current CTC ? Expected CTC ? Notice Period ? (If you are serving Notice period please mention the Last working day) Experience: GCP Data Architecture : 3 years (Required) BigQuery: 3 years (Required) Cloud Composer (Airflow): 3 years (Required) Location: Madurai, Tamil Nadu (Required) Work Location: In person
Posted 2 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferably Guidewire/Duckcreek). LOBs / Lines of Business (Personal and Commercial Lines): must have – Property, Auto, General Liability; good to have – Casualty lines such as Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc., Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Experience working on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs and FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product Knowledge – Guidewire, Duckcreek, Exigent, Majesco (Preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance) The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferably Guidewire/Duckcreek). LOBs / Lines of Business (Personal and Commercial Lines): must have – Property, Auto, General Liability; good to have – Casualty lines such as Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc., Inland Marine, Cargo, Workers Compensation, Umbrella, Excess Liability. Roles and Responsibilities: Experience working on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs and FSDs. Conducting JAD sessions and workshops to capture requirements and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Product Experience/Other Skills: Product Knowledge – Guidewire, Duckcreek, Exigent, Majesco (Preferred Guidewire/Duckcreek). Strong skills in stakeholder management and communication. Should have end-to-end process knowledge in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage. We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 weeks ago
6.0 - 8.0 years
0 Lacs
India
On-site
Proficient in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks. Experience with SQL, Python, and Terraform (preferred) for infrastructure as code. Hands-on experience in data security, encryption, access control, and governance on GCP. Experience in integrating with real-time data pipelines and event-driven architectures. Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization. GCP Professional Data Engineer / Cloud Architect certification is a plus. Role Overview: We are seeking a highly skilled GCP Data Architect with 6–8 years of experience in designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in delivering scalable, secure, and efficient data platforms. Key Responsibilities: Design and architect end-to-end data solutions on GCP, aligning with business and technical requirements. Define data models, storage strategies, data ingestion, processing, and consumption frameworks. Implement data lakes, data warehouses, and data marts using services like BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer. Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures. Design and implement data governance, security, and compliance frameworks for cloud-based data platforms. Optimize data workflows, query performance, and storage costs in the GCP environment. Lead data migration and modernization initiatives from on-premise or other cloud platforms to GCP. Stay updated with GCP services, features, and industry best practices to recommend improvements and innovation. Provide technical leadership and mentoring to data engineering teams. Required Skills & Experience: 6–8 years of experience in data architecture and engineering roles, with at least 3 years hands-on on GCP. Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, Data Catalog. Proficient in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks. Experience with SQL, Python, and Terraform (preferred) for infrastructure as code. Hands-on experience in data security, encryption, access control, and governance on GCP. Experience in integrating with real-time data pipelines and event-driven architectures. Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization. GCP Professional Data Engineer / Cloud Architect certification is a plus. Good to Have : Exposure to AI/ML workflows, data preparation for ML models. Experience with third-party tools like Apache Airflow, Looker, or Dataplex. Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies. Educational Qualification: Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field. Why Join Us: Work on cutting-edge data transformation programs at scale. Opportunity to architect high-impact solutions in a collaborative, innovation-driven environment. Engage with a fast-growing team focused on data-driven business value. Job Type: Full-time Work Location: In person
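As one concrete example of the storage and cost-optimization strategies this architect role references, the sketch below creates a date-partitioned, clustered BigQuery table via the Python client; the dataset, field names, and partition-filter policy are illustrative assumptions only.

```python
# Sketch: date-partitioned, clustered BigQuery table to limit scan costs; names are illustrative.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("my-gcp-project.raw_zone.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",                     # filters on this column prune partitions
)
table.clustering_fields = ["customer_id"]   # co-locates rows for cheaper lookups
table.require_partition_filter = True       # guards against accidental full-table scans

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}, partitioned on {table.time_partitioning.field}")
```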
Posted 2 weeks ago
6.0 years
0 Lacs
India
On-site
Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results. How You Will Contribute You will: Work in close partnership with the business leadership team to execute the analytics agenda Identify and incubate best-in-class external partners to drive delivery on strategic projects Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data analysis to make recommendations to senior leaders Technical experience in roles in best-in-class analytics practices Experience deploying new analytical approaches in a complex and highly matrixed organization Savvy in usage of the analytics techniques to create business impacts As part of the Global MSC (Mondelez Supply Chain) Data & Analytics team, you will support our business to uncover trends that can drive long-term business results. In this role, you will be a key technical leader in developing our cutting-edge Supply Chain Data Product ecosystem. You'll have the opportunity to design, build, and automate data ingestion, harmonization, and transformation processes, driving advanced analytics, reporting, and insights to optimize Supply Chain performance across the organization. You will play an instrumental part in engineering robust and scalable data solutions, acting as a hands-on expert for Supply Chain data, and contributing to how these data products are visualized and interacted with. What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: SAP Data Expertise: Deep hands-on experience in extracting, transforming, and modeling data from SAP ECC/S4HANA (modules like MM, SD, PP, QM, FI/CO) and SAP BW/HANA. Proven ability to understand SAP data structures and business processes within Supply Chain. Cloud Data Engineering (GCP Focused): Strong proficiency and hands-on experience in data warehousing solutions and data engineering services within the Google Cloud Platform (GCP) ecosystem (e.g., BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub). Data Pipeline Development: Design, build, and maintain robust and efficient ETL/ELT processes for data integration, ensuring data accuracy, integrity, and timeliness. BI & Analytics Enablement: Collaborate with data scientists, analysts, and business users to provide high-quality, reliable data for their analyses and models. Support the development of data consumption layers, including dashboards (e.g., Tableau, Power BI). Hands-on experience with Databricks (desirable): ideally deployed on GCP or with GCP integration for large-scale data processing, Spark-based transformations, and advanced analytics. System Monitoring & Optimization (desirable): Monitor data processing systems and pipelines to ensure efficiency, reliability, performance, and uptime; proactively identify and resolve bottlenecks. Industry Knowledge: Solid understanding of the consumer goods industry, particularly Supply Chain processes and relevant key performance indicators (KPIs). 
What extra ingredients you will bring: Excellent communication and collaboration skills to facilitate effective teamwork and Supply Chain stakeholders’ engagement. Ability to explain complex data concepts to both technical and non-technical individuals. Experience delegating work and assignments to team members and guiding them through technical issues and challenges. Ability to thrive in an entrepreneurial, fast-paced setting, managing complex data challenges with a solutions-oriented approach. Strong problem-solving skills and business acumen, particularly within the Supply Chain domain. Experience working in Agile development environments with a Product mindset is a plus. Education / Certifications: Bachelor's degree in Information Systems/Technology, Computer Science, Analytics, Engineering, or a related field. 6+ years of hands-on experience in data engineering, data warehousing, or a similar technical role, preferably in CPG or manufacturing with a strong focus on Supply Chain data. Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally including many household names such as Oreo , belVita and LU biscuits; Cadbury Dairy Milk , Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Analytics & Modelling Analytics & Data Science
Posted 2 weeks ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary: A career within PWC Responsibilities: Job Title: Cloud Data Engineer (AWS/Azure/Databricks/GCP) Experience:3-8 years in Data Engineering Job Description: We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions. Key Responsibilities: - Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP. - Implement data ingestion and transformation processes to facilitate efficient data warehousing. - Utilize cloud services to enhance data processing capabilities: - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS. - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus. - GCP: Dataflow, BigQuery, DataProc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion. - Optimize Spark job performance to ensure high efficiency and reliability. - Stay proactive in learning and implementing new technologies to improve data processing frameworks. - Collaborate with cross-functional teams to deliver robust data solutions. - Work on Spark Streaming for real-time data processing as necessary. Qualifications: - 3-8 years of experience in data engineering with a strong focus on cloud environments. - Proficiency in PySpark or Spark is mandatory. 
- Proven experience with data ingestion, transformation, and data warehousing. - In-depth knowledge and hands-on experience with cloud services(AWS/Azure/GCP): - Demonstrated ability in performance optimization of Spark jobs. - Strong problem-solving skills and the ability to work independently as well as in a team. - Cloud Certification (AWS, Azure, or GCP) is a plus. - Familiarity with Spark Streaming is a bonus. Mandatory skill sets: Python, Pyspark, SQL with (AWS or Azure or GCP) Preferred skill sets: Python, Pyspark, SQL with (AWS or Azure or GCP) Years of experience required: 3-8 years Education qualification: BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Bachelor of Engineering, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills PySpark, Python (Programming Language) Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
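Since the role asks for PySpark proficiency, here is a brief, hedged sketch of the ingestion-transformation-load pattern it describes (runnable on Databricks or Dataproc); the storage paths and column names are hypothetical.

```python
# Minimal PySpark ingest/transform/load sketch; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Ingest raw CSV files from cloud storage (GCS path shown as a placeholder)
raw = spark.read.option("header", True).csv("gs://my-raw-bucket/orders/*.csv")

# Transform: cast types, drop bad rows, and aggregate per country
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .groupBy("country")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

# Load to a partitioned Parquet layer for downstream warehousing
(curated.write
        .mode("overwrite")
        .partitionBy("country")
        .parquet("gs://my-curated-bucket/orders_by_country/"))
```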
Posted 2 weeks ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery (a streaming ingestion sketch follows this posting)
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3-5 years of hands-on experience in GCP data services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, Star/Snowflake schema)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP
Prior experience in leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications
GCP Professional Data Engineer / Architect Certification
Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

What We Offer
Opportunity to work on large-scale data modernization projects with global clients
A fast-growing company with a strong tech and people culture
Competitive salary, benefits, and flexibility
Collaborative environment that values innovation and leadership
(ref:hirist.tech)
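The responsibilities above include building streaming ingestion pipelines with Pub/Sub, Apache Beam/Dataflow, and BigQuery. Below is a minimal Apache Beam sketch of that pattern, runnable on the Dataflow runner with appropriate pipeline options; the project, topic, table, and schema names are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the BigQuery schema below."""
    record = json.loads(message.decode("utf-8"))
    return {
        "event_id": record["id"],
        "event_ts": record["ts"],
        "payload": json.dumps(record),
    }


options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # run as a streaming job

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events"
        )
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```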
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
The AIML Architect-Dataflow, BigQuery plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align technical vision with business goals. Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets (a minimal BigQuery ML sketch follows this posting).
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated with industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficient in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.
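Since the role above combines BigQuery with machine learning, here is a minimal sketch of training and evaluating a model with BigQuery ML from the Python client. The `example-project.analytics.customers` table, its columns, and the model name are hypothetical assumptions for illustration only.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Train a logistic regression model directly in BigQuery (BQML)
client.query(
    """
    CREATE OR REPLACE MODEL `example-project.analytics.churn_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
    SELECT tenure_months, monthly_spend, support_tickets, churned
    FROM `example-project.analytics.customers`
    """
).result()  # block until the training job finishes

# Evaluate the trained model and print the metrics
rows = client.query(
    "SELECT * FROM ML.EVALUATE(MODEL `example-project.analytics.churn_model`)"
).result()
for row in rows:
    print(dict(row))
```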
Posted 2 weeks ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Position: Capability Lead – GCP (Director / Enterprise Architect)
Location: Chennai, Hyderabad, Bangalore or Noida (Hybrid) – no remote option available
Duration: Full Time
Reporting: Practice Head
Budget: 30-65 LPA (depending on level of expertise)
Notice Period: Immediate joiner / currently serving / notice period of less than 60 days
Level of experience: 12+ years
Shift Timings: 2 pm - 11 pm IST. Overlap with EST time zone of 2 pm.

Job Summary
As a key member of the Data business leadership team, the role will be responsible for building and expanding the Google Cloud Platform data and analytics capability within the organization. This individual will drive technical excellence, innovative solution development, and successful delivery of GCP-based data initiatives. The role requires close collaboration with clients, delivery teams, GCP alliance partners, and internal stakeholders to grow GCP offerings, build talent pipelines, and ensure delivery excellence.

Areas of Responsibility
1. Offering and Capability Development
Design and enhance GCP-based data platform offerings and accelerators
Define architectural standards, best practices, and reusable components
Collaborate with alliance teams to strengthen the strategic partnership
2. Technical Leadership
Provide architectural guidance for data solutions on GCP
Lead solutioning for proposals, RFIs, and RFPs that involve GCP services
Conduct technical reviews to ensure alignment with GCP architecture best practices
Act as the escalation point for complex architecture or engineering challenges
3. Delivery Oversight
Support project delivery teams with deep technical expertise in GCP
Drive project quality, performance optimization, and technical risk mitigation
Ensure best-in-class delivery aligned with GCP's security, performance, and cost standards
4. Talent Development
Build and lead a high-performing GCP data engineering and architecture team
Define certification and upskilling paths aligned with GCP learning programs
Mentor team members and foster a culture of technical excellence and knowledge sharing
5. Business Development Support
Collaborate with sales and pre-sales teams to position solutions effectively
Assist in identifying new opportunities within existing and new accounts
Participate in client presentations, solution demos, and technical workshops
6. Thought Leadership and Innovation
Develop whitepapers, blogs, and technical assets to showcase GCP leadership
Stay current on market updates and innovations in the data engineering landscape
Contribute to internal innovation initiatives and PoCs involving GCP

Job Requirements
12–15 years of experience in Data Engineering & Analytics, with 3–5 years of deep GCP expertise
Proven experience leading data platforms using GCP technologies (BigQuery, Dataflow, Dataproc, Vertex AI, Looker), containerization (Kubernetes, Docker), API-based microservices architecture, CI/CD pipelines, and infrastructure-as-code tools like Terraform
Experience with tools such as DBT, Airflow, Informatica, Fivetran, and Looker/Tableau, plus programming skills in languages such as PySpark, Python, Java, or Scala (a minimal Airflow orchestration sketch follows this posting)
Architectural best practices in cloud around user management, data privacy, data security, performance, and other non-functional requirements
Familiarity with building AI/ML models on cloud solutions built in GCP
GCP certifications preferred (e.g., Professional Data Engineer, Professional Cloud Architect)
Exposure to data governance, privacy, and compliance practices in cloud environments
Strong presales, client engagement, and solution architecture experience
Excellent communication and stakeholder management skills
Prior experience in an IT consulting, system integration, or technology services environment

About Mastech InfoTrellis
Mastech InfoTrellis is the Data and Analytics unit of Mastech Digital. At Mastech InfoTrellis, we have built intelligent Data Modernization practices and solutions to help companies harness the true potential of their data. Our expertise lies in providing timely insights from your data to make better decisions…FASTER. With our proven strategies and cutting-edge technologies, we foster intelligent decision-making, increase operational efficiency, and impact substantial business growth. With an unwavering commitment to building a better future, we are driven by the purpose of transforming businesses through data-powered innovation. (Who We Are | Mastech InfoTrellis)
Mastech Digital is an Equal Opportunity Employer - All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
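The requirements above mention Airflow (Cloud Composer) alongside BigQuery. Below is a minimal orchestration sketch, assuming Airflow 2.x with the Google provider installed; the bucket, dataset, and table names are hypothetical and not part of the original posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Load the day's raw CSV files from GCS into a staging table
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.raw.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Rebuild the reporting table from the staging table
    build_mart = BigQueryInsertJobOperator(
        task_id="build_daily_sales_mart",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.marts.daily_sales` AS
                    SELECT sale_date, region, SUM(amount) AS total_amount
                    FROM `example-project.raw.sales`
                    GROUP BY sale_date, region
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> build_mart
```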
Posted 2 weeks ago
10.0 years
4 - 8 Lacs
Madurai
On-site
Job Location: Madurai
Job Experience: 10-20 Years
Model of Work: Work From Office
Technologies: GCP
Functional Area: Software Development

Job Summary
Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client in the USA, a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
Define data strategy, standards, and best practices for cloud data engineering and analytics
Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
Architect data lakes, warehouses, and real-time data platforms
Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
10+ years of experience in data architecture, data engineering, or enterprise data platforms
Minimum 3-5 years of hands-on experience in GCP data services
Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, Star/Snowflake schema)
Experience with real-time data processing, streaming architectures, and batch ETL pipelines
Good understanding of IAM, networking, security models, and cost optimization on GCP (a partitioned-table sketch illustrating BigQuery cost optimization follows this posting)
Prior experience in leading cloud data transformation projects
Excellent communication and stakeholder management skills

Preferred Qualifications:
GCP Professional Data Engineer / Architect Certification
Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics
Exposure to AI/ML use cases and MLOps on GCP
Experience working in agile environments and client-facing roles

About our Talent Acquisition Team:
Arumugam Veera leads the Talent Acquisition function for both TechMango and Bautomate - SaaS Platform, driving our mission to build high-performing teams and connect top talent with exciting career opportunities. Feel free to connect with him on LinkedIn: https://www.linkedin.com/in/arumugamv/
Follow our official TechMango LinkedIn page for the latest job updates and career opportunities: https://www.linkedin.com/company/techmango-technology-services-private-limited/
Looking forward to connecting and helping you explore your next great opportunity with us!
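One common lever for the BigQuery cost optimization mentioned above is partitioning and clustering large tables so queries scan less data. Below is a minimal sketch using the BigQuery Python client; the project, dataset, table, and schema are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Hypothetical schema for a transactions table
schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.analytics.transactions", schema=schema)

# Partition by day on the event timestamp and cluster by customer for pruning
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}, partitioned on {table.time_partitioning.field}")
```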
Posted 2 weeks ago
0 years
5 - 9 Lacs
Chennai
On-site
Good knowledge of GCP, BigQuery, SQL Server, and Postgres DB. Knowledge of Datastream, Cloud Dataflow, Terraform, ETL tools, writing procedures and functions, writing dynamic code, performance tuning and complex queries, and UNIX (a parameterized-query sketch follows this posting).

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
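For the query-tuning and procedural SQL work mentioned above, parameterized queries are a safer starting point than string-built dynamic SQL. A minimal sketch with the BigQuery Python client follows; the table, columns, and parameter values are hypothetical assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Bind values as query parameters instead of concatenating them into the SQL
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("min_amount", "INT64", 100),
    ]
)

sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-project.analytics.transactions`
    WHERE event_ts >= TIMESTAMP(@start_date)
      AND amount >= @min_amount
    GROUP BY customer_id
    ORDER BY total_amount DESC
    LIMIT 100
"""

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_amount)
```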
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Some careers shine brighter than others.
If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
The DevOps Engineering job is responsible for developing automations across the Technology delivery lifecycle including construction, testing, release and ongoing service management, and monitoring of a product or service within a Technology team. They will be required to continually enhance their skills within a number of specialisms which include CI/CD, automation, pipeline development, security, testing, and operational support. This role will carry out some or all of the following activities:
Facilitate application teams across the Bank in deploying their applications across GCP services like GKE Container, BigQuery, Dataflow, PubSub, Kafka
Be the go-to person in case application teams face any issue during platform adoption, onboarding, deployment and environment troubleshooting
Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered
Automate the continuous integration / continuous delivery pipeline within a DevOps Product/Service team, driving a culture of continuous improvement
Keep up to date and have expertise on current tools, technologies and areas like cyber security and regulations pertaining to aspects like data privacy, consent, data residency etc. that are applicable
Take end-to-end accountability for a product or service, identifying and developing the most appropriate Technology solutions to meet customer needs as part of the Customer Journey
Liaise with other engineers, architects, and business stakeholders to understand and drive the product or service's direction
Analyze production errors to define and create tools that help mitigate problems in the system design stage, applying user-defined integrations and improving the user experience

Requirements
To be successful in this role, you should meet the following requirements:
Bachelor's degree in Computer Science or related disciplines
6 or more years of hands-on development experience building fully self-serve, observable solutions using infrastructure and policy as code
Proficiency developing with modern programming languages and the ability to rapidly develop proof-of-concepts
Ability to work with geographically distributed and cross-functional teams
Expert in code deployment tools (Jenkins, Puppet, Ansible, Git, Selenium, and Chef)
Expert in automation tools (CloudFormation, Terraform, shell script, Helm, Ansible)
Familiar with containers (Docker, Docker Compose, Kubernetes, GKE)
Familiar with monitoring (Datadog, Grafana, Prometheus, AppDynamics, New Relic, Splunk) (a minimal Cloud Monitoring sketch follows this posting)

The successful candidate will also meet the following requirements:
Good understanding of GCP Cloud or hybrid cloud approach implementations
Good understanding and experience of MuleSoft / PCF / any gateway server implementations
Hands-on experience with the Kong API Gateway platform
Good understanding and experience in the middleware and MQ areas
Familiar with infrastructure support: Apache Gateway, runtime server configurations, SSL certificate setup, etc.

You'll achieve more when you join HSBC.
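The requirements above emphasise observability of GCP services such as Pub/Sub. Below is a minimal, hypothetical monitoring sketch that reads the undelivered-message backlog metric through the Cloud Monitoring API; the project name and alert threshold are assumptions, not part of the original posting.

```python
import time

from google.cloud import monitoring_v3

project = "projects/example-project"
client = monitoring_v3.MetricServiceClient()

# Look at the last 10 minutes of the Pub/Sub backlog metric
now = time.time()
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": int(now)},
        "start_time": {"seconds": int(now - 600)},
    }
)

results = client.list_time_series(
    request={
        "name": project,
        "filter": 'metric.type = "pubsub.googleapis.com/subscription/num_undelivered_messages"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    subscription = series.resource.labels["subscription_id"]
    latest = series.points[0].value.int64_value  # newest point comes first
    if latest > 1000:  # hypothetical alert threshold
        print(f"WARNING: backlog on {subscription}: {latest} undelivered messages")
```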
Posted 2 weeks ago