
216 Cloud SQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

5 - 15 Lacs

pune, bengaluru, delhi / ncr

Work from Office

Educational Requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
• You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise.
• You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas, and suggest technology solutions
• Knowledge of one or two industry domains
• Client interfacing skills
• Project and team management

Technical and Professional Requirements:
Technology -> Cloud Platform -> GCP Data Analytics -> Looker; Technology -> Cloud Platform -> GCP Database -> Google BigQuery

Preferred Skills:
Technology -> Cloud Platform -> Google Big Data; Technology -> Cloud Platform -> GCP Data Analytics
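As context for the GCP analytics stack named above (Looker on top of Google BigQuery), here is a minimal sketch of a parameterized BigQuery query from Python. It assumes the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical and not part of the posting.

```python
# A minimal sketch, assuming google-cloud-bigquery and hypothetical names.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

query = """
    SELECT channel, COUNT(*) AS sessions
    FROM `my-gcp-project.analytics.events`   -- hypothetical table
    WHERE event_date >= @start_date
    GROUP BY channel
    ORDER BY sessions DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")
    ]
)
for row in client.query(query, job_config=job_config):
    print(row.channel, row.sessions)
```

A dashboard layer such as Looker would issue similar SQL against the same tables; parameterizing the query keeps it safe to drive from user input.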

Posted 17 hours ago

Apply

10.0 - 15.0 years

8 - 18 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.

Responsibilities
- Develop solutions following established technical design, application development standards, and quality processes in projects
- Assess the impact on technical design of changes in functional requirements
- Perform independent code reviews and execute unit tests on modules developed by self and other junior team members on the project
- Write well-designed, efficient, and testable code
- Interact with other stakeholders, including end-user clients, the project manager or scrum master, business analysts, offshore development, testing, and other cross-functional teams

Skills
Must have
- 10+ years of Java development experience, with 3+ years of architecture design experience
- BS/MS degree in Computer Science, Software Engineering, or a related subject
- Google Cloud Platform experience
- Comfortable practicing TDD and pair programming; well-versed in DevOps
- Good knowledge of object-oriented design principles and hands-on experience with object-oriented programming
- Good knowledge of the Java standard library; hands-on experience with Spring and/or Spring Boot is a big plus
- Experience in agile software development
- Well versed in solution architecture and principles including, but not limited to: SOLID; Hexagonal (Ports and Adapters); Cloud Native; microservices patterns
- Experience in large enterprise system integrations and architecture
- Strong understanding of, and hands-on experience with, designing scalable, highly available, reliable, resilient, secure, and performant systems
- Good presentation, documentation, and communication skills
- Knowledge of Linux is a plus; knowledge of cloud platforms is a plus
- Knowledge of TOGAF or Zachman frameworks is desirable
- Good to have an understanding of application security frameworks and standards, e.g., OWASP, NIST
- 4+ progressive years of experience in building and implementing model-driven, enterprise-level business solutions and applications in PRPC
- Excellent time management and organization skills, and the ability to manage multiple competing priorities
- Exceptional interpersonal skills and the ability to communicate, partner, and collaborate
- Dedication to achieving outstanding customer results, with a team-oriented drive and a demonstrated ability to lead by example
- Exposure to a variety of technologies, including object-oriented techniques/principles, database design, and application and web servers
- Aptitude to pick up new concepts and technology rapidly, and the ability to explain them to both business and IT stakeholders
- Ability to match technology solutions to customer needs
Nice to have
- Banking domain

Posted 18 hours ago

Apply

3.0 - 8.0 years

12 - 16 Lacs

bengaluru

Work from Office

Job Summary
Synechron is seeking a detail-oriented and analytical Python Developer to join our data team. In this role, you will design, develop, and optimize data pipelines, analysis tools, and workflows that support key business and analytical functions. Your expertise in data manipulation, database management, and scripting will enable the organization to enhance data accuracy, efficiency, and insights. This position offers an opportunity to work closely with data analysts and scientists to build scalable, reliable data solutions that contribute directly to business decision-making and operational excellence. (A minimal sketch of this kind of pipeline follows below.)

Software Requirements
Required Skills:
- Python (version 3.7 or higher) with experience in data processing and scripting
- Pandas library (experience in large dataset manipulation and analysis)
- SQL (proficiency in writing performant queries for data extraction and database management)
- Data management tools and relational databases such as MySQL or PostgreSQL
Preferred Skills:
- Experience with cloud data services (AWS RDS, Azure SQL, GCP Cloud SQL)
- Knowledge of additional Python libraries such as NumPy, Matplotlib, or Jupyter Notebooks for data analysis and visualization
- Data pipeline orchestration tools (e.g., Apache Airflow)
- Version control tools such as Git

Overall Responsibilities
- Develop, test, and maintain Python scripts for ETL processes and data workflows
- Utilize Pandas to clean, analyze, and transform large datasets efficiently
- Write, optimize, and troubleshoot SQL queries for data extraction, updates, and management
- Collaborate with data analysts and scientists to create data-driven analytic tools and solutions
- Automate repetitive data workflows to increase operational efficiency and reduce errors
- Maintain detailed documentation of data processes, pipelines, and procedures
- Troubleshoot data discrepancies, pipeline failures, and database-related issues efficiently
- Support ongoing data quality initiatives by identifying and resolving data inconsistencies

Technical Skills (by Category)
- Programming languages: Python (3.7+) with proficiency in data manipulation and scripting (required); additional scripting languages such as R (preferred)
- Databases/data management: relational databases such as MySQL or PostgreSQL; experience with query optimization and database schema design
- Cloud technologies: basic experience with cloud data services (AWS, Azure, GCP) for data storage and processing (preferred)
- Frameworks and libraries: Pandas, NumPy, Matplotlib, Jupyter Notebooks for data analysis and visualization; Airflow or similar orchestration tools (preferred)
- Development tools and methodologies: Git or similar version control tools; Agile development practices and collaborative workflows
- Security: understanding of data privacy, confidentiality, and secure coding practices

Experience Requirements
- 3+ years of experience in Python development with a focus on data processing and management
- Proven hands-on experience in building and supporting ETL workflows and data pipelines
- Strong experience working with SQL and relational databases
- Demonstrated ability to analyze and manipulate large datasets efficiently
- Familiarity with cloud data services is advantageous but not mandatory

Day-to-Day Activities
- Write and enhance Python scripts to perform ETL, data transformation, and automation tasks
- Design and optimize SQL queries for data extraction and updates
- Collaborate with data analysts, scientists, and team members during daily stand-ups and planning sessions
- Investigate and resolve data quality issues or pipeline failures promptly
- Document data pipelines, workflows, and processes for clarity and future maintenance
- Assist in developing analytical tools and dashboards for business insights
- Review code changes through peer reviews and ensure adherence to best practices
- Participate in continuous improvement initiatives related to data workflows and processing techniques

Qualifications
- Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field
- Relevant certifications or training in Python, data engineering, or database management are a plus
- Proven track record of working on data pipelines, analysis, and automation projects

Professional Competencies
- Strong analytical and problem-solving skills with attention to detail
- Effective communication skills; able to collaborate across teams and explain technical concepts clearly
- Ability to work independently and prioritize tasks effectively
- Continuous learner, eager to adopt new tools, techniques, and best practices in data processing
- Adaptable to changing project requirements and proactive in identifying process improvements
- Focused on delivering high-quality work with a results-oriented approach
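To make the Pandas/SQL workflow described above concrete, here is a minimal ETL sketch: extract a CSV, clean and aggregate with Pandas, and load into a relational database via SQLAlchemy. The file name, connection string, and table names are hypothetical placeholders, not details from the posting.

```python
# A minimal ETL sketch, assuming pandas and SQLAlchemy; DSN and file are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")

# Extract: read the raw source file
df = pd.read_csv("daily_orders.csv", parse_dates=["order_date"])

# Transform: drop incomplete rows and clamp bad values
df = df.dropna(subset=["customer_id"])
df["amount"] = df["amount"].clip(lower=0)
daily = (
    df.groupby(df["order_date"].dt.date)["amount"]
      .sum()
      .reset_index(name="revenue")
)

# Load: append the aggregate into a reporting table
daily.to_sql("daily_revenue", engine, if_exists="append", index=False)
```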

Posted 1 day ago

Apply

5.0 - 7.0 years

0 Lacs

noida, uttar pradesh, india

On-site

Who We Are
Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products, enabling more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value. And that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders.

Who You Are
Our data team serves Zinnia through data engineering, data analysis, and data science. Our goal is to help uncover opportunities and make decisions with data. We partner with stakeholders in every department across the company to develop deeper predictors of behavior, develop insights that drive business strategy, and build solutions to optimize our internal and external experiences.

What You'll Do
- Oversee technological choices and the implementation of data pipelines and warehousing philosophy
- Execute and serve as lead and/or SME on cross-organizational and cross-divisional projects automating our data value chain processes
- Promote technical best practices throughout the data organization
- Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to work with data efficiently
- Mentor data team members in architecture and coding techniques
- Serve as a source of knowledge for the Data Engineering team on process improvement, automation, and new technologies that enable best-in-class timeliness and data coverage
- Design data pipelines utilizing ETL tools, event-driven software, and streaming software (a minimal orchestration sketch follows below)
- Partner with both data scientists and engineers to bring our concepts to reality, which requires learning to speak the language of statisticians as well as software engineers
- Ensure reliability in data pipelines and enforce data governance, security, and protection of our customers' information while balancing tech debt
- Demonstrate innovation, customer focus, and an experimentation mindset
- Partner with product and engineering teams to design data models for downstream data maximization
- Evaluate and champion new engineering tools that help us move faster and scale our team

What You'll Need
- A technical Bachelor's/Master's degree with 5+ years of experience across data engineering (data pipelining, warehousing, ETL tools, etc.)
- Extensive experience with data engineering techniques, Python, and SQL
- Familiarity and working knowledge of Airflow and dbt
- Comfort and expertise with data engineering tooling such as Jira, Git, Buildkite, Terraform, Airflow, dbt, and containers, as well as the GCP suite, Kubernetes, and Cloud Functions
- Understanding of standard ETL patterns, modern data warehousing ideas such as data mesh and data vaulting, and data quality practices around test-driven design and data observability
- Enjoyment of being a high-level architect sometimes, and a low-level coder at other times
- Passion for all things data: big data, small data, moving and transforming it, its quality, its accessibility, and delivering value from it to internal and external clients
- Desire for ownership to solve for, and lead a team to deliver, modern and efficient data pipeline components
- Passion for a culture of learning and teaching: you love challenging yourself to constantly improve and sharing your knowledge to empower others
- Willingness to take risks when looking for novel solutions to complex problems; if faced with roadblocks, you continue to reach higher to make greatness happen

Technologies you will use:
- Python for data pipelining and automation
- Airbyte for ETL
- Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, Datastore, and more: we keep adopting new tools as we grow!
- Airflow and dbt for data pipelining
- Tableau and Power BI for data visualization and consumer-facing dashboards

What's In It For You
At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.
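The listing names Airflow and dbt together; a common pattern is an Airflow DAG that runs an extraction step and then dbt build and test steps. A minimal sketch, assuming Airflow 2.x and a dbt project at a hypothetical path; the task commands are illustrative, not Zinnia's actual jobs.

```python
# A minimal Airflow + dbt orchestration sketch; paths and IDs are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="python /opt/jobs/extract.py",  # hypothetical extraction script
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )

    # Extraction must succeed before models build; tests gate downstream consumers.
    extract >> dbt_run >> dbt_test
```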

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Software Engineer III, Fullstack (Backend Heavy)
Team and Responsibilities
If you are someone who is passionate about the art of programming, cares about clean and semantic code, and keeps a tab on new developments in technology, then you are the right fit for us!
What you'll do:
- Design and develop software with high quality and take ownership
- Work collaboratively with product management
- Participate in a full development life cycle, including planning and code reviews
- Build solutions that can easily scale to the demands of Vimeo traffic bursts
- Ensure the best technical design and approach with an aim for continuous improvement
- Set high technical standards
Skills and knowledge you should possess:
- B.Tech/M.Tech in Computer Science or an equivalent degree
- Minimum 4 years of backend development experience with GoLang/PHP/Java and other languages (PHP preferred)
- Minimum 1 year of experience in React
- Strong troubleshooting, debugging, and testing skills
- Very good grasp of algorithms, data structures, time and space complexity, and problem solving in general
- Very good knowledge of the object-oriented programming paradigm and design patterns
- Sound knowledge of cloud technologies and concepts such as CDNs, caching, rate limiting, latency, and throughput (a small rate-limiting sketch follows below)
- Sound knowledge of database and caching technologies such as MySQL, Redis, Cloud SQL, and Memcache
- Good knowledge of designing systems and analyzing trade-offs between different choices
- Nice to have: exposure to various authorization and authentication models and technologies such as RBAC, ReBAC, SSO, SCIM, etc.
- Nice to have: a basic understanding of infrastructure technologies like Varnish, HAProxy, and the like
- Willingness to learn and experiment with new technology
About Us:
Vimeo (NASDAQ: VMEO) is the world's most innovative video experience platform. We enable anyone to create high-quality video experiences to better connect and bring ideas to life. We proudly serve our community of millions of users, from creative storytellers to globally distributed teams at the world's largest companies, whose videos receive billions of views each month. Learn more at www.vimeo.com. Vimeo is headquartered in New York City with offices around the world. At Vimeo, we believe our impact is greatest when our workforce of passionate, dedicated people represents our diverse and global community. We're proud to be an equal opportunity employer where diversity, equity, and inclusion are championed in how we build our products, develop our leaders, and strengthen our culture.
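Of the cloud concepts listed (caching, rate limiting), a fixed-window rate limiter on Redis is a compact illustration. The role prefers PHP/Go, but the pattern is language-agnostic; this sketch uses Python with redis-py and assumes a local Redis instance, purely as an example.

```python
# A minimal fixed-window rate-limit sketch, assuming redis-py and a local Redis.
import time

import redis

r = redis.Redis(host="localhost", port=6379)

def allow_request(user_id: str, limit: int = 100, window_s: int = 60) -> bool:
    """Allow at most `limit` requests per user within each fixed window."""
    # One counter key per user per time window
    key = f"rate:{user_id}:{int(time.time() // window_s)}"
    count = r.incr(key)          # atomic increment
    if count == 1:
        r.expire(key, window_s)  # first hit sets the window's TTL
    return count <= limit
```

Fixed windows are simple but allow bursts at window boundaries; sliding-window or token-bucket variants trade a little complexity for smoother limits.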

Posted 2 days ago

Apply

4.0 - 8.0 years

15 - 19 Lacs

bengaluru

Work from Office

Bengaluru, India | DevOps | BCM Industry | 25/04/2025
Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank. You should have extensive experience with Google Cloud Platform (GCP), Kubernetes, and Docker. The role involves working closely with our development and operations teams to ensure seamless integration and deployment of applications.
Responsibilities
- Design, implement, and manage CI/CD pipelines on GCP
- Automate infrastructure provisioning, configuration, and deployment using tools like Terraform and Ansible
- Manage and optimize Kubernetes clusters for high availability and scalability (see the sketch below)
- Containerize applications using Docker and manage container orchestration
- Monitor system performance, troubleshoot issues, and ensure system reliability and security
- Collaborate with development teams to ensure smooth and reliable operation of software and systems
- Implement and manage logging, monitoring, and alerting solutions
- Stay updated with the latest industry trends and best practices in DevOps and cloud technologies
Skills
Must have
- 6 to 9 years of experience as a DevOps Engineer, with a minimum of 4 years of relevant experience in GCP
- Bachelor's degree in Computer Science, Engineering, or a related field
- Strong expertise in Kubernetes and Docker
- Experience with infrastructure-as-code (IaC) tools such as Terraform and Ansible
- Proficiency in scripting languages like Python, Bash, or Go
- Familiarity with CI/CD tools such as Jenkins, GitLab CI, or CircleCI
- Knowledge of networking, security, and database management
- Excellent problem-solving skills and attention to detail
Nice to have
- Strong communication and collaboration skills
Other
Languages: English C2 Proficient
Seniority: Senior
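As a small illustration of the Kubernetes scripting such a role involves, here is a sketch that flags pods not in a Running or Succeeded state. It assumes the official kubernetes Python client and a kubeconfig pointed at the cluster; it is illustrative only, not part of the bank's tooling.

```python
# A minimal cluster health sketch, assuming the kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
v1 = client.CoreV1Api()

# Collect pods that are neither running nor completed successfully
not_healthy = [
    f"{p.metadata.namespace}/{p.metadata.name} ({p.status.phase})"
    for p in v1.list_pod_for_all_namespaces(watch=False).items
    if p.status.phase not in ("Running", "Succeeded")
]
print("\n".join(not_healthy) or "All pods healthy")
```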

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest! Apply : https://customerlabs.freshteam.com/jobs
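For the marketing-event ingestion described above, a minimal Pub/Sub-to-BigQuery sketch in Python (the stack this listing names) might look like the following. The project, subscription, and table IDs are hypothetical, and a production pipeline would add batching, schema validation, and dead-lettering.

```python
# A minimal streaming-ingestion sketch, assuming google-cloud-pubsub and
# google-cloud-bigquery; all resource names are hypothetical.
import json
from concurrent.futures import TimeoutError

from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client()
subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "marketing-events-sub")

def callback(message):
    row = json.loads(message.data)
    # Streaming insert into an existing table; errors leave the message unacked
    errors = bq.insert_rows_json("my-project.marketing.events", [row])
    if not errors:
        message.ack()

future = subscriber.subscribe(subscription, callback=callback)
try:
    future.result(timeout=60)  # pull messages for up to a minute
except TimeoutError:
    future.cancel()
```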

Posted 2 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.
In your role, you will be responsible for:
- Working skillfully across multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. (a small Cloud SQL example follows below)
- Applying Python and SQL work experience; being proactive and collaborative, with the ability to respond to critical situations
- Analyzing data for functional business requirements and interfacing directly with customers
Required education: Bachelor's degree
Preferred education: Master's degree
Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered, including the purpose/KPIs for which each data transformation was done
Preferred technical and professional experience
- Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, Java Content Repository API, Java 8+, localization
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices for designing and developing quality, clean code; knowledge of HTML, CSS, JavaScript, and jQuery
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
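Since Cloud SQL appears in both skill lists above, here is a minimal sketch of connecting to a Cloud SQL for PostgreSQL instance from Python using the cloud-sql-python-connector with SQLAlchemy and the pg8000 driver. The instance connection name and credentials are placeholders, not details from the posting.

```python
# A minimal Cloud SQL connection sketch, assuming cloud-sql-python-connector,
# SQLAlchemy, and pg8000; all connection details are hypothetical.
import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # "project:region:instance" is the Cloud SQL instance connection name
    return connector.connect(
        "my-project:asia-south1:orders-db",
        "pg8000",
        user="app_user",
        password="change-me",
        db="orders",
    )

engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)
with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())
```

The connector handles IAM-authorized, encrypted connectivity to the instance, so no IP allowlisting or client-side SSL certificates are needed.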

Posted 2 days ago

Apply

4.0 - 8.0 years

15 - 25 Lacs

pune, gurugram, bengaluru

Hybrid

Salary: 15 to 25 LPA
Experience: 4 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice period: immediate to 30 days

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.
As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs (see the sketch below)
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost
Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Capable of working effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record
Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or a related analytics area
- 3+ years of the experience must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
- Prior experience in managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
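As an illustration of the Databricks work described above, a minimal PySpark cleanse-and-aggregate step might look like this. The table names are hypothetical, and on Databricks the Spark session is normally provided by the runtime rather than built explicitly.

```python
# A minimal PySpark ETL sketch for a utilities-style dataset; names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("utilities-etl").getOrCreate()

raw = spark.read.table("raw.meter_readings")  # hypothetical source table

# Cleanse: drop incomplete rows and floor negative readings at zero
clean = (
    raw.dropna(subset=["meter_id", "reading_kwh"])
       .withColumn(
           "reading_kwh",
           F.when(F.col("reading_kwh") < 0, 0).otherwise(F.col("reading_kwh")),
       )
)

# Aggregate: daily consumption per meter
daily = (
    clean.groupBy("meter_id", F.to_date("read_ts").alias("read_date"))
         .agg(F.sum("reading_kwh").alias("daily_kwh"))
)

daily.write.mode("overwrite").saveAsTable("curated.daily_consumption")
```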

Posted 2 days ago

Apply

3.0 - 5.0 years

5 - 15 Lacs

pune

Hybrid

Responsibilities:
- Design, implement, and manage ETL pipelines on Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Composer); a minimal pipeline sketch follows below
- Write complex SQL queries and optimize them for BigQuery performance
- Work with structured/unstructured data from multiple sources (databases, APIs, streaming)
- Build reusable data frameworks for transformation, validation, and quality checks
- Collaborate with stakeholders to understand business requirements and deliver analytics-ready datasets
- Implement best practices in data governance, security, and cost optimization
Requirements:
- Bachelor's in Computer Science, IT, or a related field
- Experience in ETL/data engineering
- Strong Python and SQL skills
- Hands-on with GCP (BigQuery, Dataflow, Composer, Pub/Sub, Dataproc)
- Experience with orchestration tools (Airflow preferred)
- Knowledge of data modeling and data warehouse design
- Exposure to CI/CD, Git, and DevOps practices is a plus
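A minimal Apache Beam sketch of the Pub/Sub-to-BigQuery pipeline named in the first bullet. The topic and table names are hypothetical, the target table is assumed to already exist, and running on Dataflow would add the usual runner and project options.

```python
# A minimal streaming Beam pipeline sketch; resource names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events"
        )
        | "Parse" >> beam.Map(json.loads)          # message bytes -> dict
        | "KeepValid" >> beam.Filter(lambda e: "user_id" in e)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```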

Posted 3 days ago

Apply

7.0 - 12.0 years

5 - 15 Lacs

bengaluru

Work from Office

Role: GCP Staff Data Engineer
Experience: 8 - 13 years
Preferred: data engineering background
Location: Bangalore, Chennai, Hyderabad, Kolkata, Pune, Gurgaon
Job Requirement:
- Has implemented and architected solutions on Google Cloud Platform using GCP components
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning
- Programming experience in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases
- Google Professional Data Engineer/Solution Architect certification is a major advantage
Skills Required:
- 8+ years' experience in IT or professional services, in IT delivery or large-scale IT analytics projects
- Candidates must have expert knowledge of Google Cloud Platform; other cloud platforms are nice to have
- Expert knowledge in SQL development
- Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.)
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines
- Ability to identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.)
- Ability to implement data pipelines that automate the ingestion, transformation, and augmentation of data sources, and to provide best practices for pipeline operations (a minimal example follows below)
- Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehousing, etc.) and in using databases in a business environment with complex datasets
Required Skills: GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub
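Tying together the Cloud Functions and BigQuery skills listed above, here is a minimal sketch of a first-generation background Cloud Function (Python runtime) that loads a newly arrived GCS object into BigQuery. The bucket, dataset, and table names are hypothetical.

```python
# A minimal event-driven load sketch, assuming google-cloud-bigquery and a
# GCS-triggered background Cloud Function; names are hypothetical.
from google.cloud import bigquery

def load_to_bq(event, context):
    """Triggered when a new object lands in the configured GCS bucket."""
    client = bigquery.Client()
    uri = f"gs://{event['bucket']}/{event['name']}"
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # assume a header row
        autodetect=True,       # infer schema for this sketch
    )
    # Block until the load job finishes so failures surface in the function logs
    client.load_table_from_uri(
        uri, "my-project.staging.raw_events", job_config=job_config
    ).result()
```

The same flow is often orchestrated instead through Cloud Composer when loads must coordinate with downstream transformations.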

Posted 3 days ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.
Responsibilities
- Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives
- Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production
- Be adaptable to different responsibilities, and possess strong communication skills in order to work effectively with team members and stakeholders
- Design and deliver front-to-back technical solutions and integrate them into business processes
- Participate in hands-on coding, code reviews, architectural decisions, and reviews
- Work in an Agile systems development life cycle
Skills
Must have
- Overall 2 to 4 years of experience as a Java Developer
- 2+ years of experience developing in Core Java and the Spring Framework
- Google Cloud Platform experience
- Has worked with the latest features of Java 8, 11, and 17 in development
- Solid understanding of data structures
- Good hands-on coding skills
- Experience in Kafka or other messaging
- Knowledge of key APIs: JPA, JTA, CDI, etc.
- Knowledge of various design and architectural patterns
- Understanding of microservices architecture
- Containerization solutions (e.g., Docker, Kubernetes, OpenShift)
- Build tools (e.g., Maven, Gradle)
- Version control (e.g., Git)
- Continuous integration systems (e.g., TeamCity, Jenkins)
- English: Upper-Intermediate
- Well versed in the concepts of references, class instances, methods, objects, constructors, mutable and immutable classes, functional interfaces, ArrayLists, LinkedLists, HashMaps, collections, the difference between recoverable and non-recoverable exceptions, Inversion of Control, and designing a data structure that supports insert, delete, and search in constant time, etc.
Nice to have
- Banking domain

Posted 3 days ago

Apply

3.0 - 7.0 years

11 - 15 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.
Responsibilities
- Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives
- Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production
- Be adaptable to different responsibilities, and possess strong communication skills in order to work effectively with team members and stakeholders
- Design and deliver front-to-back technical solutions and integrate them into business processes
- Participate in hands-on coding, code reviews, architectural decisions, and reviews
- Work in an Agile systems development life cycle
Skills
Must have
- Overall 6-9 years of experience as a Java Developer
- 6+ years of experience developing in Core Java and the Spring Framework
- Google Cloud Platform experience
- Has worked with the latest features of Java 8, 11, and 17 in development
- Solid understanding of data structures
- Good hands-on coding skills
- Experience in Kafka or other messaging
- Knowledge of key APIs: JPA, JTA, CDI, etc.
- Knowledge of various design and architectural patterns
- Understanding of microservices architecture
- Containerization solutions (e.g., Docker, Kubernetes, OpenShift)
- Build tools (e.g., Maven, Gradle)
- Version control (e.g., Git)
- Continuous integration systems (e.g., TeamCity, Jenkins)
- English: Upper-Intermediate
- Well versed in the concepts of references, class instances, methods, objects, constructors, mutable and immutable classes, functional interfaces, ArrayLists, LinkedLists, HashMaps, collections, the difference between recoverable and non-recoverable exceptions, Inversion of Control, and designing a data structure that supports insert, delete, and search in constant time, etc.
Nice to have
- Banking domain

Posted 3 days ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.
Responsibilities
- Develop solutions following established technical design, application development standards, and quality processes in projects
- Assess the impact on technical design of changes in functional requirements
- Perform independent code reviews and execute unit tests on modules developed by self and other junior team members on the project
- Write well-designed, efficient, and testable code
- Interact with other stakeholders, including end-user clients, the project manager or scrum master, business analysts, offshore development, testing, and other cross-functional teams
Skills
Must have
- 10+ years of Java development experience, with 3+ years of architecture design experience
- BS/MS degree in Computer Science, Software Engineering, or a related subject
- Google Cloud Platform experience
- Comfortable practicing TDD and pair programming; well-versed in DevOps
- Good knowledge of object-oriented design principles and hands-on experience with object-oriented programming
- Good knowledge of the Java standard library; hands-on experience with Spring and/or Spring Boot is a big plus
- Experience in agile software development
- Well versed in solution architecture and principles including, but not limited to: SOLID; Hexagonal (Ports and Adapters); Cloud Native; microservices patterns
- Experience in large enterprise system integrations and architecture
- Strong understanding of, and hands-on experience with, designing scalable, highly available, reliable, resilient, secure, and performant systems
- Good presentation, documentation, and communication skills
- Knowledge of Linux is a plus; knowledge of cloud platforms is a plus
- Knowledge of TOGAF or Zachman frameworks is desirable
- Good to have an understanding of application security frameworks and standards, e.g., OWASP, NIST
- 4+ progressive years of experience in building and implementing model-driven, enterprise-level business solutions and applications in PRPC
- Excellent time management and organization skills, and the ability to manage multiple competing priorities
- Exceptional interpersonal skills and the ability to communicate, partner, and collaborate
- Dedication to achieving outstanding customer results, with a team-oriented drive and a demonstrated ability to lead by example
- Exposure to a variety of technologies, including object-oriented techniques/principles, database design, and application and web servers
- Aptitude to pick up new concepts and technology rapidly, and the ability to explain them to both business and IT stakeholders
- Ability to match technology solutions to customer needs
Nice to have
- Banking domain

Posted 3 days ago

Apply

3.0 - 6.0 years

9 - 14 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.
Responsibilities
- Participate in the full application development lifecycle for the development of Java applications, microservices, and reusable components to support overall project objectives
- Leverage design patterns, test-driven development (TDD), and behaviour-driven development (BDD) to build software that is reliable and easy to support in production (a small BDD-style illustration follows below)
- Be adaptable to different responsibilities, and possess strong communication skills in order to work effectively with team members and stakeholders
- Design and deliver front-to-back technical solutions and integrate them into business processes
- Participate in hands-on coding, code reviews, architectural decisions, and reviews
- Work in an Agile systems development life cycle
Skills
Must have
- 8+ years of experience in implementing and testing practices across the full software development lifecycle
- 6+ years of experience in Java development/maintenance/testing
- Technology knowledge and experience in automation testing using Java is essential
- Good experience in BDD with Cucumber
- Good experience in Selenium
- Experience with industry-standard test tools (e.g., HP ALM 11)
- Experience in planning and executing testing across projects
- Customer and service orientation to support interaction with team resources and clients
- Performance- and productivity-oriented, to drive toward quality testing outcomes and results
- Proactively initiate, develop, and maintain effective working relationships with team members
- Demonstrated ability to work with a variety of people and achieve results
Nice to have
- Banking domain
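The role asks for Java with Cucumber and Selenium; purely as an illustration of the Given/When/Then flow those tools automate, here is the equivalent in Python's Selenium bindings (the structure maps one-to-one onto Java step definitions). The page URL and element IDs are hypothetical, and a locally installed chromedriver is assumed.

```python
# A minimal BDD-style Selenium sketch; URL and element IDs are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Given the user is on the login page
    driver.get("https://example.com/login")

    # When they submit valid credentials
    driver.find_element(By.ID, "username").send_keys("test_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()

    # Then the dashboard is shown
    assert "Dashboard" in driver.title
finally:
    driver.quit()
```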

Posted 3 days ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

bengaluru

Work from Office

Project description
Developing state-of-the-art, cloud-based compliance archive products to archive and retain real-time communications data in line with internal and external regulatory requirements. The product is developed in-house in an agile setup based on business requirements from multiple stakeholder parties. By employing continuous development and deployment principles, the team is aiming to transition from project to product management to support the bank with robust compliance archive solutions for the next decade. For a large global investment bank, we are looking for GCP-qualified cloud engineers to help with the FIC (fixed income and currencies) cloud migrations under Project Cirrus. Knowledge of Financial Services/FIC would be great, but the primary skills we need are in building, migrating, and deploying applications to GCP, Terraform module coding, Google infrastructure, cloud-native services such as GCE, GKE, and Cloud SQL/Postgres, and logging and monitoring, plus good written and spoken English, as we would like these engineers to help with knowledge transfer to our existing development and support teams. We would like to place people alongside the engineers they'll be working with in the bank.
Responsibilities
- CI/CD setup, management, and improvement: automate build and deployment pipelines and procedures
- Application infrastructure support, including hardware, networks, OS, frameworks, and related tools
- Test environment support
- Follow internal security operations procedures
- Application and infrastructure audit compliance
- Deploy application releases; provide SL3 application on-call rota/weekend support
- Perform root cause analysis of production errors and resolve technical issues
- Cover monitoring and alerting needs (a minimal monitoring sketch follows below)
- Certificate installations and renewals
- Compliance-related changes for the application based on internal compliance requirements
Skills
Must have
- 6+ years of experience as a DevOps/SRE engineer or in a similar software engineering role
- 6+ years of experience with Linux diagnosis and support; deep Bash knowledge
- 5+ years of experience with Python coding
- Strong TCP/IP
- 5+ years of experience with CI/CD: Jenkins, full build/test flow support (support, development, and improvement), Groovy, Artifactory
- Application release cycle experience: release engineer, SL3 support
- IaC: Ansible, Git
- Docker experience
- Proficiency with monitoring
- Basic SQL
- Analytical thinking
- Good verbal and written communication skills
- Proactive approach to identifying problems, performance bottlenecks, and areas for improvement
- Self-motivated and independent in assigned tasks and projects, while working as a team player within the area of responsibility
Nice to have
- Banking domain
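As a small example of the monitoring-and-alerting scripting this role covers, here is a standard-library-only Python sketch that probes health endpoints and prints an alert line for failures. The URLs are hypothetical, and a real setup would feed a pager or dashboard rather than stdout.

```python
# A minimal health-probe sketch using only the Python standard library.
import urllib.error
import urllib.request

ENDPOINTS = [
    "https://example.com/healthz",      # hypothetical service endpoints
    "https://example.com/api/status",
]

def check(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, TimeoutError):
        return False

for url in ENDPOINTS:
    status = "OK" if check(url) else "ALERT"
    print(f"{status} {url}")
```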

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Software Engineer at HSBC, you will play a crucial role in designing, implementing, and managing scalable, secure, and reliable cloud infrastructure on the Google Cloud Platform (GCP). Your responsibilities will include collaborating with development teams to optimize applications for cloud deployment, setting up and configuring various GCP services, and ensuring compliance with security policies and best practices. To excel in this role, you should have proven experience as a Cloud Engineer with a strong focus on GCP. Your expertise should include a deep understanding of cloud architecture and services such as Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery. Additionally, you will be expected to automate infrastructure provisioning using tools like Terraform, Google Cloud Deployment Manager, or similar, and implement CI/CD pipelines for efficient software delivery. The successful candidate will possess proficiency in scripting languages like Python and Bash, as well as the ability to troubleshoot and resolve issues related to cloud infrastructure and services (a small scripting sketch follows below). Google Cloud certifications, particularly the Google Cloud Professional Cloud Architect certification, are considered a plus. Staying updated with the latest GCP features, services, and best practices is essential for this role. Any knowledge of other cloud platforms like AWS or Azure will be an added advantage. If you are a skilled and experienced Google Cloud Engineer seeking a career where your expertise is valued, HSBC offers an inclusive and diverse environment where employees are respected, valued, and provided with opportunities for continuous professional development and growth. Join us at HSBC and realize your ambitions in a workplace that prioritizes employee well-being and career advancement. For more information about career opportunities at HSBC, visit www.hsbc.com/careers.
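As an illustration of the Python scripting plus GCP troubleshooting the posting mentions, here is a minimal sketch using google-cloud-compute to flag Compute Engine instances that are not running. The project and zone are placeholders, not HSBC resources.

```python
# A minimal GCP automation sketch, assuming google-cloud-compute; the project
# and zone are hypothetical placeholders.
from google.cloud import compute_v1

instances = compute_v1.InstancesClient()

# List instances in one zone and report any that are not in the RUNNING state
for inst in instances.list(project="my-project", zone="asia-south1-a"):
    if inst.status != "RUNNING":
        print(f"{inst.name}: {inst.status}")
```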

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As Ford Motor Company embarks on a significant multi-year Platform Lifecycle Management (PLM) program to modernize critical IT applications across the enterprise, you have a unique opportunity to join the team as a GenAI Technical Manager (LL6). In this role, you will play a pivotal part in driving the practical adoption of Generative AI (GenAI) at Ford, with a focus on creating accelerators for the PLM modernization effort and enhancing the Ford Developer Experience (DX). Your expertise will be crucial in leading the technical development and implementation of GenAI solutions within this strategic program.

You will collaborate closely with various teams including PLM program leaders, GDIA (Global Data Insight & Analytics), architecture teams, PDOs (Product Driven Organizations), and engineering teams to design, build, and deploy cutting-edge GenAI tools and platforms. Your responsibilities will include leading the technical design, development, testing, and deployment of GenAI solutions, translating GenAI strategy into actionable projects, managing the technical lifecycle of GenAI tools, and overseeing the integration of GenAI capabilities into existing workflows and processes.

As a technical expert in GenAI models and frameworks, you will provide guidance to development teams, architects, and stakeholders on best practices, architecture patterns, security considerations, and ethical AI principles. You will stay updated on the evolving GenAI landscape, evaluate new tools and models, and lead the development of GenAI-powered accelerators and tools to automate and streamline processes within the PLM program.

Collaboration and stakeholder management will be key aspects of your role, requiring effective communication of complex technical concepts to diverse audiences. You will also lead proof-of-concept projects with emerging GenAI technologies, champion experimentation and adoption of successful tools and practices, and mentor junior team members in GenAI development tasks.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Software Engineering, Artificial Intelligence, or a related field, along with 8-10+ years of experience in software development/engineering with a focus on AI/ML and Generative AI solutions. Deep practical expertise in GenAI, a strong software development foundation, and familiarity with enterprise application context are essential qualifications. Preferred qualifications include GCP certifications, experience with Agile methodologies, and familiarity with PLM concepts or the automotive industry.

If you are passionate about innovation in the AI space, possess strong analytical and strategic thinking skills, and excel in a fast-paced, global environment, we invite you to join us in shaping the future of AI at Ford Motor Company.
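For a flavor of the GenAI building blocks such a role works with, below is a minimal, hedged sketch of calling a Gemini model on Vertex AI with the google-cloud-aiplatform SDK. The project, region, model name, and prompt are illustrative assumptions, not Ford's actual stack, and the SDK surface changes between releases.

```python
"""Sketch: one Gemini call on Vertex AI, the kind of primitive a
code-modernization accelerator might wrap. Assumes
`pip install google-cloud-aiplatform` and default credentials."""
import vertexai
from vertexai.generative_models import GenerativeModel

# Hypothetical project/region; substitute your own.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")  # illustrative model choice
prompt = "Explain what this legacy function does and suggest a modern rewrite:\n..."
response = model.generate_content(prompt)
print(response.text)
```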

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

One of our prestigious clients, a TOP MNC Giant with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore. **Qualifications and Certifications:** **Education:** - Bachelor's or master's degree in Computer Science, Information Technology, Engineering, or a related field. **Experience:** - A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms. - Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer. **Certifications:** - Google Cloud Professional Data Engineer certification is preferred. **Key Skills:** **Mandatory Skills:** - Advanced proficiency in Python for developing data pipelines and automation. - Strong SQL skills for querying, transforming, and analyzing large datasets. - Hands-on experience with various GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE). - Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket. - Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC). - Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer. - Strong understanding of Agile/Scrum methodologies. **Nice-to-Have Skills:** - Experience with other cloud platforms like AWS or Azure. - Familiarity with data visualization tools such as Power BI, Looker, or Tableau. - Understanding of machine learning workflows and their integration with data pipelines. **Soft Skills:** - Strong problem-solving and critical-thinking abilities. - Excellent communication skills to effectively collaborate with both technical and non-technical stakeholders. - Proactive attitude towards innovation and continuous learning. - Ability to work independently and as part of a collaborative team. If you are interested in this opportunity, please reply with your updated CV and provide the following details: - Total experience: - Relevant experience in data engineering: - Relevant experience in GCP cloud platforms: - Relevant experience as an Enterprise Architect: - Availability to join ASAP: - Preferred location (Pune / Mumbai / Bangalore): We will contact you once we receive your CV along with the above-mentioned details. Thank you, Kavita A.
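To ground the mandatory Python + SQL + BigQuery combination, here is a minimal sketch of running a parameterized BigQuery query from Python with the google-cloud-bigquery client. It assumes `pip install google-cloud-bigquery` and default credentials; the project, dataset, and table names are hypothetical.

```python
"""Sketch: parameterized BigQuery query from Python."""
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

query = """
    SELECT status, COUNT(*) AS n
    FROM `my-project.sales.orders`      -- hypothetical table
    WHERE order_date >= @cutoff
    GROUP BY status
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("cutoff", "DATE", "2024-01-01"),
    ]
)

# query() returns a job; result() blocks until rows are available.
for row in client.query(query, job_config=job_config).result():
    print(row["status"], row["n"])
```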

Posted 2 weeks ago

Apply

7.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
Looking for an experienced GCP Cloud/DevOps Engineer (and/or OpenShift) to design, implement, and manage cloud infrastructure and services across multiple environments. This role requires deep expertise in Google Cloud Platform (GCP) services, DevOps practices, and Infrastructure as Code (IaC). The candidate will deploy, automate, and maintain high-availability systems and implement best practices for cloud architecture, security, and DevOps pipelines.

Requirements
Bachelor's or master's degree in Computer Science, Information Technology, or a similar field.
7+ years of extensive experience in designing, implementing, and maintaining applications on GCP and OpenShift.
Comprehensive expertise in GCP services such as GKE, Cloud Run, Cloud Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring.
Solid understanding of cloud security best practices and experience in implementing security controls in GCP.
Thorough understanding of cloud architecture principles and best practices.
Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles.
Proven leadership skills and the ability to mentor and guide a technical team.

Key Responsibilities
Cloud Infrastructure Design and Deployment: Architect, design, and implement scalable, reliable, and secure solutions on GCP. Deploy and manage GCP services in both development and production environments, ensuring seamless integration with existing infrastructure. Implement and manage core services such as BigQuery, Data Fusion, Cloud Composer (Airflow), Cloud Storage, Compute Engine, App Engine, Cloud Functions, and more.
Infrastructure as Code (IaC) and Automation: Develop and maintain infrastructure as code using Terraform or CLI scripts to automate provisioning and configuration of GCP resources. Establish and document best practices for IaC to ensure consistent and efficient deployments across environments.
DevOps and CI/CD Pipeline Development: Create and manage DevOps pipelines for automated build, test, and release management, integrating with tools such as Jenkins, GitLab CI/CD, or equivalent. Work with development and operations teams to optimize deployment workflows, manage application dependencies, and improve delivery speed.
Security and IAM Management: Handle user and service account management in Google Cloud IAM. Set up and manage Secret Manager and Cloud Key Management for secure storage of credentials and sensitive information. Implement network and data security best practices to ensure compliance and security of cloud resources.
Performance Monitoring and Optimization: Set up observability tools like Prometheus and Grafana, and integrate security tools (e.g., SonarQube, Trivy). Set up monitoring and logging (e.g., Cloud Monitoring, Cloud Logging, Error Reporting) to ensure systems perform optimally. Troubleshoot and resolve issues related to cloud services and infrastructure as they arise.
Networking and Storage: Configure DNS, networking, and persistent storage solutions in Kubernetes.
Workflow Orchestration: Orchestrate complex workflows using the Argo Workflow Engine.
Containerization: Work extensively with Docker for containerization and image management; troubleshoot and optimize containerized applications for performance and security.

Technical Skills
Expertise with GCP and OCP (OpenShift) services, including but not limited to Compute Engine, Kubernetes Engine (GKE), BigQuery, Cloud Storage, Pub/Sub, Data Fusion, Airflow, Cloud Functions, and Cloud SQL.
Proficiency in scripting languages like Python, Bash, or PowerShell for automation.
Familiarity with DevOps tools and CI/CD processes (e.g., GitLab CI, Cloud Build, Azure DevOps, Jenkins).

Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
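As a small illustration of one of the GCP services named above, the sketch below publishes a message to a Pub/Sub topic with the google-cloud-pubsub client. It assumes `pip install google-cloud-pubsub` and default credentials; the project and topic IDs are placeholders.

```python
"""Sketch: publish a message to a Pub/Sub topic."""
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Hypothetical project and topic.
topic_path = publisher.topic_path("my-gcp-project", "my-topic")

# publish() is asynchronous: it returns a future resolving to the message ID.
future = publisher.publish(topic_path, b"hello", origin="healthcheck")
print(f"Published message {future.result()}")
```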

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will be responsible for providing 2nd Level Application Support for business applications used in branches, by mobile sales, or via the internet. Your expertise in Incident Management and Problem Management will be crucial in ensuring the stability of these applications.

Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in operating and further developing applications and functionalities on the cloud platform. Your focus will also extend to regulatory topics surrounding partner/client relationships. We are seeking individuals who can contribute to this contemporary and emerging cloud application area.

Key Responsibilities:
- Ensure optimum service level to supported business lines
- Oversee resolution of incidents and problems within the team
- Assist in managing business stakeholder relationships
- Define and manage OLAs with relevant stakeholders
- Monitor team performance, adherence to processes, and alignment with business SLAs
- Manage escalations and work with relevant functions to resolve issues quickly
- Identify areas for improvement and implement best practices in your area of expertise
- Mentor and coach Production Management Analysts within the team
- Fulfill Service Requests, communicate with the Service Desk function, and participate in major incident calls
- Document tasks, incidents, problems, changes, and knowledge bases
- Improve monitoring of applications and implement automation of tasks

Skills and Experience:
- Service Operations Specialist experience in a global operations context
- Extensive experience supporting complex application and infrastructure domains
- Ability to manage and mentor Service Operations teams
- Strong ITIL/best practice service context knowledge
- Proficiency in interface technologies, communication protocols, and ITSM tools
- Bachelor's Degree in IT or a Computer Science related discipline
- ITIL certification and experience with the ITSM tool ServiceNow preferred
- Knowledge of the Banking domain and regulatory topics
- Experience with databases like BigQuery and understanding of Big Data and GCP technologies
- Proficiency in tools like GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, Dataflow
- Architectural skills for big data solutions and interface architecture

Area-Specific Tasks/Responsibilities:
- Handle Incident/Problem Management and Service Request Fulfilment
- Analyze and resolve incidents escalated from 1st Level Support
- Support the resolution of high-impact incidents and escalate when necessary
- Provide solutions for open problems and support service transition for new projects/applications

Joining our team, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career progression. We value diversity and promote a positive, fair, and inclusive work environment at Deutsche Bank Group. Visit our company website for more information.
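To illustrate the kind of monitoring automation this role mentions, here is a hedged sketch that pulls recent error entries from Cloud Logging with the google-cloud-logging client. The filter, resource type, and timestamp are assumptions for illustration, not the team's actual configuration; it assumes `pip install google-cloud-logging` and default credentials.

```python
"""Sketch: list recent ERROR-level log entries for triage."""
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()

# Hypothetical filter: errors from Compute Engine instances since a cutoff.
log_filter = (
    'severity>=ERROR AND resource.type="gce_instance" '
    'AND timestamp>="2024-01-01T00:00:00Z"'
)

for entry in client.list_entries(filter_=log_filter, max_results=20):
    print(entry.timestamp, entry.severity, entry.payload)
```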

Posted 3 weeks ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider throughout India and many other countries in the world. The client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description:
At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Google Cloud Platform

Experience Required:
GCP Data Engineer certified.
5+ years successfully designing and implementing data warehouses and ETL processes, delivering high-quality data solutions.
5+ years of complex SQL development experience.
2+ years of experience with programming languages such as Python, Java, or Apache Beam.
3+ years of GCP expertise as a cloud engineer, specializing in taking cloud infrastructure and applications into production-scale solutions.
Additional exposure: Terraform, Tekton, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes.

Experience Preferred:
In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage.
DevOps tools such as Tekton, GitHub, Terraform, Docker.
Expert in designing, optimizing, and troubleshooting complex data pipelines.
Experience developing with microservice architecture from a container orchestration framework.
Experience in designing pipelines and architectures for data processing.
Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques.
Self-directed; works independently with minimal supervision and adapts to ambiguous environments.
Evidence of a proactive problem-solving mindset and willingness to take the initiative.
Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas to cross-functional teams and all levels of management.
Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
Data engineering or development experience gained in a regulated financial environment.
Experience in coaching and mentoring data engineers.
Project management tools like Atlassian Jira.
Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
Experience with data security, governance, and compliance best practices in the cloud.
Experience with AI solutions or platforms that support AI solutions.
Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years
Education Required: Bachelor's Degree
TekWissen® Group is an equal opportunity employer supporting workforce diversity.
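Since the role asks for Apache Beam experience alongside Dataflow, a minimal Beam pipeline may help illustrate the programming model. This sketch runs locally on the DirectRunner (`pip install apache-beam`); switching to Dataflow would only change the pipeline options, and the inlined data stands in for a real source.

```python
"""Sketch: a minimal Apache Beam pipeline (word/key aggregation)."""
import apache_beam as beam

with beam.Pipeline() as pipeline:  # DirectRunner by default
    (
        pipeline
        | "Read" >> beam.Create(["alpha,1", "beta,2", "alpha,3"])  # stand-in source
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "ToKV" >> beam.Map(lambda parts: (parts[0], int(parts[1])))
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)  # a real pipeline would write to BigQuery/GCS
    )
```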

Posted 3 weeks ago

Apply

9.0 - 14.0 years

7 - 14 Lacs

Hyderabad, Pune

Hybrid

Role & responsibilities

Key skills required:
- 8+ years of hands-on experience in cloud application architecture, with a focus on creating scalable and reliable software systems.
- 8+ years of experience using Google Cloud Platform (GCP), including but not restricted to services like BigQuery, Cloud SQL, Firestore, and Cloud Composer.
- Experience with security, identity and access management; networking protocols such as TCP/IP and HTTPS; network security design including segmentation, encryption, logging, and monitoring; network topologies, load balancing, and segmentation.
- Python for REST APIs and microservices: design and development guidance; Python with GCP, Cloud SQL/PostgreSQL, and BigQuery; integration of Python APIs with front-end applications built on React JS.
- Unit testing frameworks: Python (unittest, pytest), Java (JUnit, Spock, Groovy).
- DevOps automation processes such as Jenkins and Docker deployments; code deployments on VMs.
- Validating an overall solution from the perspective of infrastructure performance, scalability, security, and capacity, and creating effective mitigation plans.
- Automation technologies: Terraform or Google Cloud Deployment Manager, Ansible.
- Implementing solutions and processes to manage cloud costs.
- Experience in providing solutions for web applications: requirements and design knowledge, React JS, Elastic Cache, GCP IAM, Managed Instance Groups, VMs, and GKE.
- Owning the end-to-end delivery of solutions, including developing, testing, and releasing Infrastructure as Code.
- Translate business requirements/user stories into practical, scalable solutions that leverage the functionality and best practices of HSBC.
- Executing technical feasibility assessments, solution estimations, and proposal development for moving identified workloads to GCP.
- Designing and implementing secure, scalable, and innovative solutions to meet the bank's requirements.
- Ability to interact and influence across all organizational levels on technical or business solutions.
- Certified Google Cloud Architect would be an add-on.
- Create and own scaling, capacity planning, configuration management, and monitoring of processes and procedures.
- Create, put into practice, and use cloud-native solutions.
- Lead the adoption of new cloud technologies and establish best practices for them.
- Experience establishing technical strategy and architecture at the enterprise level.
- Experience leading GCP cloud project delivery.
- Collaborate with IT security to monitor cloud privacy, and with architecture, DevOps, data, and integration teams to ensure best practices are followed throughout cloud adoption.
- Respond to technical issues and provide guidance to the technical team.

Mandatory Skills: GCP Storage, GCP BigQuery, GCP Dataproc, GCP Vertex AI, GCP Spanner, GCP Dataprep, GCP Datastream, Google Analytics Hub, GCP Dataform, GCP Dataplex/Catalog, GCP Cloud Datastore/Firestore, GCP Data Fusion, GCP Pub/Sub, GCP Cloud SQL, GCP Cloud Composer, Google Looker, GCP Data Architecture, Google Cloud IAM, GCP Bigtable, GCP Dataflow
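Given the emphasis on Python for REST APIs and microservices, below is a minimal FastAPI sketch of the style of service described. It assumes FastAPI and uvicorn (`pip install fastapi uvicorn`); the endpoints and payloads are illustrative, and a real service would back them with Cloud SQL/PostgreSQL rather than returning canned data.

```python
"""Sketch: a minimal Python REST microservice."""
from fastapi import FastAPI

app = FastAPI(title="demo-service")

@app.get("/health")
def health() -> dict:
    # A load balancer or GKE liveness probe would poll this endpoint.
    return {"status": "ok"}

@app.get("/accounts/{account_id}")
def get_account(account_id: int) -> dict:
    # Illustrative stub: a real handler would query Cloud SQL/PostgreSQL here.
    return {"id": account_id, "status": "active"}

# Run locally with: uvicorn main:app --port 8080
```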

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

You should have 6-10 years of experience in development, specifically in Java/J2EE, with strong knowledge of core Java. You must also be proficient in Spring frameworks, particularly Spring MVC, Spring Boot, and JPA + Hibernate. Hands-on experience with microservice technology, including development of RESTful and SOAP web services, is essential, as is a good understanding of Oracle DB. Your communication skills, especially when interacting with clients, should be excellent. Experience with build tools like Maven, deployments, and troubleshooting is necessary. Knowledge of CI/CD tools such as Jenkins and experience with Git or similar source control tools is expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and Bitbucket, and have experience performing requirement analysis.

It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend and backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS infrastructure applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required.

TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merit, competence, and performance without regard to any characteristic related to diversity.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Open Position: Data Engineer (GCP), Technology. Location: Open.

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Responsibilities:
Be an integral part of large-scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects
Developing data pipelines (batch/streaming)
Developing complex data transformations
ETL orchestration
Data migration
Develop and maintain data warehouses / data lakes

Qualifications & Experience:
A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience

Desired Technical Skills:
Data Engineering and Analytics on Google Cloud Platform: basic cloud computing concepts; BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
Python, Google Cloud Python SDK, SQL
Experience in working with any NoSQL/columnar/MPP database
Experience in working with any ETL tool (Informatica/DataStage/Talend/Pentaho etc.)
Strong knowledge of database concepts, data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, MPP architecture

Other Desired Skills:
Excellent communication and coordination skills
Problem understanding, articulation, and solutioning
Quick learner and adaptable with regard to new technologies
Ability to research and solve technical issues

Good To Have:
Experience in working with Apache Spark / Kafka
Machine learning concepts
Google Cloud Professional Data Engineer Certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
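As the responsibilities include ETL orchestration with Cloud Composer, here is a minimal Airflow DAG sketch of the kind Composer schedules. It assumes Airflow 2.4+ (`pip install apache-airflow`); the DAG ID, schedule, and task bodies are placeholders standing in for real extract/load logic.

```python
"""Sketch: a minimal two-step Airflow DAG (extract, then load)."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("pulling source data...")       # placeholder for a real extract

def load() -> None:
    print("loading into BigQuery...")     # placeholder for a real load

with DAG(
    dag_id="daily_etl_sketch",            # hypothetical DAG ID
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task             # load runs only after extract succeeds
```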

Posted 3 weeks ago

Apply