
23 CloudSQL Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You should have at least 6+ years of experience in Java, Spring Boot, Microservices, ReactJS, and product development and sustenance. Troubleshooting and debugging existing code will be part of your responsibilities. It is essential to be proficient in code quality, security compliance, and application performance management. Your role will also involve participation in the agile planning process and estimation of planned tasks. Good verbal and written communication skills are necessary, along with expertise in unit testing (JUnit). As part of your key responsibilities and deliverables, you will be responsible for feature implementation and delivering production-ready code. Technical documentation and system diagrams, debugging reports and fixes, as well as performance optimizations, will also be expected from you.

Qualifications and Experience:
- 6+ years of experience in developing and designing software applications using Java
- Expert understanding of core computer science fundamentals such as data structures, algorithms, and concurrent programming
- Experience in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
- Proficiency in OOAD and design principles, implementing microservices architecture using technologies including JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, CloudSQL PostgreSQL, BigTable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
- Experience working in Native & Hybrid Cloud environments
- Familiarity with Agile development methodology
- Strong collaboration and communication skills to work effectively across product and technology teams
- Ability to translate strategic priorities into scalable and user-centric solutions
- Detail-oriented problem solver with excellent communication skills and a can-do attitude
- Experience with Java, Java IDEs like Eclipse or IntelliJ, Java EE application servers, object-oriented design, Git, Maven, scripting languages, JSON, XML, YAML, Terraform, etc.

Preferred Skills/Experience:
- Experience with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, and SAFe methodologies
- Deep knowledge of creating secure solutions by design, multi-threaded backend environments, and tools/languages like Ruby, Python, Perl, Node.js, bash, Spring, Spring Boot, C, C++, Docker, Kubernetes, Oracle, etc.

Working with GlobalLogic offers a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. You'll have the chance to collaborate with innovative clients and work on cutting-edge solutions that shape the world today. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating impactful digital products and experiences, collaborating with clients to transform businesses through intelligent products and services.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Java Developer with over 6 years of experience in Java, Spring Boot, Microservices, and ReactJS, you will be responsible for troubleshooting and debugging existing code as necessary. Your proficiency in ensuring code quality, security compliance, and application performance management will be crucial to the success of the projects. You will actively participate in the agile planning process and estimate planned tasks while possessing good verbal and written communication skills. Additionally, your expertise in unit testing, particularly with JUnit, will be essential in ensuring the overall quality of the software.

Your key responsibilities will include feature implementation and delivering production-ready code, along with creating technical documentation and system diagrams. You will also be tasked with generating debugging reports, implementing fixes, and optimizing performance to enhance the overall efficiency of the systems.

To excel in this role, you should have a solid foundation in core computer science fundamentals, including data structures, algorithms, and concurrent programming. Your experience should demonstrate a deep understanding of software design principles, microservices architecture, and technologies such as JEE, Spring, Hibernate, Oracle, CloudSQL PostgreSQL, BigTable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow. Experience with Native & Hybrid Cloud environments, Agile development methodologies, and proficiency in programming languages like Python and Java will be beneficial.

You are expected to collaborate effectively with the product and technology teams, translating strategic priorities into scalable and user-centric solutions. Your attention to detail and problem-solving skills will be critical in addressing complex issues and delivering effective solutions. Strong communication skills and a proactive, team-oriented attitude are essential for success in this role.

Preferred skills and experience include familiarity with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies, and creating secure solutions by design. Experience with multi-threaded backend environments, Docker, Kubernetes, and scripting languages like Ruby, Python, Perl, Node.js, and bash will be advantageous.

At GlobalLogic, we value a culture of caring, continuous learning and development, meaningful work, balance, flexibility, and integrity. As part of our team, you will have the opportunity to work on impactful projects, grow personally and professionally, and collaborate with forward-thinking clients on cutting-edge solutions that shape the world today. Join us and be a part of our commitment to engineering impact and transforming businesses through intelligent digital products and services.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled Data Engineer, you will leverage your expertise to contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing data engineering processes. Your advanced knowledge of SQL programming and database management will be key in ensuring the efficiency of data operations.

In this role, you will utilize your solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics. Your understanding of data warehousing concepts and best practices will enable you to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions.

Collaboration and communication are essential aspects of this role, as you will work closely with stakeholders to define requirements and develop data-driven insights. Your ability to work both independently and as part of a team will be crucial in ensuring the successful delivery of projects. Additionally, your proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment.

Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Your strong oral, written, and interpersonal communication skills will enable you to effectively convey insights and solutions to stakeholders. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required for this role.

Overall, as a Data Engineer, you will play a vital role in developing and maintaining data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will contribute to ensuring data accuracy and integrity across all systems. Your analytical mindset and ability to perform root cause analysis will be key in identifying opportunities for improvement and driving data-driven decision-making within the organization.
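To make the stack above concrete, here is a minimal, hypothetical sketch of the kind of task the posting describes: running a parameterized SQL query against BigQuery from Python. The project, dataset, table, and column names are invented for illustration and are not part of the listing.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Parameterized query: BigQuery substitutes @start_date server-side.
query = """
    SELECT region, SUM(amount) AS total_sales
    FROM `my-project.sales.transactions`   -- hypothetical table
    WHERE sale_date >= @start_date
    GROUP BY region
    ORDER BY total_sales DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# result() blocks until the job finishes and yields Row objects.
for row in client.query(query, job_config=job_config).result():
    print(f"{row.region}: {row.total_sales}")
```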

Posted 3 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as ETL, BigQuery, SQL, Python, or Alteryx. Additionally, you should possess advanced knowledge of SQL programming and database management. Moreover, you must have a minimum of 3 years of solid experience working with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau, along with a good understanding of data warehousing concepts and best practices.

Excellent problem-solving and analytical skills are essential for this role, as is being detail-oriented with strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position.

Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, LookML, Data Studio, and GCP QlikSense. Strong SQL skills and proficiency in various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging enterprise data warehouses are also desired. Moreover, at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript is preferred.

Additionally, you should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations, along with development delivery experience. Furthermore, it is important to have a good grasp of BI tools, architectures, and visualization solutions, coupled with an inquisitive and proactive approach to learning new tools and techniques. Strong oral, written, and interpersonal communication skills are necessary, and you should be comfortable working in a dynamic environment where problems are not always well-defined.

Posted 4 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

Job Title: Data Modeller
Experience: 6+ Years
Location: Bangalore
Work Mode: Onsite

Job Role: We are seeking a skilled Data Modeller with expertise in designing data models for both OLTP and OLAP systems. The ideal candidate will have deep knowledge of data modelling principles and a strong understanding of database performance optimization, especially in near-real-time reporting environments. Prior experience with GCP databases and data modelling tools is essential.

Responsibilities:
- Design and implement data models (Conceptual, Logical, and Physical) for complex business requirements
- Develop scalable OLTP and OLAP models to support enterprise data needs
- Optimize database performance through effective indexing, partitioning, and data sharding techniques
- Work closely with development and analytics teams to ensure alignment of models with application and reporting needs
- Use data modelling tools like Erwin, DBSchema, or similar to create and maintain models
- Implement best practices for data quality, governance, and consistency across systems
- Leverage GCP database solutions such as AlloyDB, CloudSQL, and BigQuery
- Collaborate with business stakeholders, especially within the mutual fund domain (preferred), to understand data requirements

Requirements:
- 6+ years of hands-on experience in data modelling for OLTP and OLAP systems
- Strong command of data modelling fundamentals (Conceptual, Logical, Physical)
- In-depth knowledge of indexing, partitioning, and data sharding strategies
- Experience with real-time and near-real-time reporting systems
- Proficiency in data modelling tools, preferably DBSchema or Erwin
- Familiarity with GCP databases like AlloyDB, CloudSQL, and BigQuery
- Functional understanding of the mutual fund industry is a plus
- Must be willing to work from the Chennai office; office presence is mandatory

Technical Skills: Data Modelling (Conceptual, Logical, Physical), OLTP, OLAP, Indexing, Partitioning, Data Sharding, Database Performance Tuning, Real-Time/Near-Real-Time Reporting, DBSchema, Erwin, AlloyDB, CloudSQL, BigQuery.
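As context for the indexing, partitioning, and sharding requirements above, the following is a small illustrative sketch (not from the posting) of how partitioning and clustering are typically declared on a BigQuery table from Python. All project, dataset, and column names are assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# DDL executed as a query job: a date-partitioned, clustered table.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.fund_mart.nav_history` (
    fund_id   STRING,
    nav_date  DATE,
    nav_value NUMERIC
)
PARTITION BY nav_date   -- prunes scans to only the dates a query touches
CLUSTER BY fund_id      -- co-locates rows for selective per-fund lookups
"""
client.query(ddl).result()  # wait for the DDL job to complete
```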

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have proficiency in GCP, Data Modelling (OLTP, OLAP), indexing, DBSchema, CloudSQL, and BigQuery. As a Data Modeller, you will be responsible for hands-on data modelling for OLTP and OLAP systems. Your role will require an in-depth understanding of Conceptual, Logical, and Physical data modelling. It is essential to have a strong grasp of indexing, partitioning, and data sharding, supported by practical experience in these areas.

Moreover, you must possess a solid understanding of the variables that impact database performance for near-real-time reporting and application interaction. Experience with at least one data modelling tool, preferably DBSchema, is necessary. Individuals with functional knowledge of the mutual fund industry will be preferred for this role. Additionally, a good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery would be beneficial for your responsibilities.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeling Engineer specializing in near real-time reporting, you will be responsible for creating robust and optimized schemas to facilitate near real-time data flows for operational and analytical purposes within Google Cloud environments. Your primary focus will be on designing models that ensure agility, speed, and scalability to support high-throughput, low-latency data access needs.

Your key responsibilities will include designing data models that align with streaming pipelines, developing logical and physical models tailored for near real-time reporting, implementing strategies such as caching, indexing, and materialized views to enhance performance (see the sketch below), and ensuring data integrity, consistency, and schema quality during rapid changes.

To excel in this role, you must possess experience in building data models for real-time or near real-time reporting systems, hands-on expertise with GCP platforms such as BigQuery, CloudSQL, and AlloyDB, and a solid understanding of Pub/Sub, streaming ingestion frameworks, and event-driven design. Additionally, proficiency in indexing strategies and adapting schemas in high-velocity environments is crucial.

Preferred skills for this position include exposure to monitoring, alerting, and observability tools, as well as functional familiarity with financial reporting workflows. Moreover, soft skills like proactive adaptability in fast-paced data environments, effective verbal and written communication, and a collaborative, solution-focused mindset will be highly valued.

By joining our team, you will have the opportunity to design the foundational schema for mission-critical real-time systems, contribute to the performance and reliability of enterprise data workflows, and be part of a dynamic GCP-focused engineering team.

Skills required for this role include streaming ingestion frameworks, BigQuery, reporting, modeling, AlloyDB, Pub/Sub, CloudSQL, Google Cloud Platform (GCP), data management, real-time reporting, indexing strategies, and event-driven design.
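The sketch referenced above is one hypothetical way to apply the materialized-view strategy the posting mentions: pre-aggregating a streaming-ingested BigQuery table so near-real-time dashboards read a small aggregate instead of rescanning raw events. The schema and all names are invented for illustration.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# A materialized view that BigQuery refreshes incrementally, so dashboards
# query the compact aggregate rather than the raw event stream.
ddl = """
CREATE MATERIALIZED VIEW IF NOT EXISTS
    `my-project.reporting.orders_by_minute`
AS
SELECT
    TIMESTAMP_TRUNC(event_ts, MINUTE) AS minute,
    COUNT(*)                          AS orders,
    SUM(amount)                       AS revenue
FROM `my-project.reporting.order_events`   -- assumed streaming-ingested table
GROUP BY minute
"""
client.query(ddl).result()
```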

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller with 6-9 years of experience, you will be responsible for hands-on data modelling for both OLTP and OLAP systems. Your role will involve in-depth knowledge of Conceptual, Logical, and Physical data modelling, along with a strong understanding of indexing, partitioning, and data sharding; practical experience in these areas is essential.

You will need a strong understanding of the variables that impact database performance, specifically for near-real-time reporting and application interaction. Working experience with at least one data modelling tool is expected, with a preference for DBSchema. Additionally, functional knowledge of the mutual fund industry will be considered a plus. A good understanding of GCP databases such as AlloyDB, CloudSQL, and BigQuery is necessary for this role.

This position is full-time and requires you to work from the client's office in Chennai. Benefits include health insurance and the opportunity to work from home. The work schedule is during the day shift, with the work location being in person.

If you meet the above requirements and are looking to contribute your expertise in data modelling within a dynamic environment, we encourage you to apply for this position.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a Database Performance & Data Modeling Specialist with a primary focus on optimizing schema structures, tuning SQL queries, and ensuring that data models are well prepared for high-volume, real-time systems. Your responsibilities include designing data models that balance performance, flexibility, and scalability; conducting performance benchmarking to identify bottlenecks and propose improvements; analyzing slow queries to recommend indexing, denormalization, or schema revisions; monitoring query plans, memory usage, and caching strategies for cloud databases; and collaborating with developers and analysts to optimize application-to-database workflows.

You must possess strong experience in database performance tuning, especially on GCP platforms like BigQuery, CloudSQL, and AlloyDB. Proficiency in schema refactoring, partitioning, clustering, and sharding techniques is essential. Familiarity with profiling tools, slow query logs, and GCP monitoring solutions is required, along with SQL optimization skills including query rewriting and execution plan analysis.

Preferred skills include a background in mutual fund or high-frequency financial data modeling, and hands-on experience with relational databases like PostgreSQL and MySQL, distributed caching, materialized views, and hybrid model structures.

Soft skills that are crucial for this role include being precision-driven with an analytical mindset, a clear communicator with attention to detail, and possessing strong problem-solving and troubleshooting abilities. By joining this role, you will have the opportunity to shape high-performance data systems from the ground up, play a critical role in system scalability and responsiveness, and work with high-volume data in a cloud-native enterprise setting.
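For orientation, here is a minimal, hypothetical sketch of the execution-plan analysis this role centers on, run against a CloudSQL-for-PostgreSQL instance from Python. The connection details, table, and query are placeholders; a real setup would typically connect through the Cloud SQL Auth Proxy.

```python
import psycopg2  # standard PostgreSQL driver; CloudSQL speaks the same protocol

conn = psycopg2.connect(
    host="127.0.0.1", port=5432,   # assumed local Cloud SQL Auth Proxy endpoint
    dbname="appdb", user="analyst", password="..."  # placeholders
)

with conn, conn.cursor() as cur:
    # EXPLAIN ANALYZE executes the statement and reports actual row counts
    # and timings - the raw material for indexing or denormalization advice.
    cur.execute(
        "EXPLAIN ANALYZE SELECT * FROM trades WHERE account_id = %s",
        ("ACC-42",),
    )
    for (line,) in cur.fetchall():   # each plan line comes back as a 1-tuple
        print(line)
```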

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Modeler with 6 to 9 years of experience in GCP environments, you will play a crucial role in designing schema architecture, creating performance-efficient data models, and guiding teams on cloud-based data integration best practices. Your expertise will be focused on GCP data platforms such as BigQuery, CloudSQL, and AlloyDB.

Your responsibilities will include architecting and implementing scalable data models for cloud data warehouses and databases, optimizing OLTP/OLAP systems for reporting and analytics, supporting cloud data lake and warehouse architecture, and reviewing and optimizing existing schemas for cost and performance on GCP. You will also be responsible for defining documentation standards, ensuring model version tracking, and collaborating with DevOps and DataOps teams for deployment consistency.

Key Requirements:
- Deep knowledge of GCP data platforms including BigQuery, CloudSQL, and AlloyDB
- Expertise in data modeling, normalization, and dimensional modeling
- Understanding of distributed query engines, table partitioning, and clustering
- Familiarity with DBSchema or similar tools

Preferred Skills:
- Prior experience in BFSI or asset management industries
- Working experience with data catalogs, lineage, and governance tools

Soft Skills:
- Collaborative and consultative mindset
- Strong communication and requirements-gathering skills
- Organized and methodical approach to data architecture challenges

By joining our team, you will have the opportunity to contribute to modern data architecture in a cloud-first enterprise, influence critical decisions around GCP-based data infrastructure, and be part of a future-ready data strategy implementation team.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Data Modeler with expertise in using DBSchema within GCP environments. In this role, you will be responsible for creating and optimizing data models for both OLTP and OLAP systems, ensuring they are well designed for performance and maintainability. Your key responsibilities will include developing conceptual, logical, and physical models using DBSchema, aligning schema design with application requirements, and optimizing models in BigQuery, CloudSQL, and AlloyDB. Additionally, you will be involved in supporting schema documentation, reverse engineering, and visualization tasks.

Your must-have skills for this role include proficiency in the DBSchema modeling tool, strong experience with GCP databases such as BigQuery, CloudSQL, and AlloyDB, and knowledge of OLTP and OLAP system structures and performance tuning. Expertise in SQL and schema evolution/versioning best practices is essential. Preferred skills include experience integrating DBSchema with CI/CD pipelines and knowledge of real-time ingestion pipelines and federated schema design.

As a Data Modeler, you should be detail-oriented, organized, and communicative, and comfortable presenting schema designs to cross-functional teams. By joining this role, you will have the opportunity to work with industry-leading tools in modern GCP environments, enhance modeling workflows, and contribute to enterprise data architecture with visibility and impact.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a versatile Data Model Developer with 6 to 9 years of experience, proficient in designing robust data models across cloud (GCP) and traditional RDBMS environments. Your role involves collaborating with cross-functional teams to develop schemas that cater to both operational systems and analytical use cases.

Your key responsibilities include designing and implementing scalable data models for cloud (GCP) and traditional RDBMS, supporting hybrid data architectures integrating real-time and batch workflows, collaborating with engineering teams for seamless schema implementation, documenting conceptual, logical, and physical models, assisting in ETL and data pipeline alignment with schema definitions, and monitoring and refining performance through partitioning and indexing strategies.

You must have experience with GCP data services like BigQuery, CloudSQL, and AlloyDB, proficiency in relational databases such as PostgreSQL, MySQL, or Oracle, a solid grounding in OLTP/OLAP modeling principles, familiarity with schema design tools like DBSchema and ER/Studio, and SQL expertise for query performance optimization.

Preferred skills include experience working in hybrid cloud/on-prem data architectures, functional knowledge in BFSI or asset management domains, and knowledge of metadata management and schema versioning. Soft skills required for this role include adaptability to cloud and legacy tech stacks, clear communication with engineers and analysts, and strong documentation and collaboration skills.

Joining this role will allow you to contribute to dual-mode data architecture (cloud + on-prem), solve real-world data design challenges in regulated industries, and have the opportunity to influence platform migration and modernization.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architect specializing in OLTP & OLAP systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance.

To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience in GCP databases such as BigQuery, CloudSQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as will proficiency in data modeling tools like DBSchema, ERwin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required.

Experience in the BFSI or mutual fund domain, knowledge of near real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position.

By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Functional Data Modeler in the mutual fund industry, you will play a vital role in designing data models that accurately represent fund structures, NAV calculations, asset allocation, and compliance workflows. Your expertise in data modeling, combined with a deep understanding of the mutual fund and BFSI domains, will be instrumental in creating schemas that meet operational and regulatory requirements.

Your responsibilities will include collaborating with business analysts and product teams to translate functional requirements into effective data structures. It will be crucial for you to ensure that the data models you design comply with data privacy regulations, regulatory reporting standards, and audit requirements. Additionally, you will be responsible for building OLTP and OLAP data models to support real-time and aggregated reporting needs and documenting metadata, lineage, and data dictionaries for business use.

To excel in this role, you must have strong domain expertise in Mutual Fund/BFSI operations and a proven track record in data modeling for financial and regulatory systems. Proficiency in schema design on GCP platforms such as BigQuery and CloudSQL, as well as hands-on experience with modeling tools like DBSchema or ER/Studio, are essential skills for this position.

Preferred skills include experience working with fund management platforms or reconciliation engines and familiarity with financial compliance standards such as SEBI and FATCA. Soft skills like strong business acumen and effective documentation capabilities will also be valuable in liaising between functional and technical teams.

By joining our team, you will have the opportunity to own critical financial data architecture, influence domain-driven modeling for financial ecosystems, and be part of a fast-paced data transformation journey in the BFSI sector. If you are looking to make a significant impact in the field of data modeling within the mutual fund industry, this role is perfect for you.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are looking for a Data Modelling Consultant with 6 to 9 years of experience to work in our Chennai office. As a Data Modelling Consultant, your role will involve providing end-to-end modeling support for OLTP and OLAP systems hosted on Google Cloud. Your responsibilities will include designing and validating conceptual, logical, and physical models for cloud databases, translating requirements into efficient schema designs, and supporting data model reviews, tuning, and implementation. You will also guide teams on best practices for schema evolution, indexing, and governance to enable usage of models in real-time applications and analytics platforms.

To succeed in this role, you must have strong experience in modeling across OLTP and OLAP systems, hands-on experience with GCP tools like BigQuery, CloudSQL, and AlloyDB, and the ability to understand business rules and translate them into scalable structures. Additionally, familiarity with partitioning, sharding, materialized views, and query optimization is essential.

Preferred skills for this role include experience with BFSI or financial-domain data schemas and familiarity with modeling methodologies and standards such as 3NF and star schema. Soft skills like excellent stakeholder communication, collaboration, strategic thinking, and attention to scalability are also important.

Joining this role will allow you to deliver advisory value across critical data initiatives, influence the modeling direction for a data-driven organization, and be at the forefront of GCP-based enterprise data transformation.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will play a crucial role in leading the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs, implementing data models, security controls, and access policies across GCP platforms, leading cloud database selection, schema design, and tuning for analytical and transactional workloads, collaborating with DevOps and DataOps teams to deploy and manage data environments, ensuring best practices for data governance, cataloging, and versioning, and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience in implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills for this role include experience with AlloyDB, Firebase, or Spanner, familiarity with LookML, dbt, or DAG-based orchestration tools, and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation skills, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams.

Skills required for this role include IAM policies, Spanner, cloud, schema design, data architecture, GCP data ecosystem, dbt, GCP cost optimization, data, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.
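As a small illustration of one security control listed above (data encryption with customer-managed keys), the following hypothetical sketch creates a BigQuery dataset whose new tables default to a Cloud KMS key. The project, dataset, location, and key names are assumptions, not details from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

dataset = bigquery.Dataset("my-project.regulated_data")  # hypothetical dataset
dataset.location = "asia-south1"

# New tables in this dataset default to a customer-managed Cloud KMS key.
dataset.default_encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=(
        "projects/my-project/locations/asia-south1/"
        "keyRings/bq-ring/cryptoKeys/bq-key"   # hypothetical CMEK key
    )
)
client.create_dataset(dataset, exists_ok=True)
```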

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Modeller specializing in GCP and cloud databases, you will play a crucial role in designing and optimizing data models for both OLTP and OLAP systems. Your expertise in cloud-based databases, data architecture, and modeling will be essential in collaborating with engineering and analytics teams to ensure efficient operational systems and real-time reporting pipelines.

You will be responsible for designing conceptual, logical, and physical data models tailored for OLTP and OLAP systems. Your focus will be on developing and refining models that support performance-optimized cloud data pipelines, implementing models in BigQuery, CloudSQL, and AlloyDB, and designing schemas with indexing, partitioning, and data sharding strategies. Translating business requirements into scalable data architecture and schemas will be a key aspect of your role, along with optimizing for near real-time ingestion, transformation, and query performance. You will utilize tools like DBSchema for collaborative modeling and documentation while creating and maintaining metadata and documentation around models.

In terms of required skills, hands-on experience with GCP databases (BigQuery, CloudSQL, AlloyDB), a strong understanding of OLTP and OLAP systems, and proficiency in database performance tuning are essential. Additionally, familiarity with modeling tools such as DBSchema or ERwin, as well as proficiency in SQL, schema definition, and normalization/denormalization techniques, will be beneficial.

Preferred skills include functional knowledge of the mutual fund or BFSI domain, experience integrating with cloud-native ETL and data orchestration pipelines, and familiarity with schema version control and CI/CD in a data context. In addition to technical skills, soft skills such as strong analytical and communication abilities, attention to detail, and a collaborative approach across engineering, product, and analytics teams are highly valued.

Joining this role will provide you with the opportunity to work on enterprise-scale cloud data architectures, drive performance-oriented data modeling for advanced analytics, and collaborate with high-performing cloud-native data teams.

Posted 1 month ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office

Role Description

An Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT Platform/Infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank

What we'll offer you

As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities
- Designing and implementing GCP infrastructure using IaC
- Automating recurring processes
- Working closely with development teams and implementing CI/CD pipelines for building, testing, and deploying applications
- Containerizing applications and orchestrating containers
- Designing and implementing application environments to ease development, testing, and release processes
- Monitoring the infrastructure and applications for improvements
- Maintaining and upgrading current processes
- Cost-cutting analysis

Your skills and experience
- Experience working with Google Cloud Platform
- Experience in containerization and orchestration (Docker, GKE, Artifact Registry, Cloud Run, CloudSQL)
- Experience with IaC (Terraform)
- Experience in writing CI/CD for applications and infrastructure (GitHub workflows, Jenkins, etc.)
- Experience using monitoring tools (Cloud Monitoring)
- Knowledge of at least one scripting language
- Basic DevSecOps skills
- Experience in user and permissions management (IAM)
- 3-5 years of experience as a DevOps Engineer
- Certification in GCP preferred

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 2 months ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Pune

Work from Office

We are hiring a DevOps / Site Reliability Engineer for a 6-month full-time onsite role in Pune (with possible extension). The ideal candidate will have 6-9 years of experience in DevOps/SRE roles with deep expertise in Kubernetes (preferably GKE), Terraform, Helm, and GitOps tools like ArgoCD or Flux. The role involves building and managing cloud-native infrastructure, CI/CD pipelines, and observability systems, while ensuring performance, scalability, and resilience. Experience in infrastructure coding, backend optimization (Node.js, Django, Java, Go), and cloud architecture (IAM, VPC, CloudSQL, Secrets) is essential. Strong communication and hands-on technical ability are musts. Immediate joiners only.

Posted 2 months ago

Apply

6.0 - 9.0 years

18 - 20 Lacs

Pune

Work from Office

Timings: Full Time (as per company timings)
Notice Period: Immediate joiners only
Duration: 6 Months (possible extension)
Shift Timing: 11:30 AM - 9:30 PM IST

About the Role
We are looking for a highly skilled and experienced DevOps / Site Reliability Engineer to join on a contract basis. The ideal candidate will be hands-on with Kubernetes (preferably GKE), Infrastructure as Code (Terraform/Helm), and cloud-based deployment pipelines. This role demands deep system understanding, proactive monitoring, and infrastructure optimization skills.

Key Responsibilities:
- Design and implement resilient deployment strategies (Blue-Green, Canary, GitOps).
- Configure and maintain observability tools (logs, metrics, traces, alerts).
- Optimize backend service performance through code and infra reviews (Node.js, Django, Go, Java).
- Tune and troubleshoot GKE workloads, HPA configs, ingress setups, and node pools.
- Build and manage Terraform modules for infrastructure (VPC, CloudSQL, Pub/Sub, Secrets).
- Lead or participate in incident response and root cause analysis using logs, traces, and dashboards.
- Reduce configuration drift and standardize secrets, tagging, and infra consistency across environments.
- Collaborate with engineering teams to enhance CI/CD pipelines and rollout practices.

Required Skills & Experience:
- 5-10 years in DevOps, SRE, Platform, or Backend Infrastructure roles.
- Strong coding/scripting skills and ability to review production-grade backend code.
- Hands-on experience with Kubernetes in production, preferably on GKE.
- Proficient in Terraform, Helm, GitHub Actions, and GitOps tools (ArgoCD or Flux).
- Deep knowledge of cloud architecture (IAM, VPCs, Workload Identity, CloudSQL, Secret Management).
- Systems thinking: understands failure domains, cascading issues, timeout limits, and recovery strategies.
- Strong communication and documentation skills; capable of driving improvements through PRs and design reviews.

Tech Stack & Tools
- Cloud & Orchestration: GKE, Kubernetes
- IaC & CI/CD: Terraform, Helm, GitHub Actions, ArgoCD/Flux
- Monitoring & Alerting: Datadog, PagerDuty
- Databases & Networking: CloudSQL, Cloudflare
- Security & Access Control: Secret Management, IAM

Driving Results:
- A good individual contributor and a good team player.
- Flexible attitude towards work, as per the needs.
- Proactively identify and communicate issues and risks.

Other Personal Characteristics:
- Dynamic, engaging, self-reliant developer.
- Ability to deal with ambiguity.
- Maintains a collaborative and analytical approach.
- Self-confident and humble.
- Open to continuous learning.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 2 months ago

Apply

6.0 - 9.0 years

18 - 20 Lacs

Pune

Work from Office

Notice Period: Immediate joiners only
Duration: 6 Months (possible extension)
Shift Timing: 11:30 AM - 9:30 PM IST

About the Role
We are looking for a highly skilled and experienced DevOps / Site Reliability Engineer to join on a contract basis. The ideal candidate will be hands-on with Kubernetes (preferably GKE), Infrastructure as Code (Terraform/Helm), and cloud-based deployment pipelines. This role demands deep system understanding, proactive monitoring, and infrastructure optimization skills.

Key Responsibilities:
- Design and implement resilient deployment strategies (Blue-Green, Canary, GitOps).
- Configure and maintain observability tools (logs, metrics, traces, alerts).
- Optimize backend service performance through code and infra reviews (Node.js, Django, Go, Java).
- Tune and troubleshoot GKE workloads, HPA configs, ingress setups, and node pools.
- Build and manage Terraform modules for infrastructure (VPC, CloudSQL, Pub/Sub, Secrets).
- Lead or participate in incident response and root cause analysis using logs, traces, and dashboards.
- Reduce configuration drift and standardize secrets, tagging, and infra consistency across environments.
- Collaborate with engineering teams to enhance CI/CD pipelines and rollout practices.

Required Skills & Experience:
- 5-10 years in DevOps, SRE, Platform, or Backend Infrastructure roles.
- Strong coding/scripting skills and ability to review production-grade backend code.
- Hands-on experience with Kubernetes in production, preferably on GKE.
- Proficient in Terraform, Helm, GitHub Actions, and GitOps tools (ArgoCD or Flux).
- Deep knowledge of cloud architecture (IAM, VPCs, Workload Identity, CloudSQL, Secret Management).
- Systems thinking: understands failure domains, cascading issues, timeout limits, and recovery strategies.
- Strong communication and documentation skills; capable of driving improvements through PRs and design reviews.

Tech Stack & Tools
- Cloud & Orchestration: GKE, Kubernetes
- IaC & CI/CD: Terraform, Helm, GitHub Actions, ArgoCD/Flux
- Monitoring & Alerting: Datadog, PagerDuty
- Databases & Networking: CloudSQL, Cloudflare
- Security & Access Control: Secret Management, IAM

Driving Results:
- A good individual contributor and a good team player.
- Flexible attitude towards work, as per the needs.
- Proactively identify and communicate issues and risks.

Other Personal Characteristics:
- Dynamic, engaging, self-reliant developer.
- Ability to deal with ambiguity.
- Maintains a collaborative and analytical approach.
- Self-confident and humble.
- Open to continuous learning.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 2 months ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09

The Team: Automotive Mastermind was founded on the idea that there are patterns in people's behavior that, with the right logic, can be used to predict future outcomes. Our software helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Our culture is creative and entrepreneurial, where everyone contributes to company goals in a very real way. We are a hardworking group, but we have a lot of fun with what we do and are looking for new people with a similar mindset to join the organization.

The Impact: As a Quality Engineer you will collaborate with members of both the Product and Development Teams to help them make informed decisions on releases of one of the best tools there is for car dealerships in the United States.

What's in it for you:
- The possibility to work on a project in a very interesting domain - the automotive industry in the United States - and influence the quality of one of the best tools there is for car dealerships.
- The chance to affect the processes and tools used for Quality Engineering. Our team has a high degree of autonomy in the Automotive Mastermind organization to decide what tools and processes we will use.

Responsibilities:
- Own and be responsible for testing and delivery of the product or core modules.
- Assess the quality, usability, and functionality of each release.
- Review software requirements and prepare test scenarios for complex business rules.
- Interact with stakeholders to understand detailed requirements and expectations.
- Build technical knowledge and aim to be a quality SME in core functional components.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Design and implement a test automation strategy supporting multiple product development teams.
- Lead efforts for related automation projects, design, and code reviews.
- Produce regular reports on the status and quality of software releases and be prepared to speak to findings in an informative way to all levels of audiences.

What We're Looking For:
- Participate in and improve the whole lifecycle of services - from inception and design, through deployment, operation, and refinement.
- Participate in the release planning process to review functional specifications and create release plans.
- Collaborate with software engineers to design verification test plans.
- Design regression test suites and review them with engineering, applications, and the field organization.
- Produce regular reports on the status and quality of software releases and be prepared to speak to findings in an informative way to all levels of audience.
- Assess the quality, usability, and functionality of each release.
- Develop and organize QA processes for assigned projects to align with overall QA goals.
- Lead and train a dynamically changing team of colleagues who participate in testing processes.
- Exhibit expertise in handling large-scale programs/projects that involve multiple stakeholders (Product, Dev, DevOps).
- Maintain a leading-edge understanding of QA best practices as related to interactive technologies.
- Design and implement the test automation strategy for multiple product development teams at the onset of the project.
- Lead efforts for related automation projects, design, and code reviews.
- Work closely with leadership and IT to provide input into the design and implementation of the automation framework.
- Work with Architecture, Engineering, Quality Engineering, IT, and Product Operations leaders to create and implement processes that accelerate the delivery of new features and products with high quality and at scale.
- Develop and contribute to a culture of high performance, transparency, and continuous improvement as it relates to infrastructure services and streamlining of the development pipeline.
- Participate in a diverse team of talented engineers globally, providing guidance, support, and clear priorities.

Who you are:
- Total experience: 2 to 6 years.
- Hands-on experience with at least 2 or more leading testing tools/frameworks like Playwright, Robot Framework, K6, or JMeter.
- Hands-on experience working with Python.
- Experience with SQL/NoSQL databases.
- Experience working on cloud-native applications.
- Hands-on experience with Google Cloud services like Kubernetes, Composer, Dataplex, Pub/Sub, BigQuery, AlloyDB, CloudSQL, Looker Studio, etc.
- Strong analytical skills and ability to solve complex technical problems.
- API testing: must have an understanding of RESTful design / best practices, plus hands-on experience testing APIs and test tools.
- Experience with load / stress / performance testing and tools.
- Experience with Azure DevOps (or other similar issue/bug tracking systems) is required.
- Ability to think abstractly, because conforming to the norm does not find bugs quickly.
- Experience working in an Agile software development organization.
- Experience supporting development and product teams.
- Excellent verbal, written, and interpersonal communication skills; ability to interact with all levels of an organization.
- Ability to work in an advisory capacity to identify key technical and business problems, and to develop and evaluate solutions.

Grade: 08 / 09
Job Location: Gurugram
Hybrid Mode: twice a week work from office.
Shift Time: 12 pm to 9 pm IST.
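As a flavor of the RESTful API testing this role calls for, here is a minimal, hypothetical pytest sketch using the requests library. The endpoint, route, and response fields are invented for illustration; the posting does not describe the product's actual API.

```python
import requests

BASE_URL = "https://api.example.com"   # hypothetical service under test


def test_get_customer_returns_expected_shape():
    resp = requests.get(f"{BASE_URL}/v1/customers/123", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Contract checks: required fields and types, not exact values.
    assert isinstance(body["id"], str)
    assert "readyToBuyScore" in body


def test_unknown_customer_returns_404():
    resp = requests.get(f"{BASE_URL}/v1/customers/does-not-exist", timeout=10)
    assert resp.status_code == 404
```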

Posted 2 months ago

Apply

12.0 - 15.0 years

40 - 45 Lacs

Chennai

Work from Office

Skill & Experience
- Strategic planning and direction; maintain architecture principles, guidelines, and standards
- Project & program management
- Data Warehousing, Big Data, Data Analytics & Data Science for solutioning
- Expert in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, CloudSQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.
- Strong experience in Big Data: data modelling, design, architecting, and solutioning
- Understands programming languages like SQL, Python, R, and Scala; good Python skills
- Experience with data visualisation tools such as Google Data Studio or Power BI
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing
- Strong migration experience of production Hadoop clusters to Google Cloud
- Experience in designing and implementing solutions in the mentioned areas: strong Google Cloud Platform data components (BigQuery, Bigtable, CloudSQL, Dataproc, Dataflow, Data Fusion, etc.)

Posted 2 months ago

Apply