8.0 - 13.0 years
15 - 19 Lacs
Pune
Work from Office
Role: GCP Architect
Experience: 8+ years
Location: Pune
Work Mode: 5 days working from office; PF is mandatory

Detailed JD: GCP Architect (API Architecture and GCP), 8 years of experience:
- Proven experience designing and deploying applications on Google Cloud Platform (GCP)
- Familiarity with GCP services such as GKE, Cloud Run, Cloud Functions, Pub/Sub, and BigQuery
- Demonstrated ability to lead technical discussions, make architectural decisions, and mentor teams
- Experience in Private Banking is relevant to the role
- Experience architecting application solutions involving medium to large application integrations
- Significant experience with architecture ways of working, for example reviews and Design Authority processes
- Ability to thrive in a fast-paced, collaborative environment
- Excellent written and verbal communication skills, with the ability to adapt communication appropriately to the audience
- GCP certifications
Posted 1 hour ago
3.0 - 8.0 years
1 - 2 Lacs
Bengaluru
Work from Office
Overview: TekWissen is a global workforce management provider that offers strategic talent solutions to our clients throughout India and worldwide. Our client is a company operating a marketplace for consumers, sellers, and content creators. It offers merchandise and content purchased for resale from vendors as well as items offered by third-party sellers.

Job Title: Mobile Application Developer
Location: Bengaluru
Job Type: Contract
Work Type: Onsite
Notice Period: Immediate to 15 days

Job Description:
Experience: 2-4 years overall; immediate joiners only
Mandatory Skills: Swift (iOS), Kotlin and Java (Android), LLD, HLD

Required Skills:
- Experience in iOS development using Swift and Xcode.
- Experience in Android development using Kotlin/Java and Android Studio.
- Experience with mobile app architecture patterns such as MVVM or MVC.
- Experience working with REST APIs, JSON, and third-party libraries.
- Strong problem-solving skills and ability to write clean, reusable code.
- Good understanding of the mobile app lifecycle, UI components, and navigation patterns.

Preferred Skills (Good to Have):
- Exposure to Firebase, Push Notifications, or Cloud Functions.
- Experience with SwiftUI, Jetpack components, or modern Android libraries.
- Experience with version control systems like Git.
- Understanding of the Agile development process.

TekWissen Group is an equal opportunity employer supporting workforce diversity.
Posted 23 hours ago
10.0 - 12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Overview: We are looking for an AI Cloud Architect and Developer with 10+ years of experience who can design, develop, and deploy cloud-based solutions for our clients. You will be responsible for creating scalable, secure, and cost-effective cloud architectures that meet the business and technical requirements of the projects. You will also be involved in coding, testing, debugging, and troubleshooting cloud applications using various cloud services and tools. You will work closely with other developers, architects, and project managers to deliver high-quality cloud solutions that meet the client's expectations and deadlines.

Responsibilities:
- Design and develop cloud architectures and applications using AWS, Azure, Google Cloud, or other cloud platforms.
- Implement best practices and standards for cloud development, security, performance, and reliability.
- Integrate cloud solutions with existing systems and applications using APIs, microservices, and other methods.
- Optimize cloud resources and costs using automation, monitoring, and analytics tools.
- Document and communicate cloud architectures and applications using diagrams, reports, and presentations.
- Research and evaluate new cloud technologies and trends to improve existing cloud solutions and create new opportunities.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- At least 5-10 years of experience in cloud architecture and development using AWS, Azure, Google Cloud, or other cloud platforms.
- Experience in AI/ML services offered by GCP, Azure, and AWS.
- Proficiency in one or more programming languages such as Python, Java, C#, Node.js, etc.
- Knowledge of cloud services and tools such as EC2, S3, Lambda, CloudFormation, Azure Functions, App Service, Storage, Google Compute Engine, Cloud Storage, Cloud Functions, etc.
- Experience in cloud security, performance, and reliability concepts and techniques.
- Experience in cloud integration, migration, and deployment using DevOps tools and methodologies.
- Strong understanding of cloud security, networking, storage, and database services.
- Proficiency in cloud architecture frameworks and best practices.
- Strong knowledge of containerization (Docker, Kubernetes) and serverless computing.
- Expertise in automation tools like Terraform, Ansible, CloudFormation, etc.
- Familiarity with DevOps practices and tools such as Jenkins, GitLab CI, or Azure DevOps.
- Excellent communication, collaboration, and problem-solving skills.
- Certification in AWS, Azure, Google Cloud, or another cloud platform is a plus.
- Experience in designing disaster recovery and business continuity plans for cloud environments.
Posted 1 day ago
5.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Who We Are
Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products, all of which enables more people to protect their financial futures. Our success is driven by a commitment to three core values - be bold, team up, deliver value - and that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders.

Who You Are
Our data team serves Zinnia through data engineering, data analysis, and data science. Our goal is to help uncover opportunities and make decisions with data. We partner with stakeholders in every department across the company to develop deeper predictors of behavior, develop insights that drive business strategy, and build solutions to optimize our internal and external experiences.

What You'll Do
- Oversee technological choices and the implementation of data pipelines and warehousing philosophy.
- Execute and serve as lead and/or SME on cross-organizational and cross-divisional projects automating our data value chain processes.
- Promote technical best practices throughout the data organization.
- Design data architecture that is simple and maintainable while enabling Data Analysts, Data Scientists, and stakeholders to work with data efficiently.
- Mentor data team members in architecture and coding techniques.
- Serve as a source of knowledge for the Data Engineering team on process improvement, automation, and new technologies that enable best-in-class timeliness and data coverage.
- Design data pipelines utilizing ETL tools, event-driven software, and other streaming software.
- Partner with both data scientists and engineers to bring our concepts to reality. This requires learning to speak the language of statisticians as well as software engineers.
- Ensure reliability in data pipelines and enforce data governance, security, and protection of our customers' information while balancing tech debt.
- Demonstrate innovation, customer focus, and experimentation mindsets.
- Partner with product and engineering teams to design data models for downstream data maximization.
- Evaluate and champion new engineering tools that help us move faster and scale our team.

What You'll Need
- A technical Bachelor's/Master's degree with 5+ years of experience across data engineering (data pipelining, warehousing, ETL tools, etc.).
- Extensive experience with data engineering techniques, Python, and SQL.
- Familiarity and working knowledge of Airflow and dbt.
- Comfort and expertise with data engineering tooling such as Jira, Git, Buildkite, Terraform, Airflow, dbt, and containers, as well as the GCP suite, Kubernetes, and Cloud Functions.
- An understanding of standard ETL patterns, modern data warehousing ideas such as data mesh or data vaulting, and data quality practices around test-driven design and data observability.
- You enjoy being a high-level architect sometimes, and a low-level coder sometimes.
- You are passionate about all things data: big data, small data, moving and transforming it, its quality, its accessibility, and delivering value from it to internal and external clients.
- You want ownership to solve for and lead a team to deliver modern and efficient data pipeline components.
- You are passionate about a culture of learning and teaching; you love challenging yourself to constantly improve and sharing your knowledge to empower others.
- You like to take risks when looking for novel solutions to complex problems. If faced with roadblocks, you continue to reach higher to make greatness happen.

Technologies you will use:
- Python for data pipelining and automation.
- Airbyte for ETL purposes.
- Google Cloud Platform, Terraform, Kubernetes, Cloud SQL, Cloud Functions, BigQuery, Datastore, and more: we keep adopting new tools as we grow!
- Airflow and dbt for data pipelining.
- Tableau and Power BI for data visualization and consumer-facing dashboards.

WHAT'S IN IT FOR YOU
At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a GCP Data Engineer-Technical Lead at Birlasoft Office in Bengaluru, India, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms on Google Cloud Platform (GCP) to support business intelligence, analytics, and machine learning initiatives. With a primary focus on Python and GCP technologies such as BigQuery, Dataproc, and Dataflow, you will develop ETL and ELT pipelines while ensuring optimal data manipulation and performance tuning. Your role will involve leveraging data manipulation libraries like Pandas, NumPy, and PySpark, along with SQL expertise for efficient data processing in BigQuery. Additionally, your experience with tools such as Dataflow, Cloud Run, GKE, and Cloud Functions will be crucial in this position. A strong foundation in data modeling, schema design, data governance, and containerization (Docker) for data workloads will further enhance your contributions to our data team. With 5-8 years of experience in Data Engineering and Software Development, including a minimum of 3-4 years working directly with Google Cloud Platform, you will play a key role in driving our data initiatives forward.
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Chandigarh
On-site
You will join ProcureTech, a company that digitally revolutionizes lodging procurement by connecting corporations and suppliers in a cutting-edge ecosystem. This innovative approach ensures seamless efficiency and automation, exceeding the expectations of travelers. TravelTech, another division of the company, redefines the online lodging experience by providing personalized content from selection to check-in, guaranteeing an exceptional journey for corporate travelers. By combining these technology propositions, HRS achieves exponential catalyst effects, delivering value-added services and high-return network effects, ultimately creating substantial customer value. HRS, a company that has experienced exponential growth since 1972, serves over 35% of the global Fortune 500 companies and leading hotel chains. Joining HRS means shaping the future of business travel within a culture of growth and setting new industry standards worldwide. The HRS Business Travel Club (BTC) is the Business Unit responsible for managing and developing the HRS E-Commerce platforms of well-known hotel portal businesses such as HRS.de and HOTEL.DE. BTC aims to provide exclusive services to its members, individual business travelers, and SMEs, enhancing customer loyalty and enabling brand repositioning in the marketplace. Your mission within BTC is to further develop and shape "The HRS Business Travel Club" and its solutions to drive innovation within the HRS Group. As a Chandigarh/Mohali based GCP Data Engineer, your role involves building a next-generation Customer Data Platform that enables customers to find their perfect hotel for every trip while helping hotel partners grow their business. You will work on understanding, implementing, and documenting requirements for the Customer Data Platform (CDP) within HRS Group, collaborating with various teams to ensure its continued development to serve the strategic vision of the group. To excel in this role, you should have a degree in business computer science, mathematics, or a related field, along with a minimum of 3 years of hands-on experience in data engineering or data science. Proficiency in Python, SQL, and the GCP toolset is essential, as is knowledge of cloud technologies such as AWS, GCP, Azure, etc. Additionally, you should possess project management skills, commercial awareness, and ideally, experience with analytical CRM, CDPs, Marketing Automation, or integrated Sales & Marketing setups. Fluency in English is a requirement for this position. In return for your contributions, you will receive an attractive remuneration package that is competitive in the market. This package includes a fixed monthly salary, necessary work equipment, mobility options, and an annual or multi-year bonus. Join us at HRS to drive innovation in the business travel sector and be a part of a dynamic team shaping the future of corporate travel.,
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
We are looking for a GCP Cloud Engineer for a position based in Pune. In this role, you will be responsible for designing, implementing, and optimizing data solutions on Google Cloud Platform. Your expertise in GCP services, solution design, and programming skills will be crucial for developing scalable and efficient cloud solutions. Your key responsibilities will include designing and implementing GCP-based data solutions following best practices, developing workflows and pipelines using Cloud Composer and Apache Airflow, building and managing data processing clusters using Dataproc, working with GCP services like Cloud Functions, Cloud Run, and Cloud Storage, and integrating multiple data sources through ETL/ELT workflows. You will be expected to write clean, efficient, and scalable code in languages such as Python, Java, or similar, apply logical problem-solving skills to address business challenges, and collaborate with stakeholders to design end-to-end GCP solution architectures. To be successful in this role, you should have hands-on experience with Dataproc, Cloud Composer, Cloud Functions, and Cloud Run, strong programming skills in Python, Java, or similar languages, a good understanding of GCP architecture, and experience in setting task dependencies in Airflow DAGs. Logical and analytical thinking, strong communication, and documentation skills are also essential for cross-functional collaboration. Preferred qualifications include GCP Professional Data Engineer or Architect Certification, experience in data lake and data warehouse solutions on GCP (e.g., BigQuery, Dataflow), and familiarity with CI/CD pipelines for GCP-based deployments.
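The posting calls out hands-on experience with Cloud Composer and setting task dependencies in Airflow DAGs. As a rough, hypothetical sketch of that concept (the DAG id, task names, and the placeholder query below are assumptions, not details from the role):

```python
# Minimal Cloud Composer / Airflow DAG sketch showing explicit task dependencies.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_gcp_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = EmptyOperator(task_id="extract_from_source")  # stand-in for an ingestion task

    load = BigQueryInsertJobOperator(
        task_id="load_to_bigquery",
        configuration={
            "query": {
                "query": "SELECT 1",    # placeholder transformation query
                "useLegacySql": False,
            }
        },
    )

    notify = EmptyOperator(task_id="notify_done")

    # Dependencies: extract must finish before load, and load before notify.
    extract >> load >> notify
```

The `>>` operator is the standard Airflow way to declare that one task must complete before the next one starts.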
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
A career at HARMAN Automotive offers you the opportunity to be part of a global, multi-disciplinary team dedicated to leveraging the power of technology to shape the future. At HARMAN Automotive, you will have the chance to accelerate your career growth by engineering cutting-edge audio systems and integrated technology platforms that enhance the driving experience. By combining innovation, thorough research, and a collaborative spirit with design and engineering excellence, you will contribute to advancing in-vehicle infotainment, safety, efficiency, and enjoyment.

We are currently seeking a skilled and proactive DevOps Engineer / Cloud Platform Engineer with 3 to 6 years of experience to join our team. This role requires hands-on expertise in containerization, orchestration, cloud deployment, and automation. You will play a crucial role in developing scalable, secure, and efficient cloud-native solutions using tools such as Docker, Kubernetes, and Google Cloud Platform (GCP), alongside CI/CD pipelines and Python scripting.

**Responsibilities:**
- Design, implement, and manage containerized applications using Docker and Kubernetes.
- Develop and maintain CI/CD pipelines utilizing tools like Jenkins, GitLab CI, or GitHub Actions.
- Deploy and monitor applications on GCP through services like GKE, Cloud Functions, and Cloud Build.
- Automate infrastructure provisioning and configuration with tools like Terraform.
- Write Python scripts for automation, monitoring, and data processing tasks.
- Collaborate closely with development and QA teams to ensure seamless integration and deployment.
- Monitor system performance, troubleshoot issues, and optimize resource utilization.

**Requirements:**
- 3-6 years of experience in DevOps, Cloud Engineering, or related fields.
- Strong hands-on experience with Docker and Kubernetes in production environments.
- Proficiency in GCP services including GKE, IAM, Cloud Storage, and Pub/Sub.
- Experience in building and managing CI/CD pipelines.
- Solid Python programming skills for scripting and automation.
- Familiarity with Git, Linux, and networking fundamentals.
- Understanding of security best practices in cloud and containerized environments.

**Preferred Qualifications:**
- Experience with Terraform, Helm, or Ansible.
- Exposure to monitoring tools like Prometheus, Grafana, or the ELK stack.
- Knowledge of other cloud platforms (AWS, Azure).
- Certifications in GCP or Kubernetes (e.g., CKA, GCP Associate Engineer).
- Experience with microservices architecture and service mesh (e.g., Istio).

**Qualifications for Consideration:**
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong problem-solving skills and a proactive mindset.
- Excellent communication and collaboration abilities.
- Willingness to learn and adapt in a fast-paced environment.

We offer a competitive salary and benefits package, opportunities for professional growth and development, a collaborative and dynamic work environment, access to cutting-edge technologies and tools, recognition and rewards for outstanding performance through BeBrilliant, and the chance to work with a renowned German OEM. Please note that this role requires you to work on-site all five days of the week.

At HARMAN, we are dedicated to creating an inclusive environment where every employee feels welcomed, valued, and empowered. We encourage you to share your ideas, voice your unique perspective, and bring your authentic self to work within a supportive culture that celebrates individuality. We are committed to supporting your ongoing learning journey and offer additional opportunities for training, development, and continuing education to help you thrive in your career. HARMAN has been a pioneer in next-level technology innovation since the 1920s, with a focus on amplifying the sense of sound. Our integrated technology platforms drive smarter, safer, and more connected experiences across automotive, lifestyle, and digital transformation solutions. Marketing our award-winning portfolio under 16 iconic brands like JBL, Mark Levinson, and Revel, we set the bar high by exceeding the highest engineering and design standards for our customers, partners, and employees alike. If you are ready to make a lasting impact through innovation and meaningful work, we invite you to join our talent community at HARMAN Automotive.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
As a GCP Data Engineer at our organization, you will be a key member of our growing data team. We are looking for a highly skilled and experienced individual who is passionate about data and has a strong track record of designing, building, and maintaining scalable data solutions on Google Cloud Platform (GCP). Your role will involve transforming raw data into actionable insights, enabling data-driven decision-making throughout the organization. Your responsibilities will include designing, developing, implementing, and maintaining ETL/ELT data pipelines using various GCP services and programming languages. You will leverage Google BigQuery as a primary data warehouse, design optimal schemas, write efficient SQL queries, integrate data from diverse sources, and build, manage, and optimize ETL/ELT processes. Furthermore, you will design efficient data models in BigQuery, automate data workflows, ensure data quality and governance, optimize performance, collaborate with various teams, and ensure data security and compliance with regulations. To be successful in this role, you should have 5-7 years of experience in data engineering with a focus on GCP. You must possess hands-on expertise with GCP services such as BigQuery, Dataflow, Cloud Storage, Cloud Composer, Cloud Functions, and Pub/Sub. Strong SQL skills, understanding of ETL/ELT concepts, data modeling experience, and familiarity with version control systems are essential. Problem-solving skills, excellent communication abilities, and a collaborative mindset are also required. Preferred qualifications include a GCP Professional Data Engineer certification, experience with other cloud platforms, knowledge of Linux, familiarity with CI/CD pipelines and DevOps practices, proficiency in data visualization tools, and experience with data quality frameworks and observability tools. This role presents an exciting opportunity to work on cutting-edge data solutions in a dynamic and innovative environment. If you are a dedicated and skilled GCP Data Engineer seeking to make a significant impact, we invite you to share your resume with us at navneet@sourcebae.com.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a skilled Mobile App Developer, you will be responsible for designing and building sophisticated, highly scalable apps using Flutter. Your primary focus will be on creating custom Flutter packages that leverage the functionalities and APIs available in native Android and iOS platforms. You will be expected to translate designs and wireframes into high-quality, responsive UI code, ensuring the efficient querying of core data and the proper management of states. Utilizing your expertise in Flutter, you will explore different architectures for implementing new features and resolve any existing system issues while suggesting and integrating new functionalities. Your role will also involve suggesting space- and time-efficient data structures, following best practices for app development, and documenting projects and code thoroughly. Collaboration and communication are essential aspects of this position, as you will be required to work closely with project managers, team members, and other stakeholders. You will need to manage code and projects effectively on Git, ensuring synchronization with the rest of the team. Additionally, you will be responsible for using CI/CD for smooth deployment, validating the cloud system's security, and following security guidelines throughout the development process. Your role will also involve interacting with customer-facing representatives, clinical specialists, and product managers to gather feedback and insights for continuous improvement. You will work closely with Quality Assurance Specialists to deliver a stable app, resolve reported bugs promptly, and conduct app verification protocols. Overall, you will be expected to maintain software through its product lifecycle, from design and development to verification and bug fixes, while adhering to company policies and quality procedures to ensure the delivery of high-quality products. Your commitment to writing tests for the app, performing time profiling and memory leak assessment, and suggesting new features and enhancements will be crucial to the success of the projects you work on.
Posted 1 week ago
16.0 - 22.0 years
16 - 22 Lacs
Hyderabad, Telangana, India
On-site
Job Description:

Key Responsibilities
- Solution design, client engagement, and delivery oversight.
- Senior stakeholder management.
- Lead and drive Google Cloud solutioning for customer requirements, RFPs, proposals, and delivery.
- Establish governance frameworks, delivery methodologies, and reusable assets to scale the practice.
- Ability to take initiative and deliver in challenging engagements spread across multiple geos.
- Lead the development of differentiated capabilities and offerings in areas such as application modernization and migration, cloud-native development, and AI agents.
- Collaborate with sales and pre-sales teams to shape solutions and win strategic deals, including large-scale application modernization and migrations.
- Spearhead Google Cloud's latest products and services like AgentSpace, and AI agent development using GCP-native tools such as ADK, the A2A Protocol, and the Model Context Protocol.
- Build and mentor a high-performing team of cloud architects, engineers, and consultants.
- Drive internal certifications, specialization audits, and partner assessments to maintain Google Cloud Partner status.
- Represent the organization in partner forums, webinars, and industry/customer events.

Required Qualifications
- 15+ years of experience in IT, with at least 3 years in Google Cloud applications architecture, design, and solutioning.
- 5+ years of experience in designing and developing Java applications/platforms.
- Deep expertise in GCP services including Compute, Storage, BigQuery, Cloud Functions, Anthos, and Vertex AI.
- Proven experience in leading Google Cloud transformation programs.
- Strong solution architecture and implementation experience for Google Cloud modernization and migration programs.
- Strong experience in stakeholder and client management.
- Google Cloud certified (Google Cloud Professional Architect certification).
- Self-motivated to quickly learn new technologies and platforms.
- Excellent presentation and communication skills.

Preferred Qualifications
- Google Cloud Professional certifications (Cloud Architect).
- Experience with partner ecosystems, co-selling with Google, and managing joint GTM motions.
- Exposure to regulated industries (e.g., BFSI, Healthcare) and global delivery models.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Senior Software Engineer specializing in AI/ML development and leading Vertex AI & Gemini projects, you will play a crucial role in developing and deploying solutions using cutting-edge technologies. With 5-8 years of experience in Software Development/Engineering, you will be responsible for integrating GenAI components into enterprise-grade document automation workflows. Your expertise in Google Cloud Platform, Vertex AI, and Gemini models will be essential in contributing to scalable, cloud-native architectures for document ingestion, extraction, summarization, and transformation. Your future duties and responsibilities include developing and deploying solutions using Vertex AI, Gemini models, Document AI, and custom NLP/OCR components. You will also collaborate with architects, MLOps engineers, and business stakeholders to translate requirements into scalable code while ensuring secure and compliant handling of sensitive data in document processing workflows. Staying up to date with the latest Gemini/LLM advancements and integrating relevant innovations into projects will be a key aspect of your role. In order to be successful in this role, you must possess expertise in various skills including Google Cloud Platform (Vertex AI, Cloud Functions, Cloud Run, BigQuery, Document AI, Firestore), GenAI/LLMs (Google Gemini, PaLM, LangChain), OCR & NLP tools (Tesseract, GCP Document AI, spaCy, Hugging Face Transformers), Full Stack technologies (React or Next.js, Node.js or FastAPI, Firebase/Firestore), DevOps/MLOps practices (GitHub Actions, Vertex Pipelines, Docker, Terraform), and Data & Integration tools (REST APIs, GraphQL, Webhooks, Cloud Pub/Sub, JSON/Protobuf). With a solid background in full-stack development, hands-on experience in building products leveraging GenAI, NLP, and OCR, as well as proficiency in Kubernetes concepts, relational and non-relational databases, you will be well-equipped to tackle complex issues and adapt to rapidly evolving AI technologies. Your understanding of privacy regulations, security best practices, and ethical considerations in AI development will be crucial in developing production-ready systems. Additionally, experience working with Google Gemini models, document parsing, NLP, OCR, and GenAI-based transformation will further enhance your capabilities in this role. As an integral part of the team at CGI, you will have the opportunity to turn meaningful insights into action, shaping your career in a company focused on growth and innovation. With a startup mentality and a strong sense of ownership, you will contribute to delivering innovative solutions and building valuable relationships with teammates and clients, ultimately driving success in the world of IT and business consulting services.,
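Since the role centers on document summarization and transformation with Gemini models on Vertex AI, here is a minimal, hypothetical Python sketch of such a call; the project ID, region, model name, and prompt are placeholder assumptions rather than details from the posting.

```python
# Hypothetical sketch: summarizing extracted document text with a Gemini model on Vertex AI.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project/region

model = GenerativeModel("gemini-1.5-pro")  # model name is an assumption

document_text = "..."  # e.g. text extracted upstream by Document AI / OCR
response = model.generate_content(
    "Summarize the key obligations in this document:\n\n" + document_text
)
print(response.text)
```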
Posted 2 weeks ago
1.0 - 3.0 years
0 Lacs
Yelahanka, Karnataka, India
On-site
Role Description
We are seeking a passionate and skilled Flutter Developer to join our development team. As a Flutter Developer, you will be responsible for designing and building cross-platform mobile applications using Flutter and Dart. You'll work closely with UI/UX designers and backend developers to deliver high-quality mobile experiences and ensure optimal performance and maintainability.

Tech Stack Requirements
Primary:
- Flutter & Dart (for cross-platform app development)
- State management tools like Provider, Riverpod, or Bloc
- RESTful API integration
- Optional Firebase services: Firestore, Firebase Auth, Cloud Functions
Optional/Bonus:
- Knowledge of the MERN stack (MongoDB, Express.js, React.js, Node.js)
- Experience with platform-specific code (Android/iOS native modules)
- Version control using Git

Responsibilities
- Develop robust, responsive, and scalable mobile apps using Flutter
- Collaborate with UI/UX designers to implement intuitive and user-friendly interfaces
- Integrate RESTful APIs and third-party libraries
- Debug, test, and optimize application performance across devices
- Maintain a clean and reusable codebase
- Participate in the app store deployment process (Google Play, App Store)
- Work alongside backend teams and product managers to deliver features on time

Good to Have
- Understanding of the mobile app lifecycle, architecture, and best practices
- Experience with push notifications, in-app purchases, and device APIs
- Willingness to contribute to full-stack features when necessary (MERN stack experience is a plus)
- Familiarity with CI/CD pipelines for mobile apps

Qualifications
- 1-3 years of experience in mobile development using Flutter
- Strong command of Dart and Flutter framework fundamentals
- Experience with state management and API integration
- Familiarity with Git and collaborative development workflows
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience)
- Published apps on the Play Store or App Store are a plus
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
As a Python Solution Architect with over 10 years of experience, you will play a crucial role in designing and implementing scalable, high-performance software solutions that align with business requirements. Your expertise in Python frameworks (e.g., Django, Flask, FastAPI) will be instrumental in architecting efficient applications and microservices architectures. Your responsibilities will include collaborating with cross-functional teams to define architecture, best practices, and oversee the development process. You will be tasked with ensuring that Python solutions meet business goals, align with enterprise architecture, and adhere to security best practices (e.g., OWASP, cryptography). Additionally, your role will involve designing and managing RESTful APIs, optimizing database interactions, and integrating Python solutions seamlessly with third-party services and external systems. Your proficiency in cloud environments (AWS, GCP, Azure) will be essential for architecting solutions and implementing CI/CD pipelines for Python projects. You will provide guidance to Python developers on architectural decisions, design patterns, and code quality, while also mentoring teams on best practices for writing clean, maintainable, and efficient code. Preferred skills for this role include deep knowledge of Python frameworks, proficiency in asynchronous programming, experience with microservices-based architectures, and familiarity with containerization technologies like Docker and orchestration tools like Kubernetes. Your understanding of relational and NoSQL databases, RESTful APIs, cloud services, CI/CD pipelines, and Infrastructure-as-Code tools will be crucial for success in this position. In addition, your experience with security tools and practices, encryption, authentication, data protection standards, and working in Agile environments will be valuable assets. Your ability to communicate complex technical concepts to non-technical stakeholders and ensure solutions address both functional and non-functional requirements will be key to delivering successful projects.,
Posted 2 weeks ago
7.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
We are looking for an experienced GCP Cloud/DevOps and/or OpenShift Engineer to design, implement, and manage cloud infrastructure and services across multiple environments. This role requires deep expertise in Google Cloud Platform (GCP) services, DevOps practices, and Infrastructure as Code (IaC). The candidate will be deploying, automating, and maintaining high-availability systems, and implementing best practices for cloud architecture, security, and DevOps pipelines.

Requirements
- Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field.
- Must have 7+ years of extensive experience in designing, implementing, and maintaining applications on GCP and OpenShift.
- Comprehensive expertise in GCP services such as GKE, Cloud Run, Cloud Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring.
- Solid understanding of cloud security best practices and experience in implementing security controls in GCP.
- Thorough understanding of cloud architecture principles and best practices.
- Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles.
- Proven leadership skills and the ability to mentor and guide a technical team.

Key Responsibilities
Cloud Infrastructure Design and Deployment:
- Architect, design, and implement scalable, reliable, and secure solutions on GCP.
- Deploy and manage GCP services in both development and production environments, ensuring seamless integration with existing infrastructure.
- Implement and manage core services such as BigQuery, Data Fusion, Cloud Composer (Airflow), Cloud Storage, Compute Engine, App Engine, Cloud Functions, and more.
Infrastructure as Code (IaC) and Automation:
- Develop and maintain infrastructure as code using Terraform or CLI scripts to automate provisioning and configuration of GCP resources.
- Establish and document best practices for IaC to ensure consistent and efficient deployments across environments.
DevOps and CI/CD Pipeline Development:
- Create and manage DevOps pipelines for automated build, test, and release management, integrating with tools such as Jenkins, GitLab CI/CD, or equivalent.
- Work with development and operations teams to optimize deployment workflows, manage application dependencies, and improve delivery speed.
Security and IAM Management:
- Handle user and service account management in Google Cloud IAM.
- Set up and manage Secret Manager and Cloud Key Management for secure storage of credentials and sensitive information.
- Implement network and data security best practices to ensure compliance and security of cloud resources.
Performance Monitoring and Optimization:
- Monitoring & Security: Set up observability tools like Prometheus and Grafana, and integrate security tools (e.g., SonarQube, Trivy).
- Networking & Storage: Configure DNS, networking, and persistent storage solutions in Kubernetes.
- Set up monitoring and logging (e.g., Cloud Monitoring, Cloud Logging, Error Reporting) to ensure systems perform optimally.
- Troubleshoot and resolve issues related to cloud services and infrastructure as they arise.
Workflow Orchestration:
- Orchestrate complex workflows using the Argo Workflow Engine.
- Containerization: Work extensively with Docker for containerization and image management.
- Optimization: Troubleshoot and optimize containerized applications for performance and security.

Technical Skills
- Expertise with GCP and OCP (OpenShift) services, including but not limited to Compute Engine, Kubernetes Engine (GKE), BigQuery, Cloud Storage, Pub/Sub, Data Fusion, Airflow, Cloud Functions, and Cloud SQL.
- Proficiency in scripting languages like Python, Bash, or PowerShell for automation.
- Familiarity with DevOps tools and CI/CD processes (e.g., GitLab CI, Cloud Build, Azure DevOps, Jenkins).

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 2 weeks ago
8.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for a Data Solution Architect to join the FC India IT Architecture team. In this role, you will define analytics solutions and guide engineering teams to implement big data solutions on the cloud. The work involves migrating data from legacy on-prem warehouses to the Google Cloud data platform. This role will provide architecture assistance to data engineering teams in India, with key responsibility for supporting applications globally. This role will also drive business adoption of the new platform and the sunset of legacy platforms.

Responsibilities
- Utilize Google Cloud Platform and data services to modernize legacy applications.
- Understand technical business requirements and define architecture solutions that align with Ford Motor & Credit Companies patterns and standards.
- Collaborate and work with global architecture teams to define the analytics cloud platform strategy and build cloud analytics solutions within the enterprise data factory.
- Provide architecture leadership in the design and delivery of the new unified data platform on GCP.
- Understand complex data structures in the analytics space as well as interfacing application systems.
- Develop and maintain conceptual, logical, and physical data models.
- Design and guide product teams on subject areas and data marts to deliver integrated data solutions.
- Provide architectural guidance for optimal solutions considering regional regulatory needs.
- Provide architecture assessments on technical solutions and make recommendations that meet business needs and align with architectural governance and standards.
- Guide teams through the enterprise architecture processes and advise teams on cloud-based design, development, and data mesh architecture.
- Provide advisory and technical consulting across all initiatives including PoCs, product evaluations and recommendations, security, architecture assessments, integration considerations, etc.
- Leverage cloud AI/ML platforms to deliver business and technical requirements.

Qualifications
- Google Professional Solution Architect certification.
- 8+ years of relevant work experience in analytics application and data architecture, with a deep understanding of cloud hosting concepts and implementations.
- 5+ years of experience in data and solution architecture in the analytics space.
- Solid knowledge of cloud data architecture and data modelling principles, and expertise in data modeling tools.
- Experience in migrating legacy analytics applications to a cloud platform and driving business adoption of these platforms to build insights and dashboards, with deep knowledge of traditional and cloud data lake, warehouse, and mart concepts.
- Good understanding of domain-driven design and data mesh principles.
- Experience with designing, building, and deploying ML models to solve business challenges using Python/BQML/Vertex AI on GCP.
- Knowledge of enterprise frameworks and technologies.
- Strong in architecture design patterns, with experience in secure interoperability standards and methods, architecture tools, and processes.
- Deep understanding of traditional and cloud data warehouse environments, with hands-on programming experience building data pipelines on the cloud in a highly distributed and fault-tolerant manner.
- Experience using Dataflow, Pub/Sub, Kafka, Cloud Run, Cloud Functions, BigQuery, Dataform, Dataplex, etc.
- Strong understanding of DevOps principles and practices, including continuous integration and deployment (CI/CD) and automated testing and deployment pipelines.
- Good understanding of cloud security best practices and familiarity with security tools and techniques like Identity and Access Management (IAM), encryption, network security, etc.
- Strong understanding of microservices architecture.

Nice to Have
- Bachelor's degree in computer science/engineering, data science, or a related field.
- Strong leadership, communication, interpersonal, organizing, and problem-solving skills.
- Good presentation skills with the ability to communicate architectural proposals to diverse audiences (user groups, stakeholders, and senior management).
- Experience in the Banking and Financial Regulatory Reporting space.
- Ability to work on multiple projects in a fast-paced and dynamic environment.
- Exposure to multiple, diverse technologies, platforms, and processing environments.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a member of the Platform Observability Engineering team within Ford's Data Platforms and Engineering (DP&E) organization, you will contribute to building and maintaining a top-tier platform for monitoring and observability. This platform focuses on the four golden signals (latency, traffic, errors, and saturation), providing essential data to support operations, root cause analysis, continuous improvement, and cost optimization. You will collaborate with platform architects to help design, develop, and maintain a scalable and reliable platform, ensuring smooth integration with systems used across various teams. Your contributions will be key in improving MTTR and MTTX through increased visibility into system performance. Working with stakeholders, you will integrate observability data into their workflows, develop insightful dashboards and reports, continuously improve platform performance and reliability, optimize costs, and stay updated with industry best practices and technologies. The role focuses on building and maintaining a robust platform rather than developing individual monitoring tools, creating a centralized, reliable source of observability data that empowers data-driven decisions and accelerates incident response across the organization.

Responsibilities:
- Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices supporting real-time and batch processing on GCP.
- Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures for modular, flexible, and maintainable data solutions.
- Full-Stack Integration: Contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
- Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring standardized and optimized data for analytics.
- GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms meeting business needs.
- Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP's native security features.
- Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaboration and Best Practices: Define best practices, design patterns, and frameworks for cloud data engineering by working closely with data architects, software engineers, and cross-functional teams.
- Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.

Qualifications:
- Technical Skills: Proficiency in Java, Angular, or any JavaScript technology, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
- Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context; ability to develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
- Full-Stack Development: Knowledge of front-end and back-end technologies enabling collaboration on data access and visualization layers (e.g., React, Node.js).
- Design and develop RESTful APIs for seamless integration across platform services.
- Implement robust unit and functional tests to maintain high standards of test coverage and quality.
- Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
- Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments.
- CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks.
- Manage code changes with GitHub and troubleshoot and resolve application defects efficiently.
- Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
- Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.

Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
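To make the ingestion responsibilities above concrete, here is a small, hypothetical Python sketch of one possible pipeline step: a Pub/Sub-triggered Cloud Function that streams an observability event into BigQuery. The dataset, table, and field names are illustrative assumptions, not an actual Ford schema.

```python
# Hypothetical Pub/Sub-triggered ingestion step writing a golden-signal event to BigQuery.
import base64
import json

from google.cloud import bigquery

TABLE_ID = "my-project.observability.golden_signals"  # placeholder project/dataset/table

def ingest_event(event: dict, context) -> None:
    """Background Cloud Function handler: decode the Pub/Sub payload and stream it in."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    row = {
        "service": payload.get("service"),
        "signal": payload.get("signal"),        # latency | traffic | errors | saturation
        "value": payload.get("value"),
        "event_time": payload.get("event_time"),
    }
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```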
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Principal Engineer at Walmart's Enterprise Business Services, you will play a pivotal role in shaping the engineering direction, driving architectural decisions, and ensuring the delivery of scalable, secure, and high-performing solutions across the platform. Your responsibilities will include leading the design and development of full stack applications, architecting complex cloud-native systems on Google Cloud Platform (GCP), defining best practices, and guiding engineering excellence. You will have the opportunity to work on crafting frontend experiences, building robust backend APIs, designing cloud infrastructure, and influencing the technical vision of the organization. Collaboration with product, design, and data teams to translate business requirements into scalable tech solutions will be a key aspect of your role. Additionally, you will champion CI/CD pipelines, Infrastructure as Code (IaC), and drive code quality through rigorous design reviews and automated testing. To be successful in this role, you are expected to bring 10+ years of experience in full stack development, with at least 2+ years in a technical leadership or principal engineering role. Proficiency in JavaScript/TypeScript, Python, or Go, along with expertise in modern frontend frameworks like React, is essential. Strong experience in cloud-native systems on GCP, microservices architecture, Docker, Kubernetes, and event-driven systems is required. Your role will also involve managing production-grade cloud systems, working with SQL and NoSQL databases, and staying ahead of industry trends by evaluating new tools and frameworks. Exceptional communication, leadership, and collaboration skills are crucial, along with a GCP Professional Certification and experience with serverless platforms and observability tools. Joining Walmart Global Tech means being part of a team that makes a significant impact on millions of people's lives through innovative technology solutions. You will have the opportunity to work in a flexible, hybrid environment that promotes collaboration and personal development. In addition to a competitive compensation package, Walmart offers various benefits and a culture that values diversity, inclusion, and belonging for all associates. As an Equal Opportunity Employer, Walmart fosters a workplace where unique styles, experiences, and identities are respected and valued, creating a welcoming environment for all.,
Posted 3 weeks ago
15.0 - 22.0 years
0 Lacs
Karnataka
On-site
You will be responsible for solution design, client engagement, and delivery oversight, along with senior stakeholder management. Your role will involve leading and driving Google Cloud solutioning for customer requirements, RFPs, proposals, and delivery. You will establish governance frameworks, delivery methodologies, and reusable assets to scale the practice. It is essential to have the ability to take initiative and deliver in challenging engagements spread across multiple geographies. Additionally, you will lead the development of differentiated capabilities and offerings in areas such as application modernization & migration, cloud-native development, and AI agents. Collaboration with sales and pre-sales teams to shape solutions and win strategic deals, including large-scale application modernization and migrations will be a key aspect of your role. You will spearhead Google Cloud's latest products & services like AgentSpace, AI Agent development using GCP-native tools such as ADK, A2A Protocol, and Model Context Protocol. Building and mentoring a high-performing team of cloud architects, engineers, and consultants will also be part of your responsibilities. Driving internal certifications, specialization audits, and partner assessments to maintain Google Cloud Partner status is crucial. Representing the organization in partner forums, webinars, and industry/customer events is also expected from you. To qualify for this role, you should have 15+ years of experience in IT, with at least 3 years in Google Cloud applications architecture, design, and solutioning. Additionally, a minimum of 5 years of experience in designing and developing Java applications/platforms is required. Deep expertise in GCP services including Compute, Storage, BigQuery, Cloud Functions, Anthos, and Vertex AI is essential. Proven experience in leading Google Cloud transformation programs, strong solution architecture, and implementation experience for Google Cloud modernization & migration programs are important qualifications. Strong experience in stakeholder and client management is also necessary. Being Google Cloud certified with the Google Cloud Professional architect certification, self-motivated to quickly learn new technologies and platforms, and possessing excellent presentation and communication skills are crucial for this role. Preferred qualifications include Google Cloud Professional certifications (Cloud Architect), experience with partner ecosystems, co-selling with Google, and managing joint GTM motions, as well as exposure to regulated industries (e.g., BFSI, Healthcare) and global delivery models.,
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Haryana
On-site
You will be responsible for designing and building sophisticated and highly scalable apps using Flutter. Your role will involve building custom packages in Flutter by utilizing functionalities and APIs available in native Android and iOS. It will be your responsibility to translate designs and wireframes into high-quality responsive UI code. You will explore feasible architectures for implementing new features and ensure that best practices are followed throughout the development process. Keeping everything structured and well-documented will be crucial. Managing the code and project on Git is essential to keep in sync with other team members and managers. You will be required to communicate with the Project Manager regarding project status and suggest appropriate deadlines for new functionalities. Ensuring that security guidelines are always followed while developing the app is of utmost importance. Your role will involve maintaining software through the product lifecycle, including design, development, verification, and bug fixes. Performing time profiling and memory leak assessment will be part of your responsibilities. Your expertise in Flutter will be utilized to build cross-platform mobile apps for Android, iOS, and Web. This will include creating responsive UIs to efficiently query data and manage states in an optimized manner. Experience with Firebase, specifically Cloud Firestore, Push Notifications, Cloud Functions, and Analytics, will be required. Proficiency in Adobe XD is necessary to utilize design files and build the app accordingly. Git will be used to manage and collaborate on different projects with the rest of the team. Experience with continuous integration and deploying apps to cloud platforms such as AWS, Azure, or others will be beneficial for this role.
Posted 3 weeks ago
0.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
To develop a secure, low-latency Google Home integration system that connects voice commands to Firebase Realtime Database operations, enabling users to control smart devices like the Neon SmartPlug through natural speech.

Scope of Work:

1. Google Assistant Integration
- Create an Action on Google project (using Dialogflow or the latest Actions SDK / Google Smart Home platform).
- Enable account linking via OAuth2 / Firebase Authentication OTP verification.
- Implement voice intents for: turning devices ON/OFF, setting schedules or timers, fetching the status of a device, and custom interactions (e.g. "Is my plug on?").

2. Firebase Realtime Database Integration
- Sync device states in Firebase Realtime Database.
- Set up a secure and cost-efficient data structure.
- Implement optimized Cloud Functions for: intent fulfillment, updating device state, fetching real-time status, and logging user actions (optional analytics).

3. Cloud Functions (Node.js / TypeScript) and Firebase Realtime Database Integration
- Write backend code to: parse and respond to Assistant requests, validate user sessions (via uid and linked identity), prevent race conditions with concurrent writes, and handle fallback or unknown commands.

4. Firebase Security & User Validation
- Define Firebase Rules to restrict read/write based on: user uid, device ownership, and action scope.
- Ensure cross-user access is completely blocked.
- Implement access token validation.

5. Multi-user & Multi-device Support
- Support simultaneous sessions.
- Structure DB nodes for each user with isolation: /users/uid/devices/device_id/status

6. Testing & Validation
- Unit test Cloud Functions.
- Test integration with multiple Google Accounts, and with Google Home and Android devices.
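As a rough illustration of the per-user node isolation described above (/users/uid/devices/device_id/status), here is a hypothetical sketch using the Firebase Admin SDK for Python; the database URL and helper names are placeholders, and per the scope the production fulfillment code would be written as Node.js/TypeScript Cloud Functions.

```python
# Hypothetical sketch of the per-user device-state layout in the Realtime Database.
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.ApplicationDefault()
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://example-project-default-rtdb.firebaseio.com",  # placeholder
})

def set_device_status(uid: str, device_id: str, is_on: bool) -> None:
    """Write a device state only under the owning user's node."""
    ref = db.reference(f"/users/{uid}/devices/{device_id}/status")
    ref.set({"on": is_on})

def get_device_status(uid: str, device_id: str):
    """Read back the state for an 'Is my plug on?' style query."""
    return db.reference(f"/users/{uid}/devices/{device_id}/status").get()
```

Keeping every read and write keyed by the owning uid mirrors the security rules the scope calls for, where cross-user access is blocked at the database level.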
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have 6-10 years of experience in development, specifically in Java/J2EE, with a strong knowledge of core Java. Additionally, you must be proficient in Spring frameworks, particularly in Spring MVC, Spring Boot, and JPA + Hibernate. It is essential to have hands-on experience with Microservice technology, including development of RESTFUL and SOAP Web Services. A good understanding of Oracle DB is required. Your communication skills, especially when interacting with clients, should be excellent. Experience in building tools like Maven, deployment, and troubleshooting issues is necessary. Knowledge of CI/CD tools such as Jenkins and experience with GIT or similar source control tools is expected. You should also be familiar with Agile/Scrum software development methodologies using tools like Jira, Confluence, and BitBucket and have performed Requirement Analysis. It would be beneficial to have knowledge of frontend stacks like React or Angular, as well as frontend and backend API integration. Experience with AWS, CI/CD best practices, and designing security reference architectures for AWS Infrastructure Applications is advantageous. You should possess good verbal and written communication skills, the ability to multitask in a fast-paced environment, and be highly organized and detail-oriented. Awareness of common information security principles and practices is required. TELUS International is committed to creating a diverse and inclusive workplace and is an equal opportunity employer. All employment decisions are based on qualifications, merits, competence, and performance without regard to any characteristic related to diversity.,
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
kolkata, west bengal
On-site
As a Solution Architect & Technical Lead at RebusCode, you will play a crucial role in driving the design and architecture of our Big Data Analytics solutions within the market research industry. Your responsibilities will include providing technical leadership, ensuring governance, documenting solutions, and sharing knowledge effectively, as well as managing projects and ensuring their timely delivery. To excel in this role, you should have a minimum of 5 years of experience in software development, of which at least 2 years should be in architecture or technical leadership positions. A proven track record of delivering enterprise-grade, cloud-native SaaS applications on Azure and/or GCP is essential. Your technical skills should span Cloud & Infrastructure (Azure App Services, Functions, Kubernetes; GKE, Cloud Functions; Service Bus, Pub/Sub; Blob Storage, Cloud Storage; Key Vault, Secret Manager; CDN), Development Stack (C#/.NET 6/7/8, ASP.NET Core Web API, Docker, container orchestration), Data & Integration (SQL Server, Oracle, Cosmos DB, Spanner, BigQuery, ETL patterns, message-based integration), CI/CD & IaC (Azure DevOps, Cloud Build, GitHub Actions; ARM/Bicep, Terraform; container registries, automated testing), Security & Compliance (TLS/SSL certificate management, API gateway policies, encryption standards), and Monitoring & Performance (Azure Application Insights, Log Analytics, Stackdriver, performance profiling, load-testing tools). Nice-to-have qualifications include certifications such as Azure Solutions Architect Expert, Google Professional Cloud Architect, PMP, or PMI-ACP. Familiarity with front-end frameworks like Angular and React, as well as API client SDK generation, would be an added advantage. Prior experience building low-code/no-code integration platforms or automation engines is also beneficial, and exposure to alternative clouds like AWS or to on-prem virtualization platforms such as VMware and OpenShift is a plus. Join us at RebusCode, where you will have the opportunity to work on cutting-edge Big Data Analytics solutions and contribute to the growth and success of our market research offerings.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Google Cloud Engineer at our company, you will play a crucial role in designing, building, deploying, and maintaining our cloud infrastructure and applications on Google Cloud Platform (GCP). Your collaboration with development, operations, and security teams will ensure that our cloud environment is scalable, secure, highly available, and cost-optimized. If you are enthusiastic about cloud-native technologies, automation, and overcoming intricate infrastructure challenges, we welcome you to apply.

Your responsibilities will include:
- Designing, implementing, and managing robust, scalable, and secure cloud infrastructure on GCP utilizing Infrastructure as Code (IaC) tools like Terraform.
- Deploying, configuring, and managing core GCP services such as Compute Engine, Kubernetes Engine (GKE), Cloud SQL, Cloud Storage, Cloud Functions, BigQuery, Pub/Sub, and networking components.
- Developing and maintaining CI/CD pipelines for automated deployment and release management.
- Implementing and enforcing security best practices within the GCP environment, including IAM, network security, data encryption, and compliance adherence.
- Monitoring cloud infrastructure and application performance, identifying bottlenecks, and implementing optimization solutions.
- Troubleshooting and resolving complex infrastructure and application issues in production and non-production environments.
- Collaborating with development teams to ensure cloud-native deployment, scalability, and resilience of applications.
- Participating in on-call rotations for critical incident response and timely resolution of production issues.
- Creating and maintaining comprehensive documentation for cloud architecture, configurations, and operational procedures.
- Keeping up to date with new GCP services, features, and industry best practices to propose and implement improvements.
- Contributing to cost-optimization efforts by identifying and implementing efficiencies in cloud resource utilization.

We require you to have:
- A Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- 6+ years of experience with C#, .NET Core, .NET Framework, MVC, Web API, Entity Framework, and SQL Server.
- 3+ years of experience with cloud platforms, preferably GCP, including designing and deploying cloud-native applications.
- 3+ years of experience with source code management, CI/CD pipelines, and Infrastructure as Code.
- Strong experience with JavaScript and a modern JavaScript framework, with Vue.js preferred.
- Proven leadership and mentoring skills with development teams.
- A strong understanding of microservices architecture and serverless computing.
- Experience with relational databases such as SQL Server and PostgreSQL.
- Excellent problem-solving, analytical, and communication skills, along with experience in Agile/Scrum environments.

What can make you stand out:
- GCP cloud certification.
- UI development experience with HTML, JavaScript, Angular, and Bootstrap.
- Agile environment experience with Scrum and XP.
- Relational database experience with SQL Server and PostgreSQL.
- Proficiency in Atlassian tools like Jira and Confluence, as well as GitHub.
- Working knowledge of Python, exceptional problem-solving and analytical abilities, and strong teamwork skills.
Posted 3 weeks ago
5.0 - 13.0 years
0 Lacs
pune, maharashtra
On-site
You are a highly skilled and experienced Cloud Architect/Engineer with deep expertise in Google Cloud Platform (GCP). Your primary responsibility is to design, build, and manage scalable, reliable cloud infrastructure on GCP, leveraging services such as Compute Engine, Cloud Run, BigQuery, Pub/Sub, Cloud Functions, Dataflow, Dataproc, IAM, and Cloud Storage to deliver high-performance cloud solutions. Your role also includes developing and maintaining CI/CD pipelines, automating infrastructure deployment using Infrastructure as Code (IaC) principles, and implementing best practices in cloud security, monitoring, performance tuning, and logging. Collaboration with cross-functional teams to deliver cloud solutions aligned with business objectives is essential. You should have 5+ years of hands-on experience in cloud architecture and engineering, with at least 3 years of practical experience on Google Cloud Platform, and in-depth expertise in the GCP services listed above. A strong understanding of networking, security, containerization (Docker, Kubernetes), and CI/CD pipelines is essential. Experience with monitoring, performance tuning, and logging in cloud environments is preferred, and familiarity with DevSecOps practices and tools such as HashiCorp Vault is a plus. As a GCP Cloud Architect/Engineer, you will also contribute to system reliability, backup, and disaster recovery strategies. This hybrid role is based out of Pune and requires a total of 10 to 13 years of relevant experience.
Posted 3 weeks ago