4.0 - 8.0 years
10 - 18 Lacs
Hyderabad
Hybrid
About the Role:
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
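The "data quality checks and validation rules" this role asks for often start as a small rule engine applied per record. A minimal pure-Python sketch (the field names and rules are illustrative assumptions, not from any real schema):

```python
# Minimal data-quality validation sketch: each rule returns an error
# string, or None when the record passes; validate_row collects all
# failures for one record. Field names ("order_id", "amount") are
# invented for illustration.

def require(field):
    def rule(row):
        if row.get(field) in (None, ""):
            return f"{field} is missing"
    return rule

def non_negative(field):
    def rule(row):
        value = row.get(field)
        if value is not None and value < 0:
            return f"{field} must be >= 0"
    return rule

RULES = [require("order_id"), non_negative("amount")]

def validate_row(row):
    """Return the list of validation errors for one record."""
    return [err for rule in RULES if (err := rule(row)) is not None]

good = {"order_id": "A-1", "amount": 10.5}
bad = {"order_id": "", "amount": -3}
print(validate_row(good))  # []
print(validate_row(bad))   # ['order_id is missing', 'amount must be >= 0']
```

In a real pipeline the failing rows would typically be routed to a quarantine table and surfaced through monitoring rather than printed.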
Posted 11 hours ago
7.0 - 9.0 years
8 - 10 Lacs
Navi Mumbai
Work from Office
Develop scalable APIs and backend systems for AI-powered apps. Handle authentication, content automation, language handling, media streaming, and AI API integrations. Collaborate with mobile and AI teams in a fast-paced environment.
Posted 3 days ago
7.0 - 9.0 years
10 - 12 Lacs
Navi Mumbai
Hybrid
Build high-performance mobile apps using Flutter. Integrate AI APIs, manage backend tasks, and handle native module development. Work in a fast-paced setup with full ownership of features and cross-functional collaboration.
Posted 3 days ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & Responsibilities

Key Skills:
- 3 years of experience building modern applications utilizing GCP services such as Cloud Build, Cloud Functions/Cloud Run, GKE, Logging, GCS, CloudSQL, and IAM.
- Primary proficiency in Python and experience with a secondary language such as Golang or Java.
- In-depth knowledge and hands-on experience with GKE/Kubernetes.
- High emphasis on software engineering fundamentals such as code and configuration management, CI/CD/automation, and automated testing.
- Working with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable solutions.
- Experience working and delivering solutions in a complex enterprise environment.
- Proficiency in designing and developing scalable, decoupled microservices, and adeptness in implementing event-driven architecture to ensure seamless and responsive service interactions.
- Proficiency in designing scalable and robust solutions leveraging cloud-native technologies and architectures.
- Expertise in managing diverse stakeholder expectations and prioritizing tasks to align with strategic objectives and deliver optimal outcomes.

Good to Have (Knowledge, Skills, and Experience):
- Ability to integrate Kafka to handle real-time data.
- Proficiency in monitoring tools.
- Experience using Robot Framework for automated UAT.
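The event-driven architecture this posting emphasizes boils down to publishers and subscribers decoupled through topics. A minimal in-memory sketch of the idea (in production this would be Kafka or Pub/Sub; the topic name and event shape are made up for illustration):

```python
# Minimal in-memory event bus illustrating decoupled, event-driven
# service interactions: publishers never call subscribers directly,
# they only emit events on a named topic.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
audit_log = []
# Two independent consumers of the same event, unaware of each other.
bus.subscribe("order.created", lambda e: audit_log.append(e["id"]))
bus.subscribe("order.created", lambda e: print("notify:", e["id"]))
bus.publish("order.created", {"id": 42})
print(audit_log)  # [42]
```

The same shape carries over to Kafka: topics replace the dict keys, and consumer groups replace the handler list.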
Posted 3 days ago
5.0 - 8.0 years
5 - 8 Lacs
Mumbai, Maharashtra, India
On-site
Key Accountabilities
- Design, create, code, and support a variety of GCP, ETL, and SQL solutions.
- Apply agile techniques or methods to project execution.
- Collaborate effectively in a distributed global team environment.
- Communicate technical concepts effectively with business stakeholders and influence decision-making.
- Analyze existing processes and development requirements to enhance efficiency.
- Manage multiple stakeholders and tasks, navigating ambiguity and complexity.
- Translate business needs into insights by collaborating with architects, solution managers, and analysts.
- Maintain strong technical skills and share knowledge within the team.
- Resolve issues by working with system users, the IT department, vendors, and service providers.
- Support existing data warehouse jobs and related processes.
- Utilize task/job scheduling tools like Talend, Tidal, Airflow, and Linux.
- Lead small projects/initiatives and contribute to enterprise implementations.
- Proactively research modern development technologies and techniques.
- Foster a continuous-improvement and automation mindset to streamline processes.
- Train internal teams, IT functions, and business users.
- Be familiar with real-time and streaming data processes.

Minimum Qualifications
- 5-8 years of relevant experience as a Data Engineer or in a similar role.
- Hands-on experience with modern cloud data engineering services.
- Understanding of the SAP landscape and data governance tools.
- Basic understanding of cybersecurity requirements.
- Excellent communication, analytical, and stakeholder management skills.

Skill Proficiency
- Expert level: SQL, Python, data warehousing concepts
- Intermediate level: GCP (Cloud Storage, modeling, real-time), BigQuery, S3/Blob Storage, Composer, Cloud Functions (Lambda/Azure Functions), dbt
- Basic level / preferred: data modeling concepts

Preferred Qualifications
- GCP Data Engineer certification
- Understanding of the CPG (Consumer Packaged Goods) industry
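The job-scheduling tools named here (Talend, Tidal, Airflow) all revolve around one core idea: tasks execute only after their upstream dependencies complete. A tiny stand-in for that dependency ordering, using Python's standard library (the task names are illustrative, not from the posting):

```python
# Sketch of the dependency ordering behind schedulers like Airflow:
# a DAG maps each task to the set of tasks it depends on, and
# static_order yields a valid execution order.
from graphlib import TopologicalSorter

dag = {"transform": {"extract"}, "load": {"transform"}}
log = []
tasks = {name: (lambda n=name: log.append(n))
         for name in ("extract", "transform", "load")}

for name in TopologicalSorter(dag).static_order():
    tasks[name]()  # each task runs only after its upstreams

print(log)  # ['extract', 'transform', 'load']
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this same ordering.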
Posted 4 days ago
5.0 - 12.0 years
5 - 12 Lacs
Mumbai, Maharashtra, India
On-site
KEY ACCOUNTABILITIES
- Design, create, code, and support a variety of data pipelines and models on any cloud technology (GCP preferred).
- Partner with business analysts, architects, and other key project stakeholders to deliver business initiatives.
- Seek to learn new skills, mentor newer team members, build domain expertise, and document processes.
- Actively build knowledge of D&T resources, people, and technology.
- Participate in the evaluation, implementation, and deployment of emerging tools and processes in the big data space.
- Collaboratively troubleshoot technical and performance issues in the data space.
- Lean into ambiguity and partner with others to find solutions.
- Identify opportunities to contribute work to the broader GMI data community.
- Manage multiple stakeholders and tasks, navigating ambiguity and complexity.
- Lead small projects/initiatives and contribute effectively to the implementation of enterprise projects.
- Support existing data warehouses and related jobs.
- Familiarity with real-time and streaming data processes.
- Proactively research up-to-date technologies and techniques for development.
- Embrace an automation and continuous-improvement mindset to streamline and eliminate waste in all processes.

MINIMUM QUALIFICATIONS
- Identified as the technical/project lead for global projects.
- Actively coaches and mentors a team of developers.
- Proactively identifies potential issues, deadline slippage, and opportunities in projects/tasks, and takes timely decisions.
- Strong attention to detail and delivery accuracy.
- Collaborates with business stakeholders and develops strong working relationships.
- Self-motivated team player with the ability to overcome challenges and achieve desired results.
- 5-12 years of total experience in the ETL/data space, with at least 2+ years of relevant experience in the cloud space.
- Excellent communication skills, verbal and written.
- Excellent analytical skills.

Skill Proficiency
- Expert level: Cloud (storage, modelling, real-time; GCP preferred), data storage (S3/Blob Storage), BigQuery, SQL, Composer, Cloud Functions (Lambda/Azure Functions), data warehousing concepts
- Intermediate level: Python, Kafka, Pub/Sub
- Basic level: dbt

PREFERRED QUALIFICATIONS
- GCP Data Engineer certification or other GCP certification
- Understanding of the CPG industry
Posted 4 days ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About VOIS:
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and more. Established in 2006, VOIS has evolved into a global, multi-functional organisation and a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India:
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, VOIS India supports the global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.

Job Description

Role purpose:
- Create detailed data architecture documentation, including data models, data flow diagrams, and technical specifications.
- Create and maintain data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis.
- Design and implement data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems.
- Collaborate with business stakeholders to define the overall data strategy, aligning data needs with business requirements.
- Support migration of new and changed software; elaborate and perform production checks.
- Effectively communicate complex data concepts to both technical and non-technical stakeholders.
- GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions.
- Strong communicator, experienced in leading and negotiating decisions and effective outcomes.
- Strong overarching data architecture knowledge and experience, with the ability to govern application of architecture principles within projects.

VOIS Equal Opportunity Employer Commitment, India:
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion; Top 50 Best Workplaces for Women; Top 25 Best Workplaces in IT & IT-BPM; and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
Posted 5 days ago
1.0 - 6.0 years
1 - 6 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
We are looking for a Senior Flutter & Firebase Developer with expertise in full-stack mobile development. The ideal candidate should be skilled in building scalable, secure applications while leading technical decisions and ensuring UI excellence.

Primary Skills (Must-Have)
- Flutter/Dart: Strong expertise in the Flutter framework and Dart programming.
- State Management: Proficiency in BLoC/Redux for managing app states.
- Firebase Services: Experience with Firestore, Authentication, Cloud Functions, and Realtime Database.
- Cloud Architecture: Ability to design scalable and secure backend solutions.
- Security Implementation: Knowledge of app security best practices.
- UI/UX Mastery: Ability to craft pixel-perfect, highly responsive UI.
- Technical Leadership: Experience leading teams and making architectural decisions.

Secondary Skills (Good-to-Have)
- RESTful APIs & GraphQL
- Backend development (Node.js/Python/Go)
- CI/CD pipelines & DevOps
- Docker & Kubernetes
- Performance optimization & debugging
- Unit & integration testing

Role & Responsibilities
- Develop and maintain high-performance mobile applications using Flutter.
- Design, architect, and implement scalable Firebase-based solutions.
- Ensure security best practices and optimize app performance.
- Lead technical discussions and mentor junior developers.
- Collaborate with cross-functional teams for seamless development.
Posted 5 days ago
4.0 - 6.0 years
5 - 8 Lacs
Gurugram
Work from Office
Required Skills:
- Strong expertise in the NestJS framework.
- Proficient in building and managing microservices architecture.
- Hands-on experience with Apache Kafka for real-time data streaming and messaging.
- Experience with Google Cloud Platform (GCP) services, including but not limited to Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Kubernetes Engine.
- Familiarity with RESTful APIs, database systems (SQL/NoSQL), and performance optimization.
- Solid understanding of version control systems, particularly Git.

Preferred Skills:
- Knowledge of containerization using Docker.
- Experience with automated testing frameworks and methodologies.
- Understanding of monitoring, logging, and observability tools and practices.

Responsibilities:
- Design, develop, and maintain backend services using NestJS within a microservices architecture.
- Implement robust messaging and event-driven architectures using Kafka.
- Deploy, manage, and optimize applications and services on Google Cloud Platform.
- Ensure high performance, scalability, reliability, and security of backend services.
- Collaborate closely with front-end developers, product managers, and DevOps teams.
- Write clean, efficient, and maintainable code, adhering to best practices and coding standards.
- Perform comprehensive testing and debugging, addressing production issues promptly.

Note: This is an office-based role in Gurgaon, and the selected candidate needs to have their own laptop.
Posted 1 week ago
4.0 - 6.0 years
14 - 24 Lacs
Hyderabad
Hybrid
Job Overview:
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:
- Excellent problem-solving and analytical skills.
- Strong communication skills and ability to collaborate in a team environment.
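The Python ETL pipelines this role centers on are often structured as a chain of generators: extract yields raw rows, transform cleans them, load writes them out. A minimal sketch (the rows, cleansing rules, and in-memory sink are invented stand-ins; in production the sink would be BigQuery or GCS):

```python
# Minimal extract -> transform -> load sketch using generators,
# the shape a Python ETL job often takes before being ported to
# Dataflow or orchestrated by Cloud Composer.

def extract():
    # Stand-in for reading from a source system or GCS file.
    yield {"name": " Ada ", "city": "hyderabad"}
    yield {"name": "Grace", "city": "PUNE"}

def transform(rows):
    # Cleansing logic: trim whitespace, normalize city casing.
    for row in rows:
        yield {"name": row["name"].strip(), "city": row["city"].title()}

def load(rows, sink):
    for row in rows:
        sink.append(row)  # in production: a BigQuery insert or GCS write

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

Because each stage is a generator, rows stream through one at a time instead of materializing the whole dataset in memory.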
Posted 1 week ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
Key Responsibilities:

Cloud Infrastructure Management:
- Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
- Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse and Apache Superset tool onboarding, and other GCP services.

Kubernetes and Containerization:
- Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
- Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.

CI/CD Pipelines:
- Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
- Automate deployment workflows for containerized and serverless applications.

Security and Compliance:
- Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
- Conduct regular audits to ensure compliance with organizational and regulatory standards.

Collaboration and Support:
- Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
- Provide support for troubleshooting and resolving infrastructure-related issues.

Cost Optimization:
- Monitor and optimize GCP resource usage to ensure cost efficiency.
- Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications:

Certifications:
- Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.

Cloud Expertise:
- Strong hands-on experience with Google Cloud Platform (GCP) services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.

DevOps Tools:
- Proficiency in DevOps tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
- Experience with containerization tools like Docker.

Kubernetes Expertise:
- In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets.
- Familiarity with Kubernetes tools like kubectl, Helm, and Kustomize.

Programming and Scripting:
- Strong scripting skills in Python, Bash, or Go.
- Familiarity with YAML and JSON for configuration management.

Monitoring and Logging:
- Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.

Networking:
- Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.

Soft Skills:
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration abilities.
- Ability to work in an agile, fast-paced environment.
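The "YAML and JSON for configuration management" skill often shows up in practice as a pre-deploy sanity check on a config file. A minimal sketch of such a check in Python (the required keys and limits are illustrative assumptions, not a real deployment schema):

```python
# Sketch of a pre-deploy JSON config sanity check of the kind a
# CI/CD pipeline might run before applying a deployment.
import json

REQUIRED = {"service", "image", "replicas"}

def check_config(text):
    """Return a list of problems found in a JSON config string."""
    cfg = json.loads(text)
    missing = REQUIRED - cfg.keys()
    errors = [f"missing key: {k}" for k in sorted(missing)]
    if isinstance(cfg.get("replicas"), int) and cfg["replicas"] < 1:
        errors.append("replicas must be >= 1")
    return errors

ok = '{"service": "api", "image": "api:1.2", "replicas": 3}'
bad = '{"service": "api", "replicas": 0}'
print(check_config(ok))   # []
print(check_config(bad))  # ['missing key: image', 'replicas must be >= 1']
```

Failing the pipeline on a non-empty error list catches misconfigurations before they reach a cluster.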
Posted 2 weeks ago
6.0 - 10.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Role & Responsibilities

Role Overview:
We are seeking a talented and forward-thinking DevOps Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing, implementing, and maintaining CI/CD pipelines, monitoring system performance, automating deployments, ensuring infrastructure scalability and security, collaborating with development and IT teams, and optimizing workflow efficiency.

Technical Requirements:
- Experienced in setting and delivering DevOps strategy.
- Proficient in collaborating with engineering teams to understand their needs.
- Skilled in setting up, maintaining, optimizing, and evolving DevOps tooling and infrastructure.
- Strong knowledge of automating development, quality engineering, deployment, and release processes.
- Familiarity with Agile and Waterfall methodologies and supporting toolchains.
- Ability to identify technical problems and develop effective solutions.
- Hands-on experience with a variety of technologies, including Git, Kubernetes, Docker, Jenkins, and scripting/programming languages.
- Competence in implementing DevOps and Agile patterns such as CI/CD pipelines, source code management, automation, and infrastructure as code.
- Understanding of IT management practices, software currency, and security measures.
- Experience with GCP infrastructure, Terraform, and Harness for CI/CD automation and deployments.
- Proficiency in team leadership, communication, and problem-solving.

Functional Requirements:
- Demonstrated team leadership and DevOps experience.
- Exposure to GCP infrastructure including Compute Engine, VPC, IAM, Cloud Functions, and GKE.
- Hands-on experience with various DevOps technologies such as Git, Kubernetes, Docker, Jenkins, SonarQube, and scripting/programming languages.
- Strong organizational, time management, and multitasking skills.
- Ability to work collaboratively, build relationships, and adapt to various domains and disciplines.
- Passion for developing new technologies and optimizing software delivery processes.
- Understanding of security compliance, networking, and firewalls.
- Willingness to learn, grow, and develop within a supportive and inclusive environment.
- Ability to propose new technologies and methodologies for software delivery optimization.

This role offers a compelling opportunity for a seasoned DevOps Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering graduate / postgraduate

Criteria:
- Helm experience
- Networking and security (firewalls, IAM roles) experience
- Security compliance understanding

Relevant Experience: 6-9 years
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Hybrid
Key Responsibilities:
1. Cloud Infrastructure Management:
   - Design, deploy, and manage scalable and secure infrastructure on Google Cloud Platform (GCP).
   - Implement best practices for GCP IAM, VPCs, Cloud Storage, ClickHouse and Apache Superset tool onboarding, and other GCP services.
2. Kubernetes and Containerization:
   - Manage and optimize Google Kubernetes Engine (GKE) clusters for containerized applications.
   - Implement Kubernetes best practices, including pod scaling, resource allocation, and security policies.
3. CI/CD Pipelines:
   - Build and maintain CI/CD pipelines using tools like Cloud Build, Stratus, GitLab CI/CD, or ArgoCD.
   - Automate deployment workflows for containerized and serverless applications.
4. Security and Compliance:
   - Ensure adherence to security best practices for GCP, including IAM policies, network security, and data encryption.
   - Conduct regular audits to ensure compliance with organizational and regulatory standards.
5. Collaboration and Support:
   - Work closely with development teams to containerize applications and ensure smooth deployment on GCP.
   - Provide support for troubleshooting and resolving infrastructure-related issues.
6. Cost Optimization:
   - Monitor and optimize GCP resource usage to ensure cost efficiency.
   - Implement strategies to reduce cloud spend without compromising performance.

Required Skills and Qualifications:
1. Certifications:
   - Must hold a Google Cloud Professional DevOps Engineer or Google Cloud Professional Cloud Architect certification.
2. Cloud Expertise:
   - Strong hands-on experience with Google Cloud Platform (GCP) services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub.
3. DevOps Tools:
   - Proficiency in DevOps tools like Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
   - Experience with containerization tools like Docker.
4. Kubernetes Expertise:
   - In-depth knowledge of Kubernetes concepts such as pods, deployments, services, ingress, config maps, and secrets.
   - Familiarity with Kubernetes tools like kubectl, Helm, and Kustomize.
5. Programming and Scripting:
   - Strong scripting skills in Python, Bash, or Go.
   - Familiarity with YAML and JSON for configuration management.
6. Monitoring and Logging:
   - Experience with monitoring tools like Prometheus, Grafana, or Google Cloud Operations Suite.
7. Networking:
   - Understanding of cloud networking concepts, including VPCs, subnets, firewalls, and load balancers.
8. Soft Skills:
   - Strong problem-solving and troubleshooting skills.
   - Excellent communication and collaboration abilities.
   - Ability to work in an agile, fast-paced environment.
Posted 2 weeks ago
8.0 - 13.0 years
8 - 13 Lacs
Mumbai, Maharashtra, India
On-site
Key Responsibilities
- Design & Implementation: Design and implement robust, scalable data engineering pipelines within GCP, specifically tailored for web analytics.
- Data Integration: Integrate a variety of marketing and enterprise data sources, including AWS Aurora, SAP, and Salesforce, ensuring seamless data flow and accessibility.
- End-to-End Project Execution: Execute at least 1-2 end-to-end data engineering projects, from conceptualization to deployment, demonstrating strong project management and technical prowess.
- Scripting & Optimization: Employ advanced SQL, Python, and Bash scripting to optimize data processing, analysis, and automation tasks.
- Cross-functional Collaboration: Collaborate with cross-functional teams to identify data needs, design effective solutions, and enhance data-driven decision-making capabilities.
- Data Integrity & Security: Maintain and ensure the integrity and reliability of data pipelines, implementing best practices in data security and compliance.

Qualifications
- Education: Bachelor's/Master's in computer or allied STEM engineering fields (BS in Mathematics, Statistics, Computer Science, Engineering, Data Science, Analytics, or a related field preferred).
- Experience: 4-6 years of relevant, proven experience as a Data Engineer, with a specific focus on web analytics data engineering pipelines.
- Cloud Proficiency (GCP & AWS): Hands-on experience with GCP and AWS, including their respective data processing and analytics services.
  - GCP services: Pub/Sub, Dataflow, BigQuery, Cloud Functions, Cloud Run, Bigtable, etc.
  - AWS services: Glue, Redshift, S3, SQS, Aurora, etc.
- API Integration: Strong proficiency in integrating data from diverse sources using APIs.
- Scripting Expertise: Advanced knowledge of SQL, Python, and Bash scripting is essential.
- Project Execution: Demonstrated ability to execute end-to-end projects, showcasing effective project management and technical skills.
- Problem-Solving: Excellent problem-solving abilities, with a keen eye for detail and a commitment to high-quality outcomes.
- Communication: Strong communication skills, with the ability to convey complex technical concepts in a clear, concise manner.
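A core transform in the web-analytics pipelines this role describes is sessionization: grouping a user's page-view events into sessions whenever the gap between events exceeds a threshold. A minimal Python sketch (the 30-minute threshold, event shape, and user IDs are common conventions assumed for illustration):

```python
# Sketch of sessionizing web-analytics events: a new session starts
# when a user's gap since their previous event exceeds SESSION_GAP.
SESSION_GAP = 30 * 60  # 30 minutes, in seconds

def sessionize(events):
    """events: [(user, ts), ...] sorted by ts; returns sessions per user."""
    sessions, last_seen = {}, {}
    for user, ts in events:
        if user not in last_seen or ts - last_seen[user] > SESSION_GAP:
            sessions[user] = sessions.get(user, 0) + 1
        last_seen[user] = ts
    return sessions

# u1: events at 0s and 600s (one session), then 4000s (gap > 30 min).
events = [("u1", 0), ("u1", 600), ("u2", 0), ("u1", 4000)]
print(sessionize(events))  # {'u1': 2, 'u2': 1}
```

At scale the same logic runs as a windowed aggregation in Dataflow or a `SESSION` window in SQL rather than a single-machine loop.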
Posted 2 weeks ago
5.0 - 10.0 years
6 - 12 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines for high-volume data processing.
- Optimize and automate data ingestion, transformation, and storage workflows.
- Handle both structured and unstructured data sources, ensuring data quality and consistency.
- Develop and maintain data models, data warehouses, and databases.
- Collaborate with cross-functional teams to support and enable data-driven decision-making.
- Ensure data security, privacy, and compliance with industry and regulatory standards.
- Troubleshoot and resolve data-related issues promptly and efficiently.
- Monitor and enhance system performance, reliability, and scalability.
- Stay up to date with emerging data technologies and recommend improvements to data architecture and engineering practices.

What You Will Need
- 5+ years of experience in data engineering, ETL development, or a related field.
- Strong programming skills in Python.
- Proficiency in SQL and experience with both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB).
- Proven experience building data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, and Cloud Composer.
- Solid understanding of data modeling, data warehousing, and data governance principles.
- Capability to mentor junior data engineers and assist with technical challenges.
- Familiarity with orchestration tools such as Apache Airflow.
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Proficiency with version control systems (e.g., Git) and CI/CD pipelines.
- Excellent problem-solving and communication skills.
- Ability to work effectively in a fast-paced, agile environment.
- Experience with Snowflake, big data technologies (e.g., Hadoop, Spark, Kafka), and AWS is a plus.
- Skilled at converting business requirements into technical documentation.

Education and Experience
- Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field.
- Certified development training/program is a plus.
- 5+ years of hands-on experience building data pipelines using Python and GCP.
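The data-warehousing side of roles like this leans heavily on incremental loads: merging a batch of changed rows into a target table, updating matches and inserting the rest. A minimal Python sketch of that upsert pattern (the in-memory dict stands in for the real table, and the column names are made up; in BigQuery this is a `MERGE` statement):

```python
# Sketch of an incremental "merge" (upsert) into a warehouse table:
# rows in the batch update matching target rows by key, or are
# inserted when no match exists.

def merge(target, batch, key="id"):
    """Upsert each batch row into target (a dict keyed by `key`)."""
    for row in batch:
        # Merge new columns over any existing row with the same key.
        target[row[key]] = {**target.get(row[key], {}), **row}
    return target

table = {1: {"id": 1, "status": "new"}}
batch = [{"id": 1, "status": "shipped"}, {"id": 2, "status": "new"}]
merge(table, batch)
print(table[1]["status"], len(table))  # shipped 2
```

Keying the target by a stable business key is what makes the load idempotent: replaying the same batch leaves the table unchanged.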
Posted 2 weeks ago
3.0 - 4.0 years
4 - 6 Lacs
Chennai
Work from Office
Overview
We require a front-end UI Flutter developer.

Essential Skills and Responsibilities
- 3 to 4 years of professional experience in Flutter development.
- Proficiency in the Dart programming language.
- Solid understanding of state management (e.g., Provider, Bloc, Riverpod).
- Experience with API integration, especially RESTful services.
- Hands-on experience with version control systems like Git.
- Familiarity with CI/CD pipelines for mobile apps.
- Experience publishing apps on the Google Play Store and Apple App Store.
- Understanding of responsive UI/UX and mobile performance optimization.

Desired Skills
- Experience with native Android (Kotlin/Java) or iOS (Swift) development.
- Knowledge of Firebase (Auth, Firestore, Cloud Functions, etc.).
- Familiarity with Agile/Scrum development processes.
- Understanding of app security best practices.
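The state-management options named here (Provider, Bloc, Riverpod) share one idea: UI widgets subscribe to a stream of states, and events flowing in produce new states flowing out. The real implementation is in Dart; a language-agnostic sketch of the BLoC shape in Python (the counter example and event names are the standard teaching example, assumed here for illustration):

```python
# Language-agnostic sketch of the BLoC pattern: events in, states
# out, with listeners (the "widgets") notified on every new state.

class CounterBloc:
    def __init__(self):
        self.state = 0
        self._listeners = []

    def listen(self, fn):
        """Register a listener called with each new state."""
        self._listeners.append(fn)

    def add(self, event):
        """Map an incoming event to a new state, then notify."""
        if event == "increment":
            self.state += 1
        for fn in self._listeners:
            fn(self.state)

bloc = CounterBloc()
seen = []
bloc.listen(seen.append)  # a stand-in for a rebuilding widget
bloc.add("increment")
bloc.add("increment")
print(seen)  # [1, 2]
```

In Dart the listener list becomes a `Stream<State>` and the UI side becomes a `StreamBuilder` or `BlocBuilder`, but the event-to-state mapping is the same.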
Posted 3 weeks ago
2.0 - 6.0 years
6 - 12 Lacs
Noida, Bengaluru, Mumbai (All Areas)
Work from Office
About the Job Who we are and what do we do Innovation in every byte NPST is a fintech company that has been bridging the banking and fintech worlds with its product suite of technology and payments for over 10 years. We provide software and digital payment solutions to the BFSI industry as a technology service provider. We function as a Technology Service Provider (TSP) and a Third-Party Application Provider (TPAP), catering to stakeholders across the financial value chain, including banks, merchant aggregators, merchants, and consumers. We listed via an SME IPO in Aug 2021 on the NSE Emerge platform, with a market cap of 2,000 Cr (as of Mar '24), and became an NPCI-approved Merchant Payment Service Provider, acquiring merchants and facilitating payments. NPST has a marquee clientele comprising 10 banks and 30+ PAPGs and merchants. We believe technology drives generations, making lives simpler and more efficient, and we aim to change lives and build financially inclusive societies. What will you do We are augmenting our team and actively looking for a Flutter Developer who can add their skills and experience to strengthen our current team. Job responsibilities: Design and build scalable apps using Flutter and Dart. Knowledge of generating Flutter plugins to interact with native (iOS/Android) libraries. Experience working on Flutter apps with different architectures and architectural designs. Experience with the Android SDK and Kotlin. Use Flutter to build cross-platform mobile apps for Android, iOS and web, from making responsive UIs to querying data efficiently and managing state in an optimized manner. Build custom Flutter packages. Translate and build designs and wireframes into high-quality, responsive UI code. Write efficient queries for Core Data. Firebase: Should have experience with Cloud Firestore, Push Notifications, Cloud Functions and Analytics.
Create and implement APIs with Firebase Firestore. Should have basic knowledge of the core languages of native Android and iOS. Git: Manage and collaborate on different projects with the rest of the team. What are we looking for: Strong knowledge of and experience in Flutter. Deep knowledge and experience of iOS or Android development with Flutter. Experience with Firebase and with deployment cycles. Strong educational background, preferably in the fields of computer science or engineering. Proven working experience in the same. Strong technical background, with understanding of or hands-on experience in software development and web technologies. Experience in mobile application development. Good communication skills. Entrepreneurial skills; the ability to observe, innovate and own your work. Detail-oriented and organized, with strong time management skills. Influencing skills and the ability to create positive working relationships with team members at all levels. Excellent communication and interpersonal skills. A collaborative approach, working as a group to achieve organization goals with perfection. Education Qualification - Bachelor's degree. Experience – 3-5 years. Industry - IT/Software/BFSI/Banking/Fintech. Work arrangement – 5 days working from office. Location – Bengaluru/Mumbai/Noida. What do we offer: An organization where we strongly believe in one organization, one goal. A fun workplace which compels us to challenge ourselves and aim higher. A team that strongly believes in collaboration and celebrating success together. Benefits that resonate ‘We Care’. If this opportunity excites you, we invite you to apply and contribute to our success story. If your resume is shortlisted, you will hear back from us.
Posted 3 weeks ago
8.0 - 10.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 327246 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP & GKE Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title / Role: GCP & GKE Staff Engineer. Job Description: Primary Skill: Professional Cloud Security Engineer & Cloud Infrastructure - Google Cloud Platform. Related experience: 5+ years of experience in cloud security engineering and automation. Total experience: 8+ years. Must have the GCP Solution Architect and Professional Cloud Security Engineer certifications. Mandatory Skills - Technical Qualification/Knowledge: This role supports operational security, control configuration, and secure design practices for GCP workloads. Roles & Responsibilities: Implement GCP security controls: IAM, VPC security, VPNs, KMS, Cloud Armor, and secure networking. Manage GCP identity and access, including SSO, MFA, and federated IdP configurations. Monitor workloads using the Cloud Operations Suite and escalate anomalies. Conduct basic threat modelling, vulnerability scanning, and patching processes. Automate security audits and compliance controls using Terraform and Cloud Shell scripting. Assist architects in deploying and maintaining secure-by-default infrastructure. Support audit preparation, policy enforcement, and evidence gathering.
Collaborate with cross-functional teams to resolve security alerts. Expertise in assessing, designing and implementing GCP solutions, including aspects like compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Ansible, etc. Should have prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution. Should have prior experience using industry-leading or native discovery, assessment and migration tools. Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility. Good knowledge of GCP technologies and their associated components and variations: Anthos Application Platform; Compute Engine, Compute Engine Managed Instance Groups, Kubernetes; Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service; Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor; Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS; Cloud Billing, Cloud Console, Stackdriver; Cloud SQL, Cloud Spanner, Cloud Bigtable; Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP. Solid understanding of and experience in cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS, and SaaS. Design of clients' cloud environments with a focus mainly on GCP, demonstrating technical cloud architectural knowledge. Playing a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments.
Delivery of customer cloud strategies, aligned with customers' business objectives and with a focus on cloud migrations and DR strategies. Nurture cloud computing expertise internally and externally to drive cloud adoption. Should have a deep understanding of IaaS and PaaS services offered on cloud platforms and understand how to use them together to build complex solutions. Ensure that all cloud solutions follow security and compliance controls, including data sovereignty. Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer. Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform as a service (PaaS). Create solutions that support a DevOps approach for delivery and operations of services. Interact with and advise business representatives of the application regarding functional and non-functional requirements. Create proof-of-concepts to demonstrate the viability of solutions under consideration. Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications. Have a working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture. Identify and implement best practices, tools and standards. Provide consultative support to the DevOps team for production incidents. Drive and support system reliability, availability, scale, and performance activities. Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms. Knowledgeable about configuration management tools such as Chef, Puppet and Ansible.
Automation skills using CLI scripting in any language (Bash, Perl, Python, Ruby, etc.). Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost effectiveness using GCP offerings. Ability to identify and gather requirements to define an architectural solution which can be successfully built and operated on GCP. Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary. Capability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project. Understanding of the significance of the different monitoring metrics and their threshold values, and the ability to take necessary corrective measures based on those thresholds. Knowledge of automation to reduce the number of incidents, or repetitive incidents, is preferred. Good knowledge of cloud center operations, monitoring tools, and backup solutions. GKE: Set up monitoring and logging to troubleshoot a cluster or debug a containerized application. Manage Kubernetes objects using declarative and imperative paradigms for interacting with the Kubernetes API. Manage confidential settings data using Secrets. Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster. Configure networking for your cluster. Hands-on experience with Terraform; ability to write reusable Terraform modules. Hands-on Python and Unix shell scripting is required. Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, and a Docker registry. Experience with GCP services and writing Cloud Functions. Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise. Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus.
Experience using Docker within container orchestration platforms such as GKE. Knowledge of setting up Splunk. Knowledge of Spark on GKE. Process/Quality Knowledge: Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired. Knowledge of quality and security processes. Soft Skills: Excellent communication skills and the capability to work directly with global customers. Strong technical leadership skills to drive solutions. Focused on quality/cost/time of deliverables. Timely and accurate communication. Need to demonstrate ownership of technical issues and engage the right stakeholders for timely resolution. Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation. Good reporting skills. Willing to work in different time zones as per project requirements. Good attitude for working in a team and as an individual contributor, based on the project and situation. Focused, result-oriented and self-motivating. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at NTT DATA endeavors to make accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
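As a small, hedged taste of the Python-plus-GKE automation this role describes, the snippet below parses the JSON that `kubectl get pods -A -o json` emits and flags pods outside a healthy phase. The field paths follow the Kubernetes Pod API; the cluster data in the usage test is invented:

```python
import json


def unhealthy_pods(kubectl_json):
    """Return (namespace, name, phase) for pods not in the Running or
    Succeeded phase, given the output of `kubectl get pods -A -o json`."""
    doc = json.loads(kubectl_json)
    bad = []
    for pod in doc.get("items", []):
        phase = pod.get("status", {}).get("phase", "Unknown")
        if phase not in ("Running", "Succeeded"):
            meta = pod["metadata"]
            bad.append((meta["namespace"], meta["name"], phase))
    return bad
```

A cron job or CI step can feed this function from `subprocess` and page on a non-empty result; the escalation wiring is left out here.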
Posted 4 weeks ago
5.0 - 7.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Req ID: 326830 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Security Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title / Role: GCP & GKE Staff Engineer. Job Description: Primary Skill: Professional Cloud Security Engineer & Cloud Infrastructure - Google Cloud Platform. Related experience: 5+ years of experience in cloud security engineering and automation. Total experience: 8+ years. Must have the GCP Solution Architect and Professional Cloud Security Engineer certifications. Mandatory Skills - Technical Qualification/Knowledge: This role supports operational security, control configuration, and secure design practices for GCP workloads. Roles & Responsibilities: Implement GCP security controls: IAM, VPC security, VPNs, KMS, Cloud Armor, and secure networking. Manage GCP identity and access, including SSO, MFA, and federated IdP configurations. Monitor workloads using the Cloud Operations Suite and escalate anomalies. Conduct basic threat modelling, vulnerability scanning, and patching processes. Automate security audits and compliance controls using Terraform and Cloud Shell scripting. Assist architects in deploying and maintaining secure-by-default infrastructure. Support audit preparation, policy enforcement, and evidence gathering.
Collaborate with cross-functional teams to resolve security alerts. Expertise in assessing, designing and implementing GCP solutions, including aspects like compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Ansible, etc. Should have prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution. Should have prior experience using industry-leading or native discovery, assessment and migration tools. Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility. Good knowledge of GCP technologies and their associated components and variations: Anthos Application Platform; Compute Engine, Compute Engine Managed Instance Groups, Kubernetes; Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service; Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor; Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS; Cloud Billing, Cloud Console, Stackdriver; Cloud SQL, Cloud Spanner, Cloud Bigtable; Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP. Solid understanding of and experience in cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS, and SaaS. Design of clients' cloud environments with a focus mainly on GCP, demonstrating technical cloud architectural knowledge. Playing a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments.
Delivery of customer cloud strategies, aligned with customers' business objectives and with a focus on cloud migrations and DR strategies. Nurture cloud computing expertise internally and externally to drive cloud adoption. Should have a deep understanding of IaaS and PaaS services offered on cloud platforms and understand how to use them together to build complex solutions. Ensure that all cloud solutions follow security and compliance controls, including data sovereignty. Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer. Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform as a service (PaaS). Create solutions that support a DevOps approach for delivery and operations of services. Interact with and advise business representatives of the application regarding functional and non-functional requirements. Create proof-of-concepts to demonstrate the viability of solutions under consideration. Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications. Have a working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture. Identify and implement best practices, tools and standards. Provide consultative support to the DevOps team for production incidents. Drive and support system reliability, availability, scale, and performance activities. Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms. Knowledgeable about configuration management tools such as Chef, Puppet and Ansible.
Automation skills using CLI scripting in any language (Bash, Perl, Python, Ruby, etc.). Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost effectiveness using GCP offerings. Ability to identify and gather requirements to define an architectural solution which can be successfully built and operated on GCP. Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary. Capability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project. Understanding of the significance of the different monitoring metrics and their threshold values, and the ability to take necessary corrective measures based on those thresholds. Knowledge of automation to reduce the number of incidents, or repetitive incidents, is preferred. Good knowledge of cloud center operations, monitoring tools, and backup solutions. GKE: Set up monitoring and logging to troubleshoot a cluster or debug a containerized application. Manage Kubernetes objects using declarative and imperative paradigms for interacting with the Kubernetes API. Manage confidential settings data using Secrets. Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster. Configure networking for your cluster. Hands-on experience with Terraform; ability to write reusable Terraform modules. Hands-on Python and Unix shell scripting is required. Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, and a Docker registry. Experience with GCP services and writing Cloud Functions. Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise.
Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus. Experience using Docker within container orchestration platforms such as GKE. Knowledge of setting up Splunk. Knowledge of Spark on GKE. Process/Quality Knowledge: Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired. Knowledge of quality and security processes. Soft Skills: Excellent communication skills and the capability to work directly with global customers. Strong technical leadership skills to drive solutions. Focused on quality/cost/time of deliverables. Timely and accurate communication. Need to demonstrate ownership of technical issues and engage the right stakeholders for timely resolution. Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation. Good reporting skills. Willing to work in different time zones as per project requirements. Good attitude for working in a team and as an individual contributor, based on the project and situation. Focused, result-oriented and self-motivating. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
Visit us at NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
Posted 4 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Chennai, Tamil Nadu
Work from Office
Duration: 12 Months Work Type: Onsite Position Description: We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutions and the operationalization of data warehouses, data lakes and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform. Skills Required: Experience working in an implementation team from concept to operations, providing deep technical subject-matter expertise for successful deployment. Implement methods for automation of all parts of the pipeline to minimize labor in development and production. Experience analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products. Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting. Experience working with all stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured, in collaboration with product management. Proficient in machine learning model architecture, data pipeline interaction and metrics interpretation.
This includes designing and deploying a pipeline with automated data lineage. Identify, develop, evaluate and summarize proofs of concept to prove out solutions. Test and compare competing solutions and report a point of view on the best solution. Integration between GCP Data Catalog and Informatica EDC. Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. Skills Preferred: Strong drive for results and ability to multi-task and work independently. Self-starter with proven innovation skills. Ability to communicate and work with cross-functional teams and all levels of management. Demonstrated commitment to quality and project timing. Demonstrated ability to document complex systems. Experience in creating and executing detailed test plans. Experience Required: 3 to 5 years. Education Required: BE or equivalent.
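For flavour, the per-record transform at the heart of a pipeline pattern like this can be written and unit-tested as ordinary Python before being wrapped in a Dataflow/Beam step that reads from Pub/Sub and writes to BigQuery. The message schema below is a made-up example, not the client's:

```python
import json
from datetime import datetime, timezone


def parse_event(raw_bytes):
    """Decode one Pub/Sub-style message into a flat dict ready for BigQuery.
    Returns None for malformed payloads so the pipeline can count and route
    them to a dead-letter destination instead of failing the job."""
    try:
        msg = json.loads(raw_bytes.decode("utf-8"))
        return {
            "user_id": str(msg["user_id"]),
            "action": msg.get("action", "unknown"),
            # normalize epoch seconds to an ISO-8601 UTC timestamp string
            "event_ts": datetime.fromtimestamp(
                msg["ts"], tz=timezone.utc
            ).isoformat(),
        }
    except (ValueError, KeyError, UnicodeDecodeError):
        return None
```

Keeping the transform free of Beam imports makes it trivial to cover with plain unit tests, which supports the detailed test plans the posting asks for.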
Posted 1 month ago
1.0 - 6.0 years
2 - 7 Lacs
Gurugram
Work from Office
We are looking for a Senior Flutter & Firebase Developer with expertise in full-stack mobile development. The ideal candidate should be skilled in building scalable, secure applications while leading technical decisions and ensuring UI excellence. #PrimarySkills (Must-Have) Flutter/Dart Strong expertise in the Flutter framework and Dart programming. State Management Proficiency in BLoC/Redux for managing app state. Firebase Services Experience with Firestore, Authentication, Cloud Functions, and Realtime Database. Cloud Architecture Ability to design scalable and secure backend solutions. Security Implementation Knowledge of app security best practices. UI/UX Mastery Ability to craft pixel-perfect and highly responsive UI. Technical Leadership Experience leading teams and making architectural decisions. #SecondarySkills (Good-to-Have) RESTful APIs & GraphQL Backend Development (Node.js/Python/Go) CI/CD Pipelines & DevOps Docker & Kubernetes Performance Optimization & Debugging Unit & Integration Testing #Role & Responsibilities Develop and maintain high-performance mobile applications using Flutter. Design, architect, and implement scalable Firebase-based solutions. Ensure security best practices and optimize app performance. Lead technical discussions and mentor junior developers. Collaborate with cross-functional teams for seamless development.
Posted 1 month ago
10 - 15 years
25 - 40 Lacs
Pune
Work from Office
Introduction: We are seeking a highly skilled and experienced Google Cloud Platform (GCP) Solution Architect. As a Solution Architect, you will play a pivotal role in designing and implementing cloud-based solutions for our team using GCP. The ideal candidate will have a deep understanding of cloud architecture, a proven track record of delivering cloud-based solutions, and experience with GCP technologies. You will work closely with technical teams and clients to ensure the successful deployment and optimization of cloud solutions. Responsibilities: Lead the design and architecture of GCP-based solutions, ensuring scalability, security, performance, and cost-efficiency. Collaborate with business stakeholders, engineering teams, and clients to understand technical requirements and translate them into cloud-based solutions. Provide thought leadership and strategic guidance on cloud technologies, best practices, and industry trends. Design and implement cloud-native applications, data platforms, and microservices on GCP. Ensure cloud solutions are aligned with clients' business goals and requirements, with a focus on automation and optimization. Conduct cloud assessments, identifying areas for improvement, migration strategies, and cost-saving opportunities. Oversee and manage the implementation of GCP solutions, ensuring seamless deployment and operational success. Create detailed documentation of cloud architecture, deployment processes, and operational guidelines. Engage in pre-sales activities, including solution design, proofs of concept (PoCs), and presenting GCP solutions to clients. Ensure compliance with security and regulatory requirements in the cloud environment. Requirements: 2+ years of experience as a Cloud Architect or in a similar role, with strong expertise in Google Cloud Platform. In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud Functions, and networking.
Experience with infrastructure-as-code tools such as Terraform. Strong understanding of cloud security, identity management, and compliance frameworks (e.g., GDPR, HIPAA). Hands-on experience with GCP networking, IAM, and logging/monitoring tools (Cloud Monitoring, Cloud Logging). Strong experience in designing and deploying highly available, fault-tolerant, and scalable solutions. Proficiency in programming languages like Java and Golang. Experience with containerization and orchestration technologies such as Docker, Kubernetes, and GKE (Google Kubernetes Engine). Experience in cloud cost management and optimization using GCP tools. Thanks, Pratap
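Compliance and cost reviews of the kind described above are commonly automated. Below is a minimal, hedged sketch of a label-policy check over exported resource metadata; the required labels and the resource records in the test are hypothetical examples, not a GCP mandate:

```python
# Example governance policy: every resource must carry these labels.
# The label set is an illustrative assumption, not a GCP requirement.
REQUIRED_LABELS = {"owner", "cost-center", "env"}


def label_violations(resources, required=REQUIRED_LABELS):
    """Given resource metadata dicts (e.g. from a Cloud Asset Inventory
    export), report which resources are missing which required labels."""
    report = {}
    for res in resources:
        missing = sorted(required - set(res.get("labels", {})))
        if missing:
            report[res["name"]] = missing
    return report
```

A check like this can run in CI or on a schedule, feeding cost-attribution dashboards and flagging unlabeled resources before they become untraceable spend.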
Posted 1 month ago
2 - 5 years
10 - 20 Lacs
Kolkata
Work from Office
Job Summary:- We are looking for an experienced and hands-on Flutter Developer to join our agile product engineering team. The ideal candidate will have a proven track record of building and shipping cross-platform mobile applications using Flutter for both Android and iOS. This is an exciting opportunity to work on a greenfield project with a focus on speed, quality, and user experience. Responsibilities:- Design and build high-quality mobile applications using Flutter and Dart. Implement intuitive, responsive UI components and ensure seamless UX across devices. Integrate with backend services via REST APIs and manage asynchronous data. Own the full development cycle: design, build, test, deploy, and maintain apps in production. Collaborate closely with designers, product owners, and backend engineers to deliver features. Troubleshoot and resolve performance, compatibility, and runtime issues. Participate in code reviews and contribute to continuous improvement practices. Requirements:- 3+ years of experience in mobile application development, with at least 2 years using Flutter. Strong understanding of mobile architecture and state management (e.g., BLoC, Provider, Riverpod). Experience in publishing and maintaining apps on both Google Play Store and Apple App Store. Solid command of Git for version control. Ability to work in a fast-paced, iterative development environment with minimal oversight. Strong debugging and problem-solving skills. Preferred Qualifications:- Experience working with Firebase (Auth, Firestore, Cloud Functions). Knowledge of native Android/iOS development and bridging techniques in Flutter. Familiarity with CI/CD tools and practices for mobile app delivery. Exposure to Agile/Scrum methodologies.
Posted 1 month ago
5 - 9 years
4 - 8 Lacs
Kolkata
Work from Office
We are looking for an experienced and motivated DevOps Engineer with 5 to 7 years of hands-on experience designing, implementing, and managing cloud infrastructure, particularly on Google Cloud Platform (GCP). The ideal candidate will have deep expertise in infrastructure as code (IaC), CI/CD pipelines, container orchestration, and cloud-native technologies. This role requires strong analytical skills, attention to detail, and a passion for optimizing cloud infrastructure performance and cost.
Key Responsibilities
Design, implement, and maintain scalable, reliable, and secure cloud infrastructure using Google Cloud Platform (GCP) services, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Functions, Cloud Pub/Sub, BigQuery, and Cloud Storage.
Build and manage CI/CD pipelines using GitHub, artifact repositories, and version control systems; enforce GitOps practices across environments.
Leverage Docker, Kubernetes, and serverless architectures to support microservices and modern application deployments.
Develop and manage Infrastructure as Code (IaC) using Terraform to automate environment provisioning.
Implement observability tools such as Prometheus, Grafana, and Google Cloud Monitoring for real-time system insights.
Ensure best practices in cloud security, including IAM policies, encryption standards, and network security.
Integrate and manage service mesh architectures such as Istio or Linkerd for secure and observable microservices communication.
Troubleshoot and resolve infrastructure issues, ensuring high availability, disaster recovery, and performance optimization.
Drive initiatives for cloud cost management and suggest optimization strategies for resource efficiency.
Document technical architectures, processes, and procedures; ensure smooth knowledge transfer and operational readiness.
Collaborate with cross-functional teams, including Development, QA, Security, and Architecture, to streamline deployment workflows.
Preferred candidate profile
5+ years of DevOps/cloud engineering experience, with at least 3 years on GCP.
Proficiency in Terraform, Docker, Kubernetes, and other DevOps toolchains.
Strong experience with CI/CD tools, GitHub/GitLab, and artifact repositories.
Deep understanding of cloud networking, VPCs, load balancing, firewalls, and VPNs.
Expertise in monitoring and logging frameworks such as Prometheus, Grafana, and Stackdriver (Cloud Monitoring).
Strong scripting skills in Python, Bash, or Go for automation tasks.
Knowledge of data backup, high-availability systems, and disaster recovery strategies.
Familiarity with service mesh technologies and microservices-based architecture.
Excellent analytical, troubleshooting, and documentation skills.
Effective communication and ability to work in a fast-paced, collaborative environment.
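The "strong scripting skills in Python ... for automation tasks" requirement often comes down to small operational utilities. As one hedged illustration, here is a retry-with-exponential-backoff helper of the kind used when probing flaky infrastructure endpoints; the function name, parameters, and the injectable sleep argument (included so the sketch can be exercised without real waiting) are all hypothetical, not part of the job description.

```python
import time


def retry(fn, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call fn until it succeeds, backing off exponentially between tries.

    Delays are base_delay * 2**attempt; the final failure is re-raised.
    The sleep parameter is injectable so tests can record delays instead
    of actually waiting.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

A production version would typically narrow the caught exception type and add jitter to the delays to avoid synchronized retries across instances.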
Posted 1 month ago