6.0 - 10.0 years
6 - 15 Lacs
Gurugram
Work from Office
Requirements Elicitation, Understanding, Analysis, & Management: Understand the project's vision and requirements, and contribute to the creation of the supplemental requirements, building the low-level technical specifications for a particular platform and/or service solution.
Project Planning, Tracking, & Reporting: Estimate the tasks and resources required to design, create (build), and test the code for assigned module(s). Provide inputs in creating the detailed schedule for the project. Support the team in project planning activities, in evaluating risks, and in shuffling priorities based on unresolved issues. During development and testing, ensure that assigned parts of the project/modules are on track with respect to schedules and quality. Note scope changes within the assigned modules and work with the team to shuffle priorities accordingly. Communicate regularly with the team about development changes, scheduling, and status. Participate in project review meetings. Track and report progress for assigned modules.
Design: Create a detailed (LLD) design for the assigned piece(s) with possible alternate solutions. Ensure that the LLD design meets business requirements. Submit the LLD design for review. Revise the detailed (LLD) design for the assigned piece(s) based on the comments received from the team.
Development & Support: Build the code of high-priority and complex systems according to the functional specifications, detailed design, maintainability, and coding and efficiency standards. Use code management processes and tools to avoid versioning problems. Ensure that the code does not affect the functioning of any external or internal systems. Perform peer reviews of code to ensure it meets coding and efficiency standards. Act as the primary reviewer of application code created by software engineers to ensure compliance with defined standards. Recommend changes to the code as required.
Testing & Debugging: Attend the Test Design walkthroughs to help verify that the plans and conditions will test all functions and features effectively. Perform impact analysis for issues assigned to self and software engineers. Actively assist with project- and code-level problem solving, such as suggesting paths to explore when testing engineers or software engineers encounter a debugging problem, and escalate urgent issues.
Documentation: Review technical documentation for the code for accuracy, completeness, and usability. Document and maintain the reviews conducted and the unit test results.
Process Management: Adhere to the project and support processes. Adhere to best practices and comply with approved policies, procedures, and methodologies, such as the SDLC cycle for different project sizes. Show responsibility for corporate funds, materials, and resources. Ensure adherence to SDLC and audit requirements.
Position Summary: As a Lead Collaboration Engineer at Guardian Life Insurance, you will be responsible for designing, building, testing, deploying, and supporting Microsoft 365 collaboration capabilities for 16,000 users globally.
You are: An excellent problem solver. A strong collaborator with team members and other teams. A strong communicator, documenter, and presenter. Strong in project ownership and execution, ensuring timely, quality delivery.
A continuous self-learner and subject matter expert for Microsoft 365.
You have: A Bachelor's degree in Computer Science, Information Technology, or significant relevant experience. 5+ years of experience, preferably in a large financial services enterprise. Expert-level experience with Microsoft 365: administration, Outlook/Exchange Online/Exchange Server, Teams, SharePoint Online/OneDrive, Power Automate, Viva Engage (Yammer), Stream, PowerShell scripting, advanced troubleshooting diagnostics, Copilot, Word, Excel, PowerPoint, OneNote, Visio, Project, Whiteboard, To Do, Planner, Lists, Viva Insights, Power Apps, Loop, Azure. Intermediate-level experience with Proofpoint E-mail Protection or a similar e-mail security service – administration, routing, allow/block lists, encryption, DLP, Send Securely, Secure Portal, SPF/DKIM/DMARC, delivery troubleshooting, incident response. Knowledge of other complementary collaboration applications is desired: Zoom, BitTitan MigrationWiz, or ShareGate. Strong knowledge of IT Service Management and ITIL, preferably using ServiceNow – incidents, tasks, problems, knowledge, CMDB, reporting, dashboards. Proven ability to manage support and request tickets within SLAs, and to drive Microsoft support cases to closure. Knowledge of project management using waterfall and agile frameworks. Proven ability to complete projects reliably and with quality. Knowledge of networking and security – DNS, Active Directory, Entra ID (Azure AD) including conditional access policies, certificates, firewalls, proxies, cloud access security brokers (CASB), single sign-on (SSO), multi-factor authentication (MFA), data loss prevention (DLP), and identity and access management (IAM). Knowledge of endpoints, servers, and cloud – devices, operating systems, browsers, Intune, System Center, Nexthink, Amazon AWS, Azure. Microsoft certifications are desired, preferably MS-900, MS-700, MS-721, MS-102.
You will: Deliver excellent support for Collaboration capabilities to achieve service level agreements. Participation in the team on-call support rotation is required. Design, build, test, and deploy new Collaboration capabilities to achieve strategic goals and key deliverables reliably and with quality. Current goals are focused on Copilot and service improvements.
Reporting Relationships: As our Collaboration Engineer, you will report administratively to our Delivery Manager/Head of IT, who reports to our Head of Infrastructure IT, and functionally to the Head of Collaboration Technology.
Location: This position can be based in any of the following locations: Gurgaon. For internal use only: R000106866
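The administration and troubleshooting duties above are usually backed by small automation scripts. The posting calls for PowerShell, but the same kind of tenant-wide check can be scripted against the Microsoft Graph REST API; the sketch below is illustrative only, and the tenant ID, app registration, and filter used are assumptions, not details from the posting.

```python
# Minimal sketch (hypothetical tenant/app values): list Teams-enabled groups via
# Microsoft Graph, e.g. as input to a lifecycle or cleanup report.
import requests
import msal

TENANT_ID = "<tenant-id>"        # placeholder values from your own app registration
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
assert "access_token" in token, token.get("error_description")

resp = requests.get(
    "https://graph.microsoft.com/v1.0/groups",
    params={
        "$filter": "resourceProvisioningOptions/Any(x:x eq 'Team')",
        "$select": "id,displayName",
    },
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
for group in resp.json().get("value", []):
    print(group["id"], group["displayName"])
```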
Posted 1 week ago
12.0 - 15.0 years
55 - 60 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Dear Candidate, We are hiring a Data Platform Engineer to build and maintain scalable, secure, and reliable data infrastructure for analytics and real-time processing. Key Responsibilities: Design and manage data pipelines, storage layers, and ingestion frameworks. Build platforms for batch and streaming data processing (Spark, Kafka, Flink). Optimize data systems for scalability, fault tolerance, and performance. Collaborate with data engineers, analysts, and DevOps to enable data access. Enforce data governance, access controls, and compliance standards. Required Skills & Qualifications: Proficiency with distributed data systems (Hadoop, Spark, Kafka, Airflow). Strong SQL and experience with cloud data platforms (Snowflake, BigQuery, Redshift). Knowledge of data warehousing, lakehouse, and ETL/ELT pipelines. Experience with infrastructure as code and automation. Familiarity with data quality, security, and metadata management. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Srinivasa Reddy Kandi Delivery Manager Integra Technologies
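To make the batch/streaming platform work above concrete, here is a minimal sketch of a Spark Structured Streaming job that ingests events from Kafka and lands them in a partitioned data lake path. The broker, topic, schema, and storage paths are placeholders, not details from the posting.

```python
# Minimal sketch (assumed names/paths): Kafka -> Spark Structured Streaming -> data lake.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, to_date
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
       .option("subscribe", "events")                         # placeholder topic
       .option("startingOffsets", "latest")
       .load())

events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*")
          .withColumn("dt", to_date(col("event_time"))))

query = (events.writeStream.format("parquet")
         .option("path", "s3a://data-lake/bronze/events/")        # placeholder path
         .option("checkpointLocation", "s3a://data-lake/_chk/events/")
         .partitionBy("dt")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```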
Posted 2 weeks ago
8.0 - 13.0 years
25 - 40 Lacs
Chennai
Work from Office
Architect & Build Scalable Systems: Design and implement petabyte-scale lakehouse architectures to unify data lakes and warehouses.
Real-Time Data Engineering: Develop and optimize streaming pipelines using Kafka, Pulsar, and Flink.
Required Candidate Profile: Data engineering experience with large-scale systems. Expert proficiency in Java for data-intensive applications. Hands-on experience with lakehouse architectures, stream processing, and event streaming.
Posted 2 weeks ago
9.0 - 14.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Job Description:
SQL & Database Management: Deep knowledge of relational databases (PostgreSQL), cloud-hosted data platforms (AWS, Azure, GCP), and data warehouses like Snowflake.
ETL/ELT Tools: Extensive experience with SnapLogic, StreamSets, or dbt for building and maintaining data pipelines.
Data Modeling & Optimization: Strong understanding of data modeling, OLAP systems, query optimization, and performance tuning.
Cloud & Security: Familiarity with cloud platforms and SQL security techniques (e.g., data encryption, TDE).
Data Warehousing: Experience managing large datasets, data marts, and optimizing databases for performance.
Agile & CI/CD: Knowledge of Agile methodologies and CI/CD automation tools.
Role & responsibilities: Build the data pipeline for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud database technologies. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data needs. Work with data and analytics experts to strive for greater functionality in our data systems. Assemble large, complex data sets that meet functional/non-functional business requirements.
– Ability to quickly analyze existing SQL code and make improvements to enhance performance, take advantage of new SQL features, close security gaps, and increase robustness and maintainability of the code.
– Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery for greater scalability, etc.
– Unit test databases and perform bug fixes.
– Develop best practices for database design and development activities.
– Take on technical leadership responsibilities for database projects across various scrum teams.
– Manage exploratory data analysis to support dashboard development (desirable).
Required Skills:
– Strong experience in SQL with expertise in relational databases (PostgreSQL preferred, cloud-hosted in AWS/Azure/GCP) or any cloud-based data warehouse (like Snowflake, Azure Synapse).
– Competence in data preparation and/or ETL/ELT tools like SnapLogic, StreamSets, dbt, etc. (preferably strong working experience in one or more) to build and maintain complex data pipelines and flows that handle large volumes of data.
– Understanding of data modeling techniques and working knowledge of OLAP systems.
– Deep knowledge of databases, data marts, data warehouse enterprise systems, and handling of large datasets.
– In-depth knowledge of ingestion techniques, data cleaning, de-duplication, etc.
– Ability to fine-tune report-generating queries.
– Solid understanding of normalization and denormalization of data, database exception handling, profiling queries, performance counters, debugging, and database and query optimization techniques.
– Understanding of index design and performance-tuning techniques.
– Familiarity with SQL security techniques such as data encryption at the column level, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
– Experience in understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting (desirable).
– Adhere to standards for all databases, e.g., data models, data architecture, and naming conventions.
– Exposure to source control such as Git and Azure DevOps.
– Understanding of Agile methodologies (Scrum, Kanban).
– Experience with NoSQL databases to migrate data into other types of databases with real-time replication (desirable).
– Experience with CI/CD automation tools (desirable).
– Programming experience in Golang, Python, or another programming language, and visualization tools (Power BI/Tableau) (desirable).
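As an illustration of the query-tuning side of this role, the following is a minimal sketch that captures a PostgreSQL execution plan from Python; the query, table names, and credentials are hypothetical, not from the posting.

```python
# Minimal sketch (illustrative table/query names): capture an execution plan with
# psycopg2 as the starting point for query tuning and index design.
import psycopg2

QUERY = """
SELECT c.customer_id, SUM(o.amount)
FROM orders o JOIN customers c ON c.customer_id = o.customer_id
WHERE o.order_date >= %s
GROUP BY c.customer_id;
"""

conn = psycopg2.connect(host="localhost", dbname="analytics",
                        user="report", password="secret")  # placeholder credentials
try:
    with conn.cursor() as cur:
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY, ("2024-01-01",))
        for (line,) in cur.fetchall():
            # Look for sequential scans on large tables, misestimated row counts, etc.
            print(line)
finally:
    conn.close()
```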
Posted 2 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Who We Are: At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role: Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.
This is an AWS Data/API Gateway Pipeline Engineer role responsible for designing, building, and maintaining real-time, serverless data pipelines and API services. It requires extensive hands-on experience with Java, Python, Redis, DynamoDB Streams, and PostgreSQL, along with working knowledge of AWS Lambda and AWS Glue for data processing and orchestration. The position involves collaboration with architects, backend developers, and DevOps engineers to deliver scalable, event-driven data solutions and secure API services across cloud-native systems.
Key Responsibilities
API & Backend Engineering: Build and deploy RESTful APIs using AWS API Gateway and Lambda with Java and Python. Integrate backend APIs with Redis for low-latency caching and pub/sub messaging. Use PostgreSQL for structured data storage and transactional processing. Secure APIs using IAM, OAuth2, and JWT, and implement throttling and versioning strategies.
Data Pipeline & Streaming: Design and develop event-driven data pipelines using DynamoDB Streams to trigger downstream processing (a minimal sketch appears after this listing). Use AWS Glue to orchestrate ETL jobs for batch and semi-structured data workflows. Build and maintain Lambda functions to process real-time events and orchestrate data flows. Ensure data consistency and resilience across services, queues, and databases.
Cloud Infrastructure & DevOps: Deploy and manage cloud infrastructure using CloudFormation, Terraform, or AWS CDK. Monitor system health and service metrics using CloudWatch, SNS, and structured logging. Contribute to CI/CD pipeline development for testing and deploying Lambda/API services.
So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl: Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are: You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience: Bachelor's degree in Computer Science, Engineering, or a related field. Over 6 years of experience in developing backend or data pipeline services using Java and Python.
Strong hands-on experience with: AWS API Gateway, Lambda, and DynamoDB Streams; Redis (caching, messaging); PostgreSQL (schema design, tuning, SQL); AWS Glue for ETL jobs and data transformation. Solid understanding of REST API design principles, serverless computing, and real-time architecture.
Preferred Skills and Experience: Familiarity with Kafka, Kinesis, or other message streaming systems. Swagger/OpenAPI for API documentation. Docker and Kubernetes (EKS). Git and CI/CD tools (e.g., GitHub Actions). Experience with asynchronous event processing, retries, and dead-letter queues (DLQs). Exposure to data lake architectures (S3, Glue Data Catalog, Athena).
Being You: Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect: With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred! If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
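As a concrete illustration of the event-driven pattern this listing describes (DynamoDB Streams triggering Lambda, with Redis as a low-latency cache), here is a minimal sketch of a handler. The key names, environment variables, and TTL are assumptions for illustration only.

```python
# Minimal sketch (assumed event shape and resource names): a Lambda handler triggered
# by DynamoDB Streams that mirrors the latest record state into Redis.
import json
import os
import redis

# Created outside the handler so warm Lambda invocations reuse the connection.
cache = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"), port=6379)

def handler(event, context):
    for record in event.get("Records", []):
        if record["eventName"] in ("INSERT", "MODIFY"):
            # DynamoDB JSON: each attribute is wrapped in a type descriptor ({"S": ...}).
            # "pk" is a placeholder partition-key name; NewImage requires a stream view
            # type that includes new images.
            new_image = record["dynamodb"]["NewImage"]
            item_id = new_image["pk"]["S"]
            cache.set(f"item:{item_id}", json.dumps(new_image), ex=3600)
        elif record["eventName"] == "REMOVE":
            old_image = record["dynamodb"]["OldImage"]
            cache.delete(f"item:{old_image['pk']['S']}")
    return {"processed": len(event.get("Records", []))}
```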
Posted 2 weeks ago
5.0 - 7.0 years
35 - 55 Lacs
Bengaluru
Work from Office
Serko is a cutting-edge tech platform in global business travel & expense technology. When you join Serko, you become part of a team of passionate travellers and technologists bringing people together, using the world's leading business travel marketplace. We are proud to be an equal opportunity employer; we embrace the richness of diversity, showing up authentically to create a positive impact. There's an exciting road ahead of us, where travel needs real, impactful change. With offices in New Zealand, Australia, North America, and China, we are thrilled to be expanding our global footprint, landing our new hub in Bengaluru, India. With a rapid growth plan in place for India, we're hiring people from different backgrounds, experiences, abilities, and perspectives to help us build a world-class team and product.
As a Senior Principal Engineer, you'll play a key role in shaping our technical vision and driving engineering excellence across our product streams. Your leadership will foster a high-performance culture that empowers teams to build innovative solutions with real-world impact.
Requirements: Working closely with stream leadership – including the Head of Engineering, Senior Engineering Managers, Architects, and domain specialists – you'll provide hands-on technical guidance and help solve complex engineering challenges. As a Senior Principal Engineer, you'll also lead targeted projects and prototypes, shaping new technical approaches and ensuring our practices stay ahead of the curve.
What you'll do: Champion best practices across engineering teams, embedding them deeply within the stream. Proactively resolve coordination challenges within and across streams to keep teams aligned and unblocked. Partner with Product Managers to ensure customer value is delivered in the most pragmatic and impactful way. Lead or contribute to focused technical projects that solve high-priority problems. Collaborate with cross-functional teams to define clear requirements, objectives, and timelines for key initiatives. Explore innovative solutions through research and analysis, bringing fresh thinking to technical challenges. Mentor engineers and share technical expertise to uplift team capability and growth. Continuously evaluate and enhance system performance, reliability, and scalability. Stay ahead of the curve by tracking industry trends, emerging technologies, and evolving best practices. Drive continuous improvement across products and processes to boost quality, efficiency, and customer satisfaction. Maintain strong communication with stakeholders to gather insights, provide updates, and incorporate feedback.
What you'll bring to the team: Strong proficiency in stream-specific technologies, tools, and programming languages. Demonstrated expertise in specific areas of specialization related to the stream. Excellent problem-solving skills and attention to detail. Ability to lead teams through complex changes to engineering-related areas, and to maintain alignment across Product and Technology teams. Effective communication and interpersonal skills. Proven ability to work independently and collaboratively in a fast-paced environment. Tertiary-level qualification in a relevant Engineering discipline or equivalent.
Benefits: At Serko we aim to create a place where people can come and do their best work. This means you'll be operating in an environment with great tools and support, enabling you to perform at the highest level of your abilities, producing high-quality work and delivering innovative and efficient results.
Our people are fully engaged, continuously improving, and encouraged to make an impact. Some of the benefits of working at Serko are: a competitive base pay, medical benefits, a discretionary incentive plan based on individual and company performance, a focus on development (access to a learning & development platform and the opportunity to own your career pathways), and a flexible work policy.
Apply: Hit the 'apply' button now, or explore more about what it's like to work at Serko and all our global opportunities at www.Serko.com.
Posted 3 weeks ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics. --- Key Responsibilities: * Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake. * Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools. * Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching. * Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives. * Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions. * Manage Snowflake environments including security (roles, users, privileges), performance tuning, and resource monitoring. * Integrate data from multiple sources including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data. * Ensure data quality, reliability, and governance through testing and validation strategies. * Document data flows, definitions, processes, and architecture. --- Required Skills and Qualifications: * 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems. * 2+ years of hands-on experience with Snowflake including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel. * Strong experience in SQL and performance tuning for complex queries and large datasets. * Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts. * Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.). * Experience with cloud platforms (AWS, Azure, or GCP), particularly using services like S3, Redshift, Lambda, Azure Data Factory, etc. * Familiarity with Python or Java or Scala for data manipulation and pipeline development. * Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps. * Knowledge of data governance, data quality, and data security best practices. * Bachelor's degree in Computer Science, Information Systems, or a related field. --- Preferred Qualifications: * Snowflake SnowPro Core Certification or Advanced Architect Certification. * Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake. * Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.). * Knowledge of Data Vault 2.0 or other advanced data modeling methodologies. * Experience with data cataloging and metadata management tools (e.g., Alation, Collibra). * Exposure to machine learning pipelines and data science workflows is a plus.
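For context on the ingestion work described above, here is a minimal sketch using the Snowflake Python connector to load staged files with COPY INTO; this is essentially the manual equivalent of what a Snowpipe definition automates. The stage, table, and connection settings are placeholders, not details from the posting.

```python
# Minimal sketch (placeholder stage/table/warehouse names): batch-loading Parquet
# files from an external stage into a Snowflake table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder connection settings
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.S3_ORDERS_STAGE
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    for row in cur.fetchall():
        print(row)   # one row per file: load status, rows parsed/loaded, errors
finally:
    conn.close()
```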
Posted 1 month ago
7.0 - 10.0 years
15 - 22 Lacs
Pune
Work from Office
As an experienced member of our Core Banking Base Development / Professional Services Group, you will be responsible for effective microservice development in Scala and delivery of our NextGen transformation / professional services projects/programs.
What You Will Do:
• Adhere to the processes followed for development in the program.
• Report status, and proactively identify issues to the Tech Lead and management team.
• Take personal ownership and accountability for delivering assigned tasks and deliverables within the established schedule.
• Facilitate a strong and supportive team environment that enables the team as well as individual team members to overcome any political, bureaucratic and/or resource barriers to participation.
• Recommend and implement solutions. Be totally hands-on and able to work independently.
What You Will Need to Have:
• 4 to 8 years of recent hands-on experience in Scala and the Akka Framework.
• Technical skill set required:
o Hands-on experience in Scala development, including the Akka Framework.
o Good understanding of Akka Streams.
o Test-driven development.
o Awareness of message brokers.
o Hands-on experience in the design and development of microservices.
o Good awareness of event-driven microservices architecture.
o gRPC protocol + Protocol Buffers.
o Hands-on experience with Docker containers.
o Hands-on experience with Kubernetes.
o Awareness of cloud-native applications.
o Jira, Confluence, Ansible, Terraform.
o Good knowledge of cloud platforms (preferably AWS) and their IaaS, PaaS, and SaaS solutions.
o Good knowledge of and hands-on experience with scripting languages like Batch and Bash; hands-on experience with Python would be a plus.
o Knowledge of integration and unit testing, and Behavior-Driven Development.
o Good problem-solving skills.
o Good communication skills.
What Would Be Great to Have:
• Experience integrating with third-party applications.
• Agile knowledge.
• Good understanding of configuration management.
• Financial industry and core banking integration experience.
Posted 1 month ago
6.0 - 11.0 years
4 - 9 Lacs
Bengaluru
Work from Office
SUMMARY
Job Role: Apache Kafka Admin. Experience: 6+ years. Location: Pune (preferred), Bangalore, Mumbai.
Must-Have: The candidate should have 6 years of relevant experience in Apache Kafka.
Job Description: We are seeking a highly skilled and experienced Senior Kafka Administrator to join our team. The ideal candidate will have 6-9 years of hands-on experience in managing and optimizing Apache Kafka environments. As a Senior Kafka Administrator, you will play a critical role in designing, implementing, and maintaining Kafka clusters to support our organization's real-time data streaming and event-driven architecture initiatives.
Responsibilities: Design, deploy, and manage Apache Kafka clusters, including installation, configuration, and optimization of Kafka brokers, topics, and partitions. Monitor Kafka cluster health, performance, and throughput metrics and implement proactive measures to ensure optimal performance and reliability. Troubleshoot and resolve issues related to Kafka message delivery, replication, and data consistency. Implement and manage Kafka security mechanisms, including SSL/TLS encryption, authentication, authorization, and ACLs. Configure and manage Kafka Connect connectors for integrating Kafka with various data sources and sinks. Collaborate with development teams to design and implement Kafka producers and consumers for building real-time data pipelines and streaming applications. Develop and maintain automation scripts and tools for Kafka cluster provisioning, deployment, and management. Implement backup, recovery, and disaster recovery strategies for Kafka clusters to ensure data durability and availability. Stay up to date with the latest Kafka features, best practices, and industry trends and provide recommendations for optimizing our Kafka infrastructure.
Requirements: 6-9 years of experience as a Kafka Administrator or in a similar role, with a proven track record of managing Apache Kafka clusters in production environments. In-depth knowledge of Kafka architecture, components, and concepts, including brokers, topics, partitions, replication, and consumer groups. Hands-on experience with Kafka administration tasks, such as cluster setup, configuration, performance tuning, and monitoring. Experience with Kafka ecosystem tools and technologies, such as Kafka Connect, Kafka Streams, and Confluent Platform. Proficiency in scripting languages such as Python, Bash, or Java. Strong understanding of distributed systems, networking, and Linux operating systems. Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues. Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
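The automation scripting this role calls for often amounts to small admin utilities. Below is a minimal sketch using kafka-python's admin client to create a topic with explicit partition, replication, and retention settings; the broker addresses and topic parameters are placeholders, not values from the posting.

```python
# Minimal sketch (placeholder brokers and topic settings): create a topic with
# explicit partitioning, replication, and config via kafka-python's admin client.
from kafka.admin import KafkaAdminClient, NewTopic
from kafka.errors import TopicAlreadyExistsError

admin = KafkaAdminClient(
    bootstrap_servers=["broker-1:9092", "broker-2:9092"],
    client_id="ops-automation",
)

topic = NewTopic(
    name="payments.events",
    num_partitions=12,
    replication_factor=3,
    topic_configs={"retention.ms": "604800000", "min.insync.replicas": "2"},
)

try:
    admin.create_topics([topic])
    print("created topic:", topic.name)
except TopicAlreadyExistsError:
    print("topic already exists:", topic.name)
finally:
    admin.close()
```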
Posted 1 month ago
5.0 - 10.0 years
14 - 19 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Role & responsibilities: Urgent hiring for a reputed MNC. Experience: 5+ years. Location: Pan India. Immediate joiners only.
Skills: Snowflake developer, PySpark, Python, API, CI/CD, cloud services, Azure, Azure DevOps. Please share profiles for Snowflake developers having strong PySpark experience.
Job Description: Strong hands-on experience in Snowflake development, including Streams, Tasks, and Time Travel. Deep understanding of Snowpark for Python and its application to data engineering workflows. Proficient in PySpark, Spark SQL, and distributed data processing. Experience with API development. Proficiency in cloud services (preferably Azure, but AWS/GCP also acceptable). Solid understanding of CI/CD practices and tools like Azure DevOps, GitHub Actions, GitLab, or Jenkins for Snowflake. Knowledge of Delta Lake, Data Lakehouse principles, and schema evolution is a plus.
Preferred candidate profile
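To illustrate the Snowpark for Python work this posting asks for, here is a minimal sketch of a DataFrame transformation that runs inside Snowflake rather than pulling data out. The connection parameters and table names are placeholders, not details from the posting.

```python
# Minimal sketch (placeholder credentials and table names): a Snowpark for Python
# transformation; the filter/aggregate is pushed down and executed in Snowflake.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "my_account",     # placeholder connection settings
    "user": "etl_user",
    "password": "...",
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS",
    "schema": "CURATED",
}
session = Session.builder.configs(connection_parameters).create()

orders = session.table("RAW.ORDERS")
daily_revenue = (orders
                 .filter(col("STATUS") == "COMPLETED")
                 .group_by(col("ORDER_DATE"))
                 .agg(sum_(col("AMOUNT")).alias("REVENUE")))

# Materialize the result as a curated table; Snowpark generates and runs the SQL.
daily_revenue.write.mode("overwrite").save_as_table("CURATED.DAILY_REVENUE")
session.close()
```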
Posted 1 month ago
5 - 10 years
20 - 30 Lacs
Bengaluru
Work from Office
Purpose of the Job: The person who joins us as a Lead Product Engineer will work in the capacity of an individual contributor. He/she will work closely with the Product Owner to deliver high-quality, responsive web applications. As part of the job, he/she is expected to prepare artefacts to document the design and code, conduct design and code reviews for work done by the team, and mentor the junior engineers in the team.
Key Tasks
• Develop original algorithms, logic, and code, and ensure that it withstands any test.
• Understand the difference between creating a product and working on a turnkey project, and write code accordingly.
• Demonstrate significant abstraction skills to convert requirements into usable product features.
• Create original algorithms and ideas to solve complex issues.
Educational Background
• Bachelor's degree or higher in Computer Science Engineering or related fields
• Agile certification
• Certifications in the financial domain
Preferred Experience
• 10 to 12 years of software development experience developing web applications using Java (1.8+)/J2EE technology
• Expertise in the object-oriented programming paradigm and standard frameworks like Spring MVC and Hibernate
• Hands-on experience in building UI applications using React or Angular, together with Spring and Hibernate
• Strong experience in database design and stored procedure development
• Proven experience in performance tuning of both online and batch applications
• Expertise in agile development methodologies and DevOps practices, including continuous integration, static code analysis, etc.
• Expertise in test-driven development and experience in rapid prototyping and testing with Minimum Viable Products
• Experience in product implementation
• Experience of working in financial industries and/or product development organizations building financial products
Key Skills and Competencies
• Teamwork
• Intellectual curiosity
• Financial business acumen
• Effective communication
Posted 1 month ago