7.0 - 9.0 years
14 - 20 Lacs
Pune
Work from Office
Strong proficiency in SQL (Structured Query Language): querying, manipulating, and optimizing data. Experience in ETL development with Informatica, data warehousing, ADF, GCP, and Databricks. Extensive experience with popular ETL tools.
Required Candidate profile: 2+ years of experience in Informatica and complex SQL queries. SQL: Oracle, MS SQL, Teradata, Netezza • ETL: Informatica PowerCenter (must) • Cloud: ADF, Databricks, GCP, or Google Dataproc.
Posted 3 weeks ago
8.0 - 13.0 years
20 - 30 Lacs
Noida, Hyderabad, Bengaluru
Hybrid
Position: MERN Stack / React Developer
Relevant Experience: 8+ years
Salary: Negotiable
Location: Hyderabad / Noida / Bangalore / Hybrid
Front-End UI Development
Design, develop, test, and deploy scalable web applications using the MERN / MEAN stack (React.js or Angular, Node.js, MongoDB, Express.js). Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs. Ensure high-quality code by writing unit tests, integrating with databases (e.g., Snowflake), and implementing REST APIs. Participate in Agile development methodologies such as Scrum to deliver projects on time. Troubleshoot issues related to application performance, security, or functionality.
Very strong experience with UI frameworks such as AngularJS, Node.js, React.js, and Bootstrap. Hands-on experience working with React.js plus Python/Snowflake/AWS. Very strong experience with front-end technologies: HTML5, CSS3, Bootstrap, and SASS. Strong visual design skills and familiarity with visual and interactive design trends. Strong experience in JavaScript/TypeScript and React.js / Node.js / AngularJS. Experience in unit testing code with Jest, Enzyme, Jasmine, Mocha, or Chai is desired. Should have a deep understanding of designing UI for cross-platform devices. Experience with relational SQL databases. In-depth experience in responsive web design and development. Experience designing solutions with Node.js; in-depth working knowledge of various design patterns and best practices is needed. In-depth knowledge and experience in web development, including CSS, JavaScript, and HTML. Strong knowledge of Agile processes and tools such as Jira. Hands-on experience on the Linux/Unix platform with knowledge of day-to-day commands. Java and REST API programming, design, and development is an added advantage. Strong knowledge of DevOps processes and enterprise architecture. Work directly with the SME and scrum manager to understand the business and functional requirements. Strong knowledge of the validation process and deployment strategy. Coordinates with various teams to produce high-quality deliverables on time.
Posted 3 weeks ago
8.0 - 12.0 years
35 - 45 Lacs
Noida
Hybrid
Role Overview
We are looking for a hands-on and outcome-oriented Engineering Manager to lead the development and delivery of our mobile, web, and low-code applications. You will manage hybrid teams (in-house + T&M vendors), decide on the optimal development approach (high-code vs. low-code using Unify Apps), and own the end-to-end technical delivery of business-critical features and applications.
Key Responsibilities
Team & Delivery Leadership: Lead engineering teams delivering across mobile (React Native), web (React), backend (Node.js, Spring Boot), and low-code (Unify Apps) platforms. Drive execution across both internal and T&M vendor teams, ensuring clarity, speed, and accountability. Coach internal engineers and ensure vendor output meets quality and delivery benchmarks.
High-Code vs. Low-Code Decisioning: Evaluate requirements and decide on the right implementation path (Unify Apps vs. traditional code). Ensure low-code is leveraged for agility, while high-code is used for complex, scalable components. Maintain architectural alignment across both delivery tracks.
Technical Oversight: Ensure scalable and maintainable design using: Frontend: React; Mobile: React Native; Backend: Node.js, Spring Boot; Low-Code: Unify Apps; Data: Postgres, Snowflake; Messaging: Kafka; Infrastructure: AWS. Drive adherence to standards, secure coding practices, and quality reviews.
Vendor Management: Oversee daily collaboration with T&M vendor teams across both high-code and low-code streams. Ensure timely delivery, quality, and knowledge handover from vendors. Track vendor KPIs and optimize team allocation as needed.
Agile Execution & Collaboration: Work closely with Product Managers, QA, Infra, and Security to deliver features aligned with business priorities. Run Agile ceremonies (sprint planning, standups, retros) and monitor delivery velocity. Maintain clear documentation and ensure traceability of work.
Required Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field. 8-12 years of software engineering experience with 3+ years managing delivery teams. Strong technical background in: React, React Native, Node.js, Spring Boot; low-code platforms, especially Unify Apps; microservices, Kafka, AWS, Postgres, Snowflake. Demonstrated ability to manage hybrid teams (internal + vendors) and full-cycle delivery.
Preferred Skills
Experience delivering both low-code and high-code applications at scale. Knowledge of DevOps practices, CI/CD, Git workflows, and observability. Strong planning, estimation, and communication skills. Experience working in high-availability or operational environments (e.g., QSR, retail, e-commerce).
Success Metrics
Delivery of high-quality applications on time across both high-code and low-code platforms. Reduction in turnaround time through effective use of Unify Apps. Improved collaboration between internal and external teams. System reliability and scalability aligned with business goals. High developer and stakeholder satisfaction.
Posted 3 weeks ago
6.0 - 11.0 years
6 - 15 Lacs
Pune
Work from Office
Role & responsibilities
Design, build, and maintain scalable data pipelines using DBT and Airflow. Develop and optimize SQL queries and data models in Snowflake. Implement ETL/ELT workflows, ensuring data quality, performance, and reliability. Work with Python for data processing, automation, and integration tasks. Handle JSON data structures for data ingestion, transformation, and APIs. Leverage AWS services (e.g., S3, Lambda, Glue, Redshift) for cloud-based data solutions. Collaborate with data analysts, engineers, and business teams to deliver high-quality data products.
Preferred candidate profile
Strong expertise in SQL, Snowflake, and DBT for data modeling and transformation. Proficiency in Python and Airflow for workflow automation. Experience working with AWS cloud services. Ability to handle JSON data formats and integrate APIs. Strong problem-solving skills and experience in optimizing data pipelines.
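For context, a minimal sketch of the kind of Airflow-orchestrated DBT run on Snowflake this role describes (Airflow 2.x style); the DAG id, project directory, target name, and schedule are hypothetical placeholders, not details from this posting:

```python
# Minimal Airflow DAG sketch: run DBT models against a Snowflake target, then test them.
# Paths and names below are assumptions for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # run transformations once a day
    catchup=False,
) as dag:
    # Build all DBT models using the Snowflake connection defined in profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )

    # Validate the freshly built models with DBT tests
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --target prod",
    )

    dbt_run >> dbt_test
```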
Posted 3 weeks ago
2.0 - 5.0 years
18 - 21 Lacs
Bengaluru
Work from Office
Overview
Annalect is currently seeking a Senior Data Engineer to join our Technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design, and development of software products as well as research and evaluation of new technical solutions.
Responsibilities
Designing, building, testing, and deploying data transfers across various cloud environments (Azure, GCP, AWS, Snowflake, etc.). Developing data pipelines, monitoring, maintaining, and tuning. Writing at-scale data transformations in SQL and Python. Performing code reviews and providing leadership and guidance to junior developers.
Qualifications
Curiosity in learning the business requirements that are driving the engineering requirements. Interest in new technologies and eagerness to bring those technologies and out-of-the-box ideas to the team. 3+ years of SQL experience. 3+ years of professional Python experience. 3+ years of professional Linux experience. Preferred familiarity with Snowflake, AWS, GCP, and Azure cloud environments. Intellectual curiosity and drive; self-starters will thrive in this position. Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.
Additional Skills
BS, MS, or PhD in Computer Science, Engineering, or equivalent real-world experience. Experience with big data and/or infrastructure. Bonus for having experience in setting up petabytes of data so they can be easily accessed. Understanding of data organization, i.e., partitioning, clustering, file sizes, file formats. Experience working with classical relational databases (Postgres, MySQL, MSSQL). Experience with Hadoop, Hive, Spark, Redshift, or other data processing tools (lots of time will be spent building and optimizing transformations). Proven ability to independently execute projects from concept to implementation to launch and to maintain a live product.
Perks of working at Annalect
We have an incredibly fun, collaborative, and friendly environment, and often host social and learning activities such as game night, speaker series, and so much more! Halloween is a special day on our calendar since it is our Founding Day – we go all out with decorations, costumes, and prizes! Generous vacation policy. Paid time off (PTO) includes vacation days, personal days, and a Summer Friday program. Extended time off around the holiday season. Our office is closed between Christmas and New Year to encourage our hardworking employees to rest, recharge, and celebrate the season with family and friends. As part of Omnicom, we have the backing and resources of a global billion-dollar company, but also the flexibility and pace of a “startup” – we move fast, break things, and innovate. Work with a modern stack and environment to keep on learning and improving, helping to experiment with and shape the latest technologies.
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
New Delhi, Chennai, Bengaluru
Work from Office
We are seeking a highly experienced Data Modeler with expertise in Data Modelling, Data Analysis, and Dimensional Modelling. The ideal candidate should have hands-on experience with Erwin or Erwin Studio, Data Warehousing (DWH), Snowflake, and SQL. The role involves designing and developing data models to support business intelligence and analytics solutions while ensuring data integrity, consistency, and compliance with Banking domain standards. Responsibilities include working with Snowflake to optimize cloud-based data models, executing complex SQL queries for data analysis, and resolving data quality issues. The candidate should have strong analytical and problem-solving skills, prior experience in the Banking domain, and the ability to work independently in a remote environment.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Senior Data Engineer to join our dynamic team in Bangalore. You will design, develop, and maintain scalable data ingestion frameworks and ELT pipelines using tools such as DBT, Apache Airflow, and Prefect. The ideal candidate will have deep technical expertise in cloud platforms (especially AWS), data architecture, and orchestration tools. You will work with modern cloud data warehouses like Snowflake, Redshift, or Databricks and integrate pipelines with AWS services such as S3, Lambda, Step Functions, and Glue. A strong background in SQL, scripting, and CI/CD practices is essential. Experience with data systems in manufacturing is a plus.
Posted 3 weeks ago
5.0 - 10.0 years
6 - 10 Lacs
Pune
Work from Office
Job Description: We, at Jet2 (the UK’s third largest airline and the largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. The Data Visualisation Developer will join our growing Data Visualisation team to deliver impactful data visualisation projects (using Tableau). The team currently works with a range of departments including Pricing & Revenue, Overseas Operations, and Contact Centre. This new role provides a fantastic opportunity to influence key business decisions through data visualisation. You will work closely with the Jet2 Travel Technology visualisation team, whilst working alongside Data Engineers, Data Scientists, and Business Analysts to help business users get the most insight out of their data. You will also support our growing internal community of Tableau users through engagement activities and support inbox queries that develop their visualisation knowledge and data fluency.
Roles and Responsibilities
What you’ll be doing: The successful candidate will work independently on data visualisation projects with guidance from the Jet2TT Data Visualisation Team Lead. The incumbent is expected to operate out of the Pune location and collaborate with various stakeholders in Pune, Leeds, and Sheffield. Create impactful data visualisations and dashboards using Tableau Desktop / Cloud. Work with Business Analysts and Product Owners to understand requirements. Present visualisations to stakeholders. Teach colleagues about new Tableau features and visualisation best practices. Govern and monitor users and content on Tableau Cloud, including permissions management. Manage the Tableau Support inbox via Outlook.
What you’ll have: Extensive experience in the use of Tableau, preferably evidenced by a strong Tableau Public portfolio. Comfortable presenting data visualisations and dashboards. Strong communication skills – written and verbal. Knowledge of data visualisation best practices. SQL experience is desirable, but not essential. Experience with cloud-based data technologies (e.g. Snowflake, Google BigQuery, or similar) is desirable, but not essential. Experience of working in an Agile Scrum framework to deliver high-quality solutions. Experience of working with people from different geographies, particularly the UK and US.
Posted 3 weeks ago
10.0 - 15.0 years
11 - 15 Lacs
Pune
Work from Office
Job Description: We, at Jet2 (the UK’s third largest airline and the largest tour operator), have set up a state-of-the-art Technology and Innovation Centre in Pune, India. The Lead Visualisation Developer will join our growing Data Visualisation team to deliver impactful data visualisation projects (using Tableau) whilst leading the Jet2TT visualisation function. The team currently works with a range of departments including Pricing & Revenue, Overseas Operations, and Contact Centre. This new role provides a fantastic opportunity to represent visualisation and influence key business decisions. As part of the wider Data function, you will be working alongside Data Engineers, Data Scientists, and Business Analysts to understand and gather requirements. In the role, you will scope visualisation projects to deliver or delegate to members of the team, ensuring they have everything they need to start development whilst guiding them through visualisation delivery. You will also support our visualisation Enablement team with the release of new Tableau features.
Roles and Responsibilities
What you’ll be doing: The successful candidate will work independently on data visualisation projects with zero or minimal guidance. The incumbent is expected to operate out of the Pune location and collaborate with various stakeholders in Pune, Leeds, and Sheffield. Represent visualisation during project scoping. Work with Business Analysts and Product Owners to understand and scope requirements. Work with Data Engineers and Architects to ensure data models are fit for visualisation. Develop Tableau dashboards from start to finish, using Tableau Desktop / Cloud – from gathering requirements, to designing dashboards, to presenting to internal stakeholders. Present visualisations to stakeholders. Support and guide members of the team through visualisation delivery. Support feature releases for Tableau. Teach colleagues about new Tableau features and visualisation best practices.
What you’ll have: Extensive experience in the use of Tableau, evidenced by a strong Tableau Public portfolio. Expertise in the delivery of data visualisation. Experience in requirements gathering and presenting visualisations to internal stakeholders. Strong understanding of data visualisation best practices. Experience of working in an Agile Scrum framework to deliver high-quality solutions. Strong communication skills – written and verbal. Knowledge of the delivery of Data Engineering and Data Warehousing to cloud platforms. Knowledge of or exposure to cloud data warehouse platforms (Snowflake preferred). Knowledge and experience of working with a variety of databases (e.g., SQL).
Posted 3 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations.
Job Title: Snowflake Developer
Qualification: Any Graduate or above
Total Experience: 3 to 12 years
Location: Bangalore, Hyderabad, Gurgaon, Pune, Kochi, Noida, Bhubaneshwar
Must Have Skills: 3-12 years of experience in Data Warehousing and ETL development. 2+ years of hands-on experience with Snowflake. Strong proficiency in SQL and performance tuning. Solid understanding of Snowflake features: Virtual Warehouses, Streams, Tasks, Time Travel, Cloning, and Security. Experience with ETL tools: Informatica, Talend, Matillion, Apache NiFi, etc. Experience with cloud platforms (AWS, Azure, or GCP). Familiarity with data modeling tools and techniques. Hands-on experience with scripting languages (Python preferred). Exposure to CI/CD pipelines and version control tools (Git).
Roles and Responsibilities: Design and implement scalable and high-performing data warehouse solutions using Snowflake. Develop and optimize complex SQL queries and stored procedures. Migrate data from legacy systems to Snowflake using tools like Talend, Informatica, Apache NiFi, or custom ETL pipelines. Implement data modeling techniques (Star/Snowflake schemas) and data partitioning strategies. Ensure data quality, security, and governance best practices are followed. Monitor Snowflake performance and implement tuning solutions. Integrate Snowflake with various data sources (on-premise and cloud-based). Collaborate with BI developers, analysts, and other stakeholders for data requirements gathering and reporting needs. Automate data loading, transformation, and orchestration using tools like DBT, Airflow, or native Snowflake features (Tasks, Streams, etc.). Document technical designs, workflows, and business logic.
Notice period: 30, 60, and 90 days
Mode of Work: WFO (Work From Office)
--
Thanks & Regards,
Narmadha S
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India.
Direct Number: 8067432451
Posted 3 weeks ago
8.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data.
Key Responsibilities: Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks. Develop and manage data models and data warehouse architecture within Snowflake. Create and maintain DBT models for transformation, lineage tracking, and documentation. Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation. Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements. Ensure data integrity, consistency, and governance across all stages of the data lifecycle. Monitor pipeline performance and implement optimization strategies for queries and storage. Follow best practices for data engineering including version control (Git), testing, and CI/CD integration.
Required Skills and Qualifications: 8+ years of experience in Data Engineering or related roles. Deep expertise in Snowflake: schema design, performance tuning, security, and access controls. Proficiency in Python, particularly for scripting, data transformation, and workflow automation. Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization). Proven experience with DBT for building modular, tested, and documented data pipelines. Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect. Advanced SQL skills with experience handling large and complex data sets. Exposure to cloud platforms such as AWS, Azure, or GCP and their data services.
Preferred Qualifications: Experience implementing data quality checks and governance frameworks. Understanding of the modern data stack and CI/CD pipelines for data workflows. Contributions to data engineering best practices, open-source projects, or thought leadership.
Posted 3 weeks ago
4.0 - 8.0 years
18 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Python Data Engineer: 4+ years of experience in backend development with Python. Strong experience with AWS services and cloud architecture. Proficiency in developing RESTful APIs and microservices. Experience with database technologies such as SQL, PostgreSQL, and NoSQL databases. Familiarity with containerization and orchestration tools like Docker and Kubernetes. Knowledge of CI/CD pipelines and tools such as Jenkins, GitLab CI, or AWS CodePipeline. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.
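As illustration of the RESTful microservice work this posting mentions, here is a small, self-contained FastAPI sketch; the resource model, endpoint paths, and in-memory store are hypothetical examples rather than anything specified in the role:

```python
# Minimal FastAPI microservice sketch: a CRUD-style endpoint pair for a hypothetical
# "orders" resource. An in-memory dict stands in for PostgreSQL/NoSQL storage.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    order_id: int
    customer: str
    amount: float

ORDERS: dict[int, Order] = {}  # placeholder for a real database

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]

# Run locally with: uvicorn main:app --reload
```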
Posted 3 weeks ago
2.0 - 7.0 years
6 - 16 Lacs
Noida
Hybrid
AI Engineer JD (2+ years)
Job Description
We're searching for a hands-on AI engineer who can take modern LLMs from prototype to production, orchestrating multi-agent workflows using libraries such as LlamaIndex Workflows, LangGraph, and structured function calling. You should bring a solid foundation in classical ML (e.g., XGBoost) and deep learning with transformer-based models, especially LLaMA- and Qwen-family models, along with experience in Retrieval-Augmented Generation (RAG) pipelines. A track record of building reliable, scalable systems is essential.
Responsibilities: Design, build, and deploy ML/DL models for vision (YOLO), tabular (XGBoost), and NLP / GenAI use cases (function calling, RAG). Work on fine-tuning and deploying LLMs using platforms like Hugging Face and PyTorch. Engineer agent-based LLM solutions: compose multi-step, tool-calling workflows with LlamaIndex or LangGraph. Implement structured function calling and dynamic tool selection for Retrieval-Augmented Generation (RAG) pipelines. Orchestrate agent state, memory, and conversation context to solve complex user tasks. Fine-tune and serve LLMs on Hugging Face / PyTorch, including efficient-tuning methods (LoRA, QLoRA, PEFT). Operate on cloud (Azure preferred, AWS acceptable): set up training jobs, GPU/ACI deployments, CI/CD, observability, and cost governance. Collaborate cross-functionally with product, data, and frontend teams to turn fuzzy ideas into measurable impact. Build FastAPI microservices for low-latency inference, streaming responses, and secure integration with downstream systems.
Requirements: Proficiency in Python, with exposure to FastAPI and/or Java. Solid understanding and practical experience in machine learning with models like XGBoost. Experience with deep learning using YOLO, OCR frameworks, and PyTorch. NLP / GenAI: LLM fine-tuning, prompt engineering, RAG design, Hugging Face Transformers, PEFT, vector databases. Experience implementing agent frameworks (LlamaIndex, LangGraph, LangChain Agents) and function-calling patterns. MLOps: Docker, CI/CD, experiment tracking, model versioning, monitoring, and rollback strategies. Cloud: Azure ML / Azure Functions / AKS (preferred) or AWS SageMaker / Lambda basics. Bonus: experience with Triton / vLLM, streaming websockets, or GPU cost optimization.
Benefits of Working with Us: Best of both worlds: enjoy the enthusiasm and learning curve of a startup combined with the deliveries and performance of an enterprise service provider. Flexible working hours: we offer a delivery-oriented approach with flexible working hours to help you maintain a healthy work-life balance. Limitless growth opportunities: the sky is not the limit when it comes to learning, growth, and sharing ideas; we encourage continuous learning and personal development. Flat organizational structure: we don't follow the typical corporate hierarchy ladder, fostering an open and collaborative work environment where everyone's voice is heard.
As part of our dedication to an inclusive and diverse workforce, TechChefz Digital is committed to Equal Employment Opportunity without regard to race, color, national origin, ethnicity, gender, protected veteran status, disability, sexual orientation, gender identity, or religion. If you need assistance, you may contact us at joinus@techchefz.com
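For context, a minimal, framework-agnostic Python sketch of the structured function-calling / tool-dispatch pattern referenced above; the tool name, schema, and hard-coded "model output" are hypothetical and not tied to LlamaIndex, LangGraph, or any specific LLM API:

```python
# Tools are declared with JSON-style schemas; a structured tool call (as an LLM
# would emit it) is dispatched to the matching Python function. Illustration only.
import json

def search_documents(query: str, top_k: int = 3) -> list[str]:
    """Stand-in for a RAG retrieval step over a vector store."""
    return [f"doc snippet {i} for '{query}'" for i in range(top_k)]

TOOLS = {
    "search_documents": {
        "fn": search_documents,
        "schema": {
            "name": "search_documents",
            "description": "Retrieve passages relevant to the user question.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string"},
                    "top_k": {"type": "integer"},
                },
                "required": ["query"],
            },
        },
    }
}

def dispatch(tool_call_json: str) -> object:
    """Route a structured tool call emitted by the model to the registered function."""
    call = json.loads(tool_call_json)
    tool = TOOLS[call["name"]]
    return tool["fn"](**call["arguments"])

# Example of what the agent runtime would do with a model-emitted tool call:
print(dispatch('{"name": "search_documents", "arguments": {"query": "refund policy"}}'))
```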
Posted 3 weeks ago
4.0 - 6.0 years
12 - 18 Lacs
Pune
Hybrid
Administer and manage Snowflake environments: oversee user access, security, and performance tuning. Develop and optimize SQL queries: create and refine complex SQL queries for data extraction, transformation, and loading (ETL) processes. Implement and maintain data pipelines: use Python and integrate them with Snowflake. Monitor and troubleshoot: ensure the smooth operation of Snowflake environments, identifying and resolving issues promptly. Collaborate with data engineers: work closely with data engineers to provide optimized solutions and best practices. Review role hierarchy: provide recommendations for best practices in role hierarchy and security.
Experience: Minimum of 3 years as a Snowflake Administrator, with a total of 5+ years in database administration or data engineering. Technical Skills: Proficiency in SQL and Python, with experience in performance tuning and optimization. Cloud Services: Experience with cloud platforms such as Azure. Data Warehousing: Strong understanding of data warehousing concepts and ETL processes. Problem-Solving: Excellent analytical and problem-solving skills.
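As a point of reference, a minimal Python sketch of the role-hierarchy review and grant task described above, assuming the snowflake-connector-python package; the account locator, user, and role names are hypothetical placeholders:

```python
# Inspect grants on a role and apply a least-privilege grant in Snowflake.
# Credentials and object names are assumptions for illustration only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # hypothetical account locator
    user="ADMIN_SVC",
    password="***",               # use a secrets manager in practice
    role="SECURITYADMIN",
)

try:
    cur = conn.cursor()

    # Review the existing role hierarchy: what has already been granted to ANALYST?
    cur.execute("SHOW GRANTS TO ROLE ANALYST")
    for row in cur.fetchall():
        print(row)

    # Grant a narrower functional role rather than broad object privileges
    cur.execute("GRANT ROLE REPORTING_READER TO ROLE ANALYST")
finally:
    conn.close()
```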
Posted 3 weeks ago
7.0 - 20.0 years
10 - 40 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Roles and Responsibilities: Lead the development of data warehousing solutions using Snowflake, ensuring timely delivery and high-quality results. Collaborate with cross-functional teams to design, develop, test, and deploy ETL processes for large-scale data migration projects. Provide technical guidance and mentorship to junior team members on best practices in data modeling, query optimization, and performance tuning. Ensure compliance with industry standards and company policies regarding data security, privacy, and governance.
Job Requirements: 7-20 years of experience in the Data Warehousing/Business Intelligence domain with expertise in Snowflake technology. Strong understanding of building complex ETL processes using various tools such as Informatica PowerCenter or similar technologies. Experience working with large datasets (terabytes) and the ability to optimize queries for improved performance. Proven track record of leading teams or managing multiple projects simultaneously.
Posted 3 weeks ago
5.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Looking for Use Case Specialists with 5+ years of experience to implement use cases in data systems using Snowflake, Power BI, WhereScape, and DataVault 2.0. Strong stakeholder communication is essential.
Required Candidate profile: Experienced data professional skilled in Snowflake, Power BI, and WhereScape. Strong in implementing data use cases, stakeholder communication, and analytical problem-solving.
Posted 3 weeks ago
5.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Seeking a Data Acquisition Specialist with 5+ years of experience in Python, Azure Data Factory, and Snowflake. Should have hands-on expertise with WhereScape, DataVault 2.0, and data engineering best practices.
Required Candidate profile: Experienced Data Engineer skilled in Python, Azure Data Factory, Snowflake, and DataVault. Strong in data pipeline development, system integration, and business requirement analysis.
Posted 3 weeks ago
10.0 - 12.0 years
16 - 20 Lacs
Bengaluru
Work from Office
About the Job
The Data & AI team is a highly focused effort to lead digital-first execution and transformation at Red Hat, leveraging data strategically for our customers, partners, and associates.
Radical Collaboration: There is no work done in isolation; as such, each team has to strive to collaborate with teams within the group, cross-group, and the communities. You will strive to make these collaborations as seamless as possible using tools, processes, best practices, and your own brand of creative problem-solving.
Continuous Learning: This is a fast-paced team and you are expected to be continuously curious, have a can-do attitude, and be proficient in understanding multiple aspects of the business, continuously improving your skill sets (technical and business) as the industry progresses.
The Data and AI team is looking for an Engineering Manager to lead the Platform practice for the next generation of SaaS-based data and AI products. You will interact with product managers, Red Hat Sales, Marketing, and Finance teams, and data platform and product engineers to deliver a sophisticated data-as-a-service platform. You'll coach and develop software engineers as they build the Platform, infrastructure-as-code components, platform observability, agentic AI capabilities, and other software to autonomously manage the environment, and guide problem management resolution (PMR) analysis when things go wrong. You'll work in a fast-paced, globally distributed team while quickly learning new skills and creating ways to consistently meet service-level agreements (SLAs) for our data products. This role requires a leader with a proven record of navigating the complexities of working across multiple organizations, helping define and gain consensus on strategy and direction, and aligning the team(s) toward those end goals.
What you will do
Support the engineering team to foster and deliver in an inner-source manner. Develop and retain a team of engineers developing and operating Red Hat's data-as-a-service platform. Coach engineers on good engineering principles: writing good code, automation, observability, toil reduction, and root cause analysis. Manage high-visibility project delivery, including estimation, schedule, risks, and dependencies. Design processes and communication norms that facilitate coordination across a fast-moving, fast-growing, diverse global team. Lead your team through frequent changes in organization, process, and technology commensurate with a high-growth cloud service in a competitive market. Participate in a periodic 24x7 management escalation on-call rotation.
What you will bring
10-12 years of hands-on experience developing and maintaining software. 5+ years of experience managing high-performing engineering teams. Previous software engineering experience delivering data products, applications, or services on cloud-native or hybrid platforms. Experience with Agile methodologies and working in a DevOps culture with continuous integration / continuous delivery. Ability to lead distributed, remote teams working across multiple time zones. Ability to discuss complex technical issues with engineers, product managers, and less-technical stakeholders including customers and senior leaders. Understanding of compliance requirements and collaboration with compliance teams to make sure the Platform and Products are compliant with regulation. Experience hiring and developing engineers. Experience in communication with stakeholders and leadership.
The following are considered a plus: Experience with platforms like Kubernetes/OpenShift and Kubernetes Operators, Prometheus, Grafana, etc. Experience with Go and Python for developing scalable backend software. Experience with building full-stack applications. Knowledge of SaaS technologies like Snowflake, Fivetran, Astro, etc.
About Red Hat
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
Inclusion at Red Hat
Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.
Equal Opportunity Policy (EEO)
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law. Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email application-assistance@redhat.com. General inquiries, such as those regarding the status of a job application, will not receive a reply.
Posted 3 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Interested candidates can also apply with Sanjeevan Natarajan - 94866 21923 - sanjeevan.natarajan@careernet.in
Role & responsibilities
Technical Leadership: Lead a team of data engineers and developers; define technical strategy, best practices, and architecture for data platforms. End-to-End Solution Ownership: Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks. Data Pipeline Strategy: Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets. Data Governance & Quality: Enforce data validation, lineage, and quality checks across the data lifecycle; define standards for metadata, cataloging, and governance. Orchestration & Automation: Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations. Cloud Cost & Performance Optimization: Implement performance tuning strategies, cost optimization best practices, and efficient cluster configurations on AWS/Databricks. Security & Compliance: Define and enforce data security standards, IAM policies, and compliance with industry-specific regulatory frameworks. Collaboration & Stakeholder Engagement: Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions. Migration Leadership: Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime. Mentorship & Growth: Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team.
Preferred candidate profile
Python, SQL, PySpark, Databricks, AWS (mandatory). Leadership experience in Data Engineering/Architecture. Added advantage: experience in Life Sciences / Pharma.
Posted 3 weeks ago
10.0 - 15.0 years
1 - 1 Lacs
Bengaluru
Remote
We are seeking a highly experienced Data Architect / Data Modeler to design and govern the data architecture of a critical MI platform. The role requires deep expertise in data modeling, data integration, and cloud-based data platforms.
Posted 3 weeks ago
6.0 - 9.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
8+ years of overall development experience, with at least 5+ years specifically in IICS and Snowflake. Implement scalable and efficient data storage solutions using Snowflake. Strong experience with Informatica IICS, including tasks, mappings, and workflows. Develop and manage IICS integrations to connect Snowflake with other enterprise systems.
Posted 3 weeks ago
4.0 - 7.0 years
3 - 5 Lacs
Pune
Work from Office
Position: SQL Developer
Employment Type: Full Time
Location: Pune, India
Salary: TBC
Work Experience: The applicant for this position should have 4+ years working as a SQL developer.
Project Overview: The project will use a number of Microsoft SQL Server technologies and include development and maintenance of reports, APIs, and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager.
Job Description: Ability to understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users. Investigate and resolve issues quickly. Communication with end users. Working closely with other team members to understand business requirements. Complete structural analysis and systematic testing of the data.
Skills: Microsoft SQL Server 2016-2022. T-SQL programming (4+ years' experience). Query/stored procedure performance tuning. SQL Server Integration Services. SQL Server Reporting Services. Experience in database design. Experience with source control. Knowledge of the software engineering life cycle. Previous experience in designing, developing, testing, implementing, and supporting software. Third-level IT qualification. SQL MCSA or MCSE preferable. Knowledge of data technologies such as Snowflake, Airflow, and ADF desirable.
Other skills: Ability to work on own initiative and as part of a team. Excellent time management and decision-making skills. Excellent written and verbal communication skills in English. Background in the financial industry preferable.
Academic Qualification: Any graduate or postgraduate degree. Any specialization in IT.
Posted 3 weeks ago
8.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
• Design, develop, and manage data solutions in an enterprise environment.
• Lead the development and optimization of data pipelines and data models within Snowflake.
• Work with Python and SQL to build scalable data processing solutions.
• Design and implement data warehouses in Snowflake, ensuring efficient data organization and retrieval.
• Develop production-ready data ingestion and processing pipelines using Spark, Scala, and Python.
• Implement and manage orchestration tools such as Airflow, Informatica, or Automic.
• Ensure data governance, metadata management, and data lineage tracking.
• Utilize SQL analytical functions to support advanced data analytics.
• Develop scripts using shell scripting and JavaScript for automation and optimization.
• Follow CI/CD, automated testing, and performance engineering best practices.
• Work with Git, Confluence, and Jira for version control, documentation, and project tracking.
• Troubleshoot and resolve data-related issues, ensuring system reliability and efficiency.
• Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders.
Posted 3 weeks ago
7.0 - 12.0 years
18 - 30 Lacs
Hyderabad
Work from Office
10+ years managing data applications in Java-based technologies, with good exposure to cloud and web development. 5+ years in customization and implementation of IBM MDM AE & SE. Core Java. AWS S3 & Google Cloud Storage. SQL batch processing. Real-time API services. CI/CD pipelines.
Posted 3 weeks ago
0.0 - 2.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer
Experience: 12 to 20 months
Work Mode: Work from Office
Locations: Bangalore, Chennai, Kolkata, Pune, Gurgaon
About Tredence
Tredence focuses on last-mile delivery of powerful insights into profitable actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries are engaging with us and deploying their prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve clients in the US, Canada, Europe, and Southeast Asia. Tredence is an equal opportunity employer. We celebrate and support diversity and are committed to creating an inclusive environment for all employees. Visit our website for more details.
Role Overview
We are seeking a driven and hands-on Data Engineer with 12 to 20 months of experience to support modern data pipeline development and transformation initiatives. The role requires solid technical skills in SQL, Python, and PySpark, with exposure to cloud platforms such as Azure or GCP. As a Data Engineer at Tredence, you will work on ingesting, processing, and modeling large-scale data, implementing scalable data pipelines, and applying foundational data warehousing principles. This role also includes direct collaboration with cross-functional teams and client stakeholders.
Key Responsibilities
Develop robust and scalable data pipelines using PySpark on cloud platforms like Azure Databricks or GCP Dataflow (see the sketch after this posting). Write optimized SQL queries for data transformation, analysis, and validation. Implement and support data warehouse models and principles, including fact and dimension modeling, star and snowflake schemas, Slowly Changing Dimensions (SCD), Change Data Capture (CDC), and the medallion architecture. Monitor, troubleshoot, and improve pipeline performance and data quality. Work with teams across analytics, business, and IT functions to deliver data-driven solutions. Communicate technical updates and contribute to sprint-level delivery.
Mandatory Skills
Strong hands-on experience with SQL and Python. Working knowledge of PySpark for data transformation. Exposure to at least one cloud platform: Azure or GCP. Good understanding of data engineering and warehousing fundamentals. Excellent debugging and problem-solving skills. Strong written and verbal communication skills.
Preferred Skills
Experience working with Databricks Community Edition or the enterprise version. Familiarity with data orchestration tools like Airflow or Azure Data Factory. Exposure to CI/CD processes and version control (e.g., Git). Understanding of Agile/Scrum methodology and collaborative development. Basic knowledge of handling structured and semi-structured data (JSON, Parquet, etc.).
Required Skills
Azure Databricks / GCP, Python, SQL, PySpark
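For illustration only, a minimal PySpark sketch of the medallion-style flow and CDC-style comparison named in the responsibilities above; the bucket paths, column names, and change rule are hypothetical assumptions:

```python
# Bronze: ingest raw JSON as-is; Silver: enforce types and add load metadata;
# then a simple CDC-style comparison flags new or changed rows. Illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze layer: raw, semi-structured data landed from source systems
bronze = spark.read.json("s3://example-bucket/raw/orders/")

# Silver layer: typed, cleaned records with load metadata
silver = (
    bronze
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropna(subset=["order_id"])
    .withColumn("_loaded_at", F.current_timestamp())
)

# CDC-style check: keep only rows that are new or whose amount changed
existing = spark.read.parquet("s3://example-bucket/silver/orders/")
changed = (
    silver.alias("new")
    .join(existing.alias("old"), "order_id", "left")
    .where(F.col("old.order_id").isNull() | (F.col("new.amount") != F.col("old.amount")))
)

changed.write.mode("append").parquet("s3://example-bucket/silver/orders/")
```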
Posted 3 weeks ago