12.0 - 16.0 years
14 - 20 Lacs
Pune
Work from Office
AI/ML/GenAI AWS SME
Job Description
Role Overview: An AWS SME with a Data Science background is responsible for leveraging Amazon Web Services (AWS) to design, implement, and manage data-driven solutions. This role combines cloud computing expertise with data science skills to optimize and innovate business processes.
Key Responsibilities:
Data Analysis and Modelling: Analyzing large datasets to derive actionable insights and building predictive models using AWS services such as SageMaker, Bedrock, and Textract.
Cloud Infrastructure Management: Designing, deploying, and maintaining scalable cloud infrastructure on AWS to support data science workflows.
Machine Learning Implementation: Developing and deploying machine learning models using AWS ML services.
Security and Compliance: Ensuring data security and compliance with industry standards and best practices.
Collaboration: Working closely with cross-functional teams, including data engineers, analysts, DevOps, and business stakeholders, to deliver data-driven solutions.
Performance Optimization: Monitoring and optimizing the performance of data science applications and cloud infrastructure.
Documentation and Reporting: Documenting processes, models, and results, and presenting findings to stakeholders.
Skills & Qualifications
Technical Skills: Proficiency in AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker). Strong programming skills in Python. Experience with AI/ML project life cycle steps. Knowledge of machine learning algorithms and frameworks (e.g., TensorFlow, Scikit-learn). Familiarity with data pipeline tools (e.g., AWS Glue, Apache Airflow). Excellent communication and collaboration abilities.
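To make the SageMaker-centric responsibilities above concrete, the following minimal Python sketch shows one way an already-deployed model endpoint might be invoked with boto3. The endpoint name, region, and payload format are illustrative assumptions, not details from the posting.

```python
# Illustrative only: calling a deployed SageMaker endpoint with boto3.
# Endpoint name, region, and feature vector are hypothetical.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="ap-south-1")

payload = {"instances": [[42.0, 3, 1, 0.7]]}  # shape depends on the deployed model
response = runtime.invoke_endpoint(
    EndpointName="demo-churn-endpoint",       # hypothetical endpoint
    ContentType="application/json",
    Body=json.dumps(payload),
)
print(json.loads(response["Body"].read()))
```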
Posted 2 days ago
14.0 - 19.0 years
16 - 20 Lacs
Pune
Work from Office
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.
Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.
UKG is looking to hire a Software Architect to lead in designing and delivering next-generation, transformational architectural designs that yield superior functional, scalable, resilient, adaptable, performant, and secure offerings and component services at UKG. They will establish the starting point for transformational architectural decisions, patterns, and practices, with revolutionary architectural outcomes of performance, scale, security, and delivery that will shape how new software architecture patterns, technologies, and best practices are adopted and measured at UKG.
As a Software Architect at UKG you will:
Manage and deliver new architectural transformational designs, identify architecture risks, and maintain architectural opportunities and risk assessment for all stakeholders.
Serve as subject matter expert for the Software Architecture practice, with the ability to provide technical leadership in areas of Software Design, Development & Delivery, Reliability, Scalability, Performance, and Security across many software domains such as, but not limited to, UI, APIs, Microservices, DDD, Platform Services, Data Engineering, and Public Cloud.
Contribute to the technical leadership of the Product Architecture group to help other software architects envision, develop, and foster the adoption of new architectural transformational designs and implementations.
Serve as a technical ambassador of goodwill for our internal technical community as well as the external tech industry and academia communities.
Partner with Product Owners and Engineering Owners when making roadmap, design, architectural, and engineering-impacting decisions.
Lead initiatives to effectively communicate and present architectural decisions and technical strategies so that development teams properly understand why the strategies need to be adopted.
Lead initiatives in the development of architecturally significant proof-of-concept solutions to assist product architects and development teams in accelerating the adoption of the technical strategy.
Lead technical due diligence activities and third-party partnership evaluation initiatives.
Serve as a technical strategic advisor to research work being executed in the Development organization.
Requirements:
14+ years of software development experience and 5+ years of software architecture experience, as well as 5+ years of technical leadership and architecture experience in software and cloud development (ideally in SaaS).
5+ years' experience designing and delivering large-scale distributed systems in a multi-tenant SaaS environment.
5+ years' experience building, managing, and leading architects and technical leads.
Expert understanding of security, reliability, scalability, high availability, and concurrency architectural patterns and solutions.
Expert in solution design across the full technology stack, including public and hybrid cloud deployments.
Expert in patterns and solutions that enable evolutionary architectures, leveraging flexibility and creativity when balancing present technologies with emerging ones while formulating new strategies.
Influential speaker and an expert in designing and delivering presentations on large stages.
Prior experience with at least one major IaaS and/or PaaS technology (OpenStack, AWS, GCP, Azure, Kubernetes, Cloud Foundry, etc.).
Prior experience with agile development, Continuous Delivery, DevOps, and SRE practices.
Proficient in at least one static OO language (Java, Scala, C#).
Proficient in at least one dynamic language (JavaScript/TypeScript, Python, Node.js).
Proficient in current development tools (GitHub, GitLab, CLI, Vim, JetBrains, Xamarin, Visual Studio, Concourse CI, CircleCI, Jenkins).
Qualifications:
Bachelor's or Master's degree in Computer Science, Mathematics, or Engineering is preferred.
Prior experience technically leading, in depth, at least one vertical software design practice, such as Microservices Architecture, Public Cloud Architecture, Site Reliability Architecture, Data Engineering Architecture, or Software Security.
Prior experience with relational and non-relational database technologies (MySQL, MongoDB, Cassandra).
Prior experience with messaging and event streaming solutions (Kafka, RabbitMQ, Kafka Streams, Spark).
Prior experience with Workflow (Camunda, Activiti) and iPaaS solutions (MuleSoft, Dell Boomi) is a bonus.
Strong understanding of infrastructure and related technologies (compute, storage, networking).
Where we're going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow!
Contact for the application and interview process: UKGCareers@ukg.com
Posted 2 days ago
12.0 - 15.0 years
16 - 20 Lacs
Bengaluru
Work from Office
As a Principal Technical Specialist, you will lead teams in technical discussions, architecture strategy, and system design for high-performance applications. You will collaborate with architects and developers to define workflows, interfaces, and domain models, leveraging SQL and NoSQL databases. Your role requires a hands-on approach, strong communication skills, and the ability to translate complex technical concepts into actionable insights.
You have:
Bachelor's degree in Engineering with 12 to 15 years of relevant work experience.
Experience with architectural patterns and programming languages such as Java, Python/Shell, and Golang.
Familiarity with frameworks like Spring, Guice, or Micronaut, and libraries such as Pandas, Keras, PyTorch, SciPy, and NumPy.
Experience with Kafka, Spark, and databases (SQL: Postgres, Oracle; NoSQL: Elastic, Prometheus, Mongo, Cassandra, Redis, and Pinot).
It would be nice if you also had:
Experience in OLTP/real-time system management in enterprise software.
Experience in both large-scale and small-scale development, with the ability to model domain-specific systems.
Expertise in data engineering, statistical analytics, and AI/ML techniques (AI experience is a plus).
Knowledge of NMS/EMS for the telecom domain, including Network Operational Lifecycle and Network Planning.
In this role you will:
Lead and guide the team in technical discussions. Your expertise in Java, Python/Shell, and frameworks like Spring, Kafka, and AI/ML libraries will drive innovation in telecom network management and real-time enterprise software.
Analyze and decompose requirements.
Work on long-lead items and define the architecture strategy for the application.
Work with architects and developers in other applications to define interfaces and workflows, drawing on a strong foundation in statistical analytics, data engineering, and AI/ML.
Communicate and present to internal and external audiences.
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
The AI/Data Engineer will be responsible for designing, implementing, and maintaining scalable data solutions. This role will involve working with various data tools and technologies to ensure efficient data processing, integration, and visualization. Key Responsibilities: Develop and maintain data pipelines and ETL processes to ingest and transform data from multiple sources. Design, implement, and manage data models and databases using SQL. Utilize Python for data manipulation, analysis, and automation tasks. Administer and automate processes on Linux systems using shell scripting and tools like Putty. Schedule and monitor jobs using Control-M or similar scheduling tools. Create interactive dashboards and reports using Tableau and Power BI to support data-driven decision-making. Collaborate with data scientists and analysts to support AI/ML model deployment and integration. Ensure data quality, integrity, and security across all data processes. Utilize version control software, such as Git and Bitbucket, to manage and track code changes effectively. Qualifications: Bachelor’s degree in computer science, Information Technology, or a related field. At least 3 years of proven experience as a Data Engineer, AI Engineer, or similar role. Proficiency in Python and SQL.
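As a rough illustration of the pipeline work this role describes, here is a small Python ETL sketch using pandas and SQLAlchemy. File paths, table names, and the connection string are hypothetical, not details from the posting.

```python
# Minimal illustrative ETL step: extract from a CSV drop zone, transform, load to a SQL table.
# Paths, table names, and the connection string are placeholders.
import pandas as pd
from sqlalchemy import create_engine

def run_daily_load(source_path: str, conn_str: str) -> int:
    df = pd.read_csv(source_path, parse_dates=["txn_date"])
    # Basic cleansing: drop duplicates and rows missing mandatory keys
    df = df.drop_duplicates().dropna(subset=["customer_id", "txn_date"])
    df["amount"] = df["amount"].round(2)
    engine = create_engine(conn_str)
    df.to_sql("fact_transactions", engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    rows = run_daily_load("landing/transactions_20240601.csv",
                          "postgresql+psycopg2://etl_user:***@dbhost/analytics")
    print(f"Loaded {rows} rows")
```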
Posted 2 days ago
3.0 - 5.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description
We are more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services - and our open-access model - we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. We are headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific.
Responsibilities
As a Senior Quality Assurance Engineer, you will be responsible for ensuring the quality and reliability of complex data-driven systems, with a focus on financial services applications. You will work closely with Data Engineers, Business Analysts, and Developers across global teams to validate the functionality, accuracy, and performance of software solutions, particularly around data migration from on-premises to cloud platforms. Key responsibilities include:
Leading and executing end-to-end test plans, including functional, unit, regression, and back-to-back testing
Designing test strategies for data migration projects, with a strong focus on Oracle-to-cloud transitions
Verifying data accuracy and transformation logic across multiple environments
Writing Python-based automated test scripts and utilities for validation
Participating in Agile ceremonies, collaborating closely with cross-functional teams
Proactively identifying and documenting defects, inconsistencies, and process improvements
Contributing to continuous testing and integration practices
Ensuring traceability between requirements, test cases, and delivered code
Skills
Must have (the ideal candidate must demonstrate strong experience, minimum 7 years, and hands-on expertise in the following areas):
Data testing (Oracle-to-cloud migration): deep understanding of testing strategies related to large-scale data movement and transformation validation between legacy on-premise systems and modern cloud platforms.
Python scripting: proficient in using Python for writing automated test scripts and tools to streamline testing processes.
Regression testing: proven ability to develop and manage comprehensive regression test suites ensuring consistent software performance over releases.
Back-to-back testing: experience in comparing results between old and new systems or components to validate data integrity post-migration.
Functional testing: skilled in verifying system behavior against functional requirements in a business-critical environment.
Unit testing: capable of writing and executing unit tests for small code components to ensure correctness at the foundational level.
Nice to have (not required, but a strong plus):
Advanced Python development: experience in building complex QA tools or contributing to CI/CD pipelines using Python.
DBT (Data Build Tool): familiarity with DBT for transformation testing and documentation in data engineering workflows.
Snowflake: exposure to the Snowflake cloud data warehouse and understanding of its testing and validation mechanisms.
Other
Languages: English B1 Intermediate
Seniority: Senior
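For the back-to-back testing described above, a hedged sketch of a pytest-style check that compares a legacy Oracle table with its migrated cloud copy might look like the following. Connection URLs, table names, and columns are placeholders, not the project's actual systems.

```python
# Illustrative back-to-back check: compare row counts and a column aggregate
# between a legacy Oracle table and its migrated cloud copy.
# All connection strings and names are made up for the example.
import pandas as pd
from sqlalchemy import create_engine

LEGACY = create_engine("oracle+cx_oracle://qa_user:***@legacy-db/ORCL")
TARGET = create_engine("snowflake://qa_user:***@account/db/schema")

def table_summary(engine, table: str) -> pd.Series:
    df = pd.read_sql(f"SELECT trade_id, notional FROM {table}", engine)
    return pd.Series({"rows": len(df), "notional_sum": round(df["notional"].sum(), 2)})

def test_trades_migrated_consistently():
    legacy = table_summary(LEGACY, "trades")
    target = table_summary(TARGET, "trades")
    assert legacy["rows"] == target["rows"]
    assert legacy["notional_sum"] == target["notional_sum"]
```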
Posted 2 days ago
4.0 - 8.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description
We are seeking a highly skilled and motivated Data Scientist with 5+ years of experience to join our team. The ideal candidate will bring strong data science, programming, and data engineering expertise, along with hands-on experience in generative AI, large language models, and modern LLM application frameworks. This role also demands excellent communication and stakeholder management skills to collaborate effectively across business units.
Skills
Must have
Experience: 5+ years of industry experience as a Data Scientist, with a proven track record of delivering impactful, data-driven solutions.
Programming skills: advanced proficiency in Python, with extensive experience writing clean, efficient, and maintainable code. Proficiency with version control tools such as Git.
Data engineering: strong working proficiency with SQL and distributed computing with Apache Spark.
Cloud platforms: experience building and deploying apps on Azure Cloud.
Generative AI & LLMs: practical experience with large language models (e.g., OpenAI, Anthropic, Hugging Face). Knowledge of Retrieval-Augmented Generation (RAG) techniques and prompt engineering is expected.
Machine learning & modeling: strong grasp of statistical modeling, machine learning algorithms, and tools like scikit-learn, XGBoost, etc.
Stakeholder engagement: excellent communication skills with a demonstrated ability to interact with business stakeholders, understand their needs, present technical insights clearly, and drive alignment across teams.
Tools and libraries: proficiency with libraries like Pandas, NumPy, and ML lifecycle tools such as MLflow.
Team collaboration: proven experience contributing to agile teams and working cross-functionally in fast-paced environments.
Nice to have
Hands-on experience with Databricks and Snowflake.
Hands-on experience building LLM-based applications using agentic frameworks like LangChain, LangGraph, and AutoGen.
Familiarity with data visualization platforms such as Power BI, Tableau, or Plotly.
Front-end/full-stack development experience.
Exposure to MLOps practices and model deployment pipelines in production.
Other
Languages: English C2 Proficient
Seniority: Regular
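As an illustrative (not prescriptive) example of the modeling-plus-MLflow workflow this role expects, the sketch below trains a scikit-learn model and logs parameters, metrics, and the model artifact. The dataset, experiment name, and hyperparameters are assumptions for the example.

```python
# Minimal sketch of a training run tracked with MLflow; dataset and settings are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

mlflow.set_experiment("scoring-poc")          # hypothetical experiment name
with mlflow.start_run():
    model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_params({"n_estimators": 200, "learning_rate": 0.05})
    mlflow.log_metric("test_auc", auc)
    mlflow.sklearn.log_model(model, "model")  # store the fitted model as a run artifact
```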
Posted 2 days ago
3.0 - 6.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
A DevOps Support Engineer will perform tasks related to data pipeline work, along with monitoring and support related to job execution, data movement, and on-call support. In addition, deployed pipeline implementations will be tested for production validation.
Responsibilities
Provide production support: first-tier, after-hours, and on-call support. The candidate will eventually develop into more data engineering work within the Network Operations team. The selected resource will learn the telecommunications domain while also developing data engineering skills.
Skills
Must have
ETL pipelines, data engineering, data movement/monitoring
Azure Databricks
Watchtower
Automation tools
Testing
Nice to have
Data engineering
Other
Languages: English C2 Proficient
Seniority: Regular
Posted 2 days ago
7.0 - 12.0 years
11 - 15 Lacs
Gurugram
Work from Office
Project description
We are looking for an experienced Data Engineer to contribute to the design, development, and maintenance of our database systems. This role will work closely with our software development and IT teams to ensure the effective implementation and management of database solutions that align with the client's business objectives.
Responsibilities
The successful candidate will be responsible for managing technology in projects and providing technical guidance and solutions for work completion:
1. Be responsible for providing technical guidance and solutions.
2. Ensure process compliance in the assigned module and participate in technical discussions and reviews.
3. Prepare and submit status reports to minimize exposure and risks on the project or close out escalations.
4. Be self-organized and focused on delivering software on time and with quality.
Skills
Must have
At least 7 years of development experience on data-specific projects.
Working knowledge of the Kafka streaming-data framework (kSQL, MirrorMaker, etc.).
Strong programming skills in at least one of these programming languages: Groovy or Java.
Good knowledge of data structures, ETL design, and storage.
Must have worked in streaming-data environments and pipelines.
Experience in near-real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks.
Nice to have
N/A
Other
Languages: English B2 Upper Intermediate
Seniority: Senior
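A minimal sketch of the kind of near-real-time pipeline mentioned above, using Spark Structured Streaming to read a Kafka topic and land parsed events, could look as follows. The broker address, topic, schema, and sink paths are assumptions, and the Kafka connector package must be available on the Spark classpath.

```python
# Illustrative streaming pipeline: Kafka -> parse JSON -> parquet sink with checkpointing.
# Requires the spark-sql-kafka connector on the classpath; all names/paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "orders")
       .load())

events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

query = (events.writeStream.format("parquet")
         .option("path", "/data/streams/orders")
         .option("checkpointLocation", "/data/checkpoints/orders")
         .start())
query.awaitTermination()
```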
Posted 2 days ago
1.0 - 4.0 years
6 - 10 Lacs
Pune
Work from Office
To ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts. Lead project development, production support, and maintenance activities. Ensure timesheets are completed and the invoicing process is finished on or before the deadline. Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated. Create functional and technical specification documents. Track open tickets/incidents in the queue, allocate tickets to resources, and ensure that the tickets are closed within the deadlines. Ensure analysts adhere to SLAs/KPIs/OLAs. Ensure that everyone in the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically. Lead the project and ensure it complies with Software Quality Processes and stays within timelines. Review functional and technical specification documents. Serve as the single point of contact for the team to the project stakeholders. Promote teamwork; motivate, mentor, and develop subordinates. Provide application production support as per the process/RACI (Responsible, Accountable, Consulted, Informed) Matrix.
Posted 2 days ago
6.0 - 11.0 years
6 - 11 Lacs
Hyderabad
Remote
We are seeking a highly skilled and proactive Senior Database Administrator to join our team. This hybrid role blends traditional DBA responsibilities with modern data engineering tasks to support our Comparative data loads and ensure optimal performance of critical database systems. You'll play a key role in scaling our data infrastructure, diagnosing performance bottlenecks, and contributing to architectural improvements.
Key Responsibilities:
Data Architecture & Scalability: Redesign and improve JSON blob storage strategies that have reached scalability limits. Evaluate and optimize complex data structures spanning five core applications.
Performance Tuning & Optimization: Analyze and resolve performance issues across PostgreSQL, MSSQL, and Snowflake environments. Refactor inefficient queries and recommend index strategies or schema redesigns. Support and tune Amazon RDS PostgreSQL deployments for high availability and performance.
Data Engineering Support: Contribute to the design and maintenance of ETL pipelines that support Comparative data loads. Collaborate with data engineering teams on pipeline orchestration, transformation, and delivery.
Database Administration: Ensure high availability, backups, and disaster recovery across database platforms. Monitor and manage database health, security, and compliance.
GIS/Spatial Data (helpful): Leverage experience with PostGIS and/or SpatiaLite to enhance location-based features where applicable.
Technical Environment:
Databases: PostgreSQL (Amazon RDS), MSSQL, Snowflake
Data Engineering & Orchestration: Apache Spark, dbt, Airflow, SSIS
Cloud & Infrastructure: AWS (including Infrastructure as Code practices)
Preferred Qualifications:
6+ years of experience in database administration and performance tuning.
Proficiency with JSON data modeling and optimization.
Strong SQL expertise with a proven ability to refactor complex queries hands-on.
Experience with Amazon RDS, particularly PostgreSQL.
Familiarity with Apache Spark for distributed data processing.
Familiarity with dbt for data modeling and transformations.
Familiarity with Apache Airflow.
Familiarity with MonetDB.
Exposure to AWS Infrastructure as Code (IaC) using tools like CloudFormation or Terraform (nice to have).
Experience working with spatial data formats (PostGIS/SpatiaLite).
Strong collaboration skills and the ability to work in a cross-functional, agile environment.
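For the query-tuning side of this role, a small diagnostic sketch in Python is shown below: it runs EXPLAIN (ANALYZE) against PostgreSQL and flags sequential scans that might warrant an index. The DSN, table, and query are placeholders, not the actual schema.

```python
# Hedged diagnostic sketch: inspect a candidate query's plan and flag sequential scans.
# Connection details and the query are illustrative.
import psycopg2

CANDIDATE_QUERY = """
    SELECT account_id, SUM(amount)
    FROM transactions
    WHERE created_at >= now() - interval '30 days'
    GROUP BY account_id
"""

with psycopg2.connect("dbname=appdb user=dba host=rds-endpoint") as conn:
    with conn.cursor() as cur:
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + CANDIDATE_QUERY)
        plan_lines = [row[0] for row in cur.fetchall()]

for line in plan_lines:
    print(line)
if any("Seq Scan on transactions" in line for line in plan_lines):
    print("Consider an index, e.g. on transactions (created_at, account_id)")
```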
Posted 2 days ago
6.0 - 11.0 years
8 - 14 Lacs
Pune
Work from Office
Responsibilities:
Design, develop, and maintain scalable data pipelines using Databricks, PySpark, Spark SQL, and Delta Live Tables.
Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
Implement best practices for data engineering, including data quality and data security.
Optimize and troubleshoot complex data workflows to ensure high performance and reliability.
Develop and maintain documentation for data engineering processes and solutions.
Requirements:
Bachelor's or Master's degree.
Proven experience as a Data Engineer, with a focus on Databricks, PySpark, Spark SQL, and Delta Live Tables.
Strong understanding of data warehousing concepts, ETL processes, and data modelling.
Proficiency in programming languages such as Python and SQL.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
Excellent problem-solving skills and the ability to work in a fast-paced environment.
Strong leadership and communication skills, with the ability to mentor and guide team members.
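As a hedged illustration of a Delta Live Tables pipeline of the sort referenced above, the snippet below defines a bronze and a silver table with a simple data-quality expectation. The dlt module and the implicit spark session are only available inside a Databricks DLT pipeline, and the source path and columns are assumptions.

```python
# Sketch of a Delta Live Tables pipeline; runnable only inside a Databricks DLT pipeline,
# where `spark` is provided implicitly. Paths and column names are illustrative.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested from cloud storage")
def orders_bronze():
    return spark.read.format("json").load("/mnt/raw/orders/")

@dlt.table(comment="Cleansed orders with a basic quality expectation")
@dlt.expect_or_drop("valid_amount", "amount > 0")
def orders_silver():
    return (dlt.read("orders_bronze")
            .select(col("order_id"), col("customer_id"), col("amount").cast("double")))
```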
Posted 2 days ago
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Skills:
Extensive experience with Google Cloud data products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.).
Expertise in Cloud Data Fusion, BigQuery, and Dataproc.
Experience in MDM, metadata management, data quality, and data lineage tools.
End-to-end data engineering and lifecycle management (including non-functional requirements and operations).
Experience with SQL and NoSQL modern data stores.
End-to-end solution design skills: prototyping, usability testing, and data visualization literacy.
Excellent knowledge of the software development life cycle.
Posted 2 days ago
0.0 - 5.0 years
3 - 8 Lacs
Gurugram
Work from Office
Practice Overview
Practice: Data and Analytics (DNA) - Analytics Consulting
At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.
Our Mission and Purpose
Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of the highest-quality talent, equipped with innovative analytical tools and techniques, to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to deliver results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.
The Role and Responsibilities
We have open positions ranging from Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and quickly pick up new technologies whenever required. Most of the projects require handling big data, so you will be required to work on related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
Advocating application of best practices in data engineering, code hygiene, and code reviews
Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
Creating and maintaining documentation to support stakeholders and runbooks for operational excellence
Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them
Keeping up with emerging state-of-the-art data engineering techniques in your domain
Your Attributes, Experience & Qualifications
Bachelor's or Master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
Exposure to building cloud-ready applications
Exposure to test-driven development and integration
Pragmatic and methodical approach to solutions and delivery with a focus on impact
Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
Collaborative team player
Excellent verbal and written communication skills and command of English
Willingness to travel
Respect for confidentiality
Technical Background
Prior experience in designing and deploying large-scale technical solutions
Fluency in modern programming languages (Python is mandatory; R and SAS are desired)
Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, and Glue
Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
Experience with big data tools like Hadoop, Spark, and Kafka
Demonstrated knowledge of data structures and algorithms
Familiarity with version control systems like GitHub or Bitbucket
Familiarity with modern storage and computational frameworks
Basic understanding of agile practices such as CI/CD, application resiliency, and security
Valued but not required:
Compelling side projects or contributions to the open-source community
Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
Familiarity with containerization technologies, such as Docker and Kubernetes
Experience with UI development using frameworks such as Angular, Vue, or React
Experience with NoSQL databases such as MongoDB or Cassandra
Experience presenting at data science conferences and connections within the data science community
Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence
Roles and levels
We are hiring for engineering roles across levels, from Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years.
Posted 2 days ago
5.0 - 10.0 years
8 - 12 Lacs
Lucknow
Work from Office
Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.
Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.
Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders. Strong strategic thinking and problem-solving skills. Enthusiasm for working across cultures, functions, and time zones.
Posted 2 days ago
5.0 - 10.0 years
8 - 12 Lacs
Nashik
Work from Office
Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.
Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.
Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders. Strong strategic thinking and problem-solving skills. Enthusiasm for working across cultures, functions, and time zones.
Posted 2 days ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad
Work from Office
6+ years of experience in data engineering projects using Cosmos DB and Azure Databricks (minimum 3-5 projects).
Strong expertise in building data engineering solutions using Azure Databricks and Cosmos DB.
Strong T-SQL programming skills, or skills in any other flavor of SQL.
Experience working with high-volume data, large objects, and complex data transformations.
Experience working in DevOps environments integrated with Git for version control and CI/CD pipelines.
Good understanding of data modelling for data warehouses and data marts.
Strong verbal and written communication skills.
Ability to learn, contribute, and grow in a fast-paced environment.
Nice to have:
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, ADLS Gen2, and Azure Event Hubs.
Experience using Jira and ServiceNow in project environments.
Experience in implementing data warehouse and ETL solutions.
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Nagpur
Work from Office
Job Overview
Branch launched in India in 2019 and has seen rapid adoption and growth. As a company, we are passionate about our customers, fearless in the face of barriers, and driven by data. We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.
Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.
Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders. Strong strategic thinking and problem-solving skills. Enthusiasm for working across cultures, functions, and time zones.
Posted 2 days ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Machine Learning Engineers for a unique job opportunity to work with industry leaders.
Who can be a part of the community?
We are looking for top-tier Machine Learning Engineers with expertise in building, deploying, and optimizing AI models. If you have experience in this field, this is your chance to collaborate with industry leaders.
What's in it for you?
Pay above market standards. The role is going to be contract-based, with project timelines from 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote (most likely), onsite at the client location, or Deccan AI's office in Hyderabad or Bangalore.
Responsibilities:
Design, optimize, and deploy machine learning models; implement feature engineering and scaling pipelines.
Use deep learning frameworks (TensorFlow, PyTorch) and manage models in production (Docker, Kubernetes).
Automate workflows; ensure model versioning, logging, and real-time monitoring; comply with security and regulations.
Work with large-scale data, develop feature stores, and implement CI/CD pipelines for model retraining and performance tracking.
Required Skills:
Proficiency in machine learning, deep learning, and data engineering (Spark, Kafka).
Expertise in MLOps, automation tools (Docker, Kubernetes, Kubeflow, MLflow, TFX), and cloud platforms (AWS, GCP, Azure).
Strong knowledge of model deployment, monitoring, security, compliance, and responsible AI practices.
Nice to Have:
Experience with A/B testing, Bayesian optimization, and hyperparameter tuning.
Familiarity with multi-cloud ML deployments and generative AI technologies (LLM fine-tuning, FAISS).
What are the next steps?
Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching and project allocation: be patient while we align your skills and preferences with the available projects. Skip the noise. Focus on opportunities built for you!
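To illustrate the deployment side of this role, here is a hedged sketch of a model-serving stub that loads a registered MLflow model and exposes a FastAPI predict endpoint. The model name, registry stage, and feature fields are made up for the example.

```python
# Illustrative model-serving stub: load a registered MLflow model and expose a predict route.
# The registry URI, model name/stage, and features are assumptions, not a real system.
import mlflow.pyfunc
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = mlflow.pyfunc.load_model("models:/credit-risk/Production")  # hypothetical registered model

class Features(BaseModel):
    income: float
    age: int
    utilisation: float

@app.post("/predict")
def predict(f: Features):
    row = pd.DataFrame([{"income": f.income, "age": f.age, "utilisation": f.utilisation}])
    score = model.predict(row)[0]
    return {"score": float(score)}
```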
Posted 2 days ago
2.0 - 5.0 years
5 - 9 Lacs
Mumbai
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Machine Learning Engineers for a unique job opportunity to work with industry leaders.
Who can be a part of the community?
We are looking for top-tier Machine Learning Engineers with expertise in building, deploying, and optimizing AI models. If you have experience in this field, this is your chance to collaborate with industry leaders.
What's in it for you?
Pay above market standards. The role is going to be contract-based, with project timelines from 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote (most likely), onsite at the client location, or Deccan AI's office in Hyderabad or Bangalore.
Responsibilities:
Design, optimize, and deploy machine learning models; implement feature engineering and scaling pipelines.
Use deep learning frameworks (TensorFlow, PyTorch) and manage models in production (Docker, Kubernetes).
Automate workflows; ensure model versioning, logging, and real-time monitoring; comply with security and regulations.
Work with large-scale data, develop feature stores, and implement CI/CD pipelines for model retraining and performance tracking.
Required Skills:
Proficiency in machine learning, deep learning, and data engineering (Spark, Kafka).
Expertise in MLOps, automation tools (Docker, Kubernetes, Kubeflow, MLflow, TFX), and cloud platforms (AWS, GCP, Azure).
Strong knowledge of model deployment, monitoring, security, compliance, and responsible AI practices.
Nice to Have:
Experience with A/B testing, Bayesian optimization, and hyperparameter tuning.
Familiarity with multi-cloud ML deployments and generative AI technologies (LLM fine-tuning, FAISS).
What are the next steps?
Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching and project allocation: be patient while we align your skills and preferences with the available projects. Skip the noise. Focus on opportunities built for you!
Posted 2 days ago
2.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Machine Learning Engineers for a unique job opportunity to work with industry leaders.
Who can be a part of the community?
We are looking for top-tier Machine Learning Engineers with expertise in building, deploying, and optimizing AI models. If you have experience in this field, this is your chance to collaborate with industry leaders.
What's in it for you?
Pay above market standards. The role is going to be contract-based, with project timelines from 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote (most likely), onsite at the client location, or Deccan AI's office in Hyderabad or Bangalore.
Responsibilities:
Design, optimize, and deploy machine learning models; implement feature engineering and scaling pipelines.
Use deep learning frameworks (TensorFlow, PyTorch) and manage models in production (Docker, Kubernetes).
Automate workflows; ensure model versioning, logging, and real-time monitoring; comply with security and regulations.
Work with large-scale data, develop feature stores, and implement CI/CD pipelines for model retraining and performance tracking.
Required Skills:
Proficiency in machine learning, deep learning, and data engineering (Spark, Kafka).
Expertise in MLOps, automation tools (Docker, Kubernetes, Kubeflow, MLflow, TFX), and cloud platforms (AWS, GCP, Azure).
Strong knowledge of model deployment, monitoring, security, compliance, and responsible AI practices.
Nice to Have:
Experience with A/B testing, Bayesian optimization, and hyperparameter tuning.
Familiarity with multi-cloud ML deployments and generative AI technologies (LLM fine-tuning, FAISS).
What are the next steps?
Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching and project allocation: be patient while we align your skills and preferences with the available projects. Skip the noise. Focus on opportunities built for you!
Posted 2 days ago
2.0 - 5.0 years
5 - 9 Lacs
Kolkata
Work from Office
Step into the world of AI innovation with the Experts Community of Soul AI (by Deccan AI). We are looking for India's top 1% Machine Learning Engineers for a unique job opportunity to work with industry leaders.
Who can be a part of the community?
We are looking for top-tier Machine Learning Engineers with expertise in building, deploying, and optimizing AI models. If you have experience in this field, this is your chance to collaborate with industry leaders.
What's in it for you?
Pay above market standards. The role is going to be contract-based, with project timelines from 2-6 months, or freelancing. Be a part of an elite community of professionals who can solve complex AI challenges. Work location could be: remote (most likely), onsite at the client location, or Deccan AI's office in Hyderabad or Bangalore.
Responsibilities:
Design, optimize, and deploy machine learning models; implement feature engineering and scaling pipelines.
Use deep learning frameworks (TensorFlow, PyTorch) and manage models in production (Docker, Kubernetes).
Automate workflows; ensure model versioning, logging, and real-time monitoring; comply with security and regulations.
Work with large-scale data, develop feature stores, and implement CI/CD pipelines for model retraining and performance tracking.
Required Skills:
Proficiency in machine learning, deep learning, and data engineering (Spark, Kafka).
Expertise in MLOps, automation tools (Docker, Kubernetes, Kubeflow, MLflow, TFX), and cloud platforms (AWS, GCP, Azure).
Strong knowledge of model deployment, monitoring, security, compliance, and responsible AI practices.
Nice to Have:
Experience with A/B testing, Bayesian optimization, and hyperparameter tuning.
Familiarity with multi-cloud ML deployments and generative AI technologies (LLM fine-tuning, FAISS).
What are the next steps?
Register on our Soul AI website. Our team will review your profile. Clear all the screening rounds: clear the assessments once you are shortlisted. Profile matching and project allocation: be patient while we align your skills and preferences with the available projects. Skip the noise. Focus on opportunities built for you!
Posted 2 days ago
9.0 - 14.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, data products, and data pipelines so that they are resilient to change, modular, flexible, scalable, reusable, and cost-effective.
Key Responsibilities:
Oversee the entire data infrastructure to ensure scalability, operational efficiency, and resiliency.
Mentor junior data engineers within the organization.
Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
Collaborate with data scientists, data analysts, data architects, and other stakeholders to understand data requirements and deliver high-quality data solutions.
Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
Ensure data quality and integrity through data validation techniques and frameworks.
Develop and maintain documentation for data processes, configurations, and best practices.
Monitor and troubleshoot data pipeline issues to ensure timely resolution.
Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
Manage the CI/CD process for deploying and maintaining data solutions.
Posted 2 days ago
5.0 - 10.0 years
8 - 12 Lacs
Ghaziabad
Work from Office
We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.
Responsibilities
Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit, and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.
Qualifications
Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders. Strong strategic thinking and problem-solving skills. Enthusiasm for working across cultures, functions, and time zones.
Posted 2 days ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities
Hands-on experience with AWS services including S3, Glue, API Gateway, and SQS.
Strong skills in data engineering on AWS, with proficiency in Python, PySpark, and SQL.
Experience with batch job scheduling and managing data dependencies.
Knowledge of data processing tools like Spark and Airflow.
Automate repetitive tasks and build reusable frameworks to improve efficiency.
Provide Run/DevOps support and manage the ongoing operation of data services.
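A small, hedged operational sketch for the Run/DevOps support mentioned above: trigger an AWS Glue job for the latest file in an S3 prefix and poll its status with boto3. Bucket, prefix, and job names are assumptions, not details from the posting.

```python
# Illustrative operations snippet: find the newest landed file, start a Glue job, poll its state.
# All resource names are placeholders.
import time
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

def latest_object(bucket: str, prefix: str) -> str:
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    objs = sorted(resp.get("Contents", []), key=lambda o: o["LastModified"])
    return objs[-1]["Key"] if objs else ""

key = latest_object("analytics-landing", "daily/")
run = glue.start_job_run(JobName="daily-curation", Arguments={"--input_key": key})
while True:
    state = glue.get_job_run(JobName="daily-curation", RunId=run["JobRunId"])["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        print("Glue run finished with state:", state)
        break
    time.sleep(30)
```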
Posted 2 days ago
10.0 - 15.0 years
8 - 13 Lacs
Pune
Work from Office
Job Title: Senior Engineer, AVP
Location: Pune, India
Role Description
A Technology Engineer within the Archiving Tech product family takes on the responsibility of developing the product code, implementing technical solutions, and configuring applications in different environments in response to business problems. The engineer is expected to focus on the requirements of the business and propose the technical design of the application or its components, investigate and propose appropriate technologies to be used, create reusable frameworks, and drive standardization where possible, in line with the bank's standards and solutions.
What we'll offer you
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Accident and term life insurance
Your key responsibilities
Hands-on application development, enhancement, and bug fixing, with the ability to overcome technical challenges.
Ensures development happens for all software components in accordance with the detailed software specification, the functional design, and the technical design document.
Verifies the developed source code by reviews (4-eyes principle).
Contributes to quality assurance by writing unit and functional tests.
Contributes to problem and root cause analysis.
Integrates software components following the integration strategy.
Ensures that all code changes end up in Change Items (CIs). Where applicable, develops routines to deploy CIs to the target environments.
Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
Your skills and experience
To excel in this role, you should possess a strong understanding of the technologies below:
Expert level in Oracle PL/SQL; highly proficient in SQL.
Comfortable working with Linux, Python scripting, and any scheduling tool, preferably BMC Control-M.
Proven experience in data engineering, data modelling, and building data pipelines.
Good understanding of conceptual, logical, and physical data model building.
Exposure to working with Agile methodologies.
SDLC tools: JIRA, Sonar, Veracode/JFrog, TeamCity, Bitbucket.
Exposure to Java, Spring Boot, J2EE, REST APIs, and microservices will be an added advantage.
Cloud: exposure to any public cloud, preferably GCP.
Strong analytical skills.
10+ years of technology experience, continuous hands-on coding exposure, and the ability to drive solutions.
Proficient communication skills; fluent in English (written/verbal).
Ability to work in virtual teams and in matrixed organizations; excellent team player with an open-minded approach.
Keeps pace with technical innovation and understands the relevant business area.
Ability to share information and transfer knowledge and expertise to team members.
Ability to design and write code in accordance with provided business requirements.
Relevant financial services experience.
Ability to work in a fast-paced environment with competing and alternating priorities, with a constant focus on delivery.
Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk, and increasing IT flexibility.
A strong desire to learn new technologies and implement various solutions in a fast-paced environment.
How we'll support you
Posted 2 days ago