10.0 - 15.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements. Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies. Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation. Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows. Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims). Validate migrated data against business rules, regulatory standards, and reconciliation reports. Support UAT by preparing test cases and validating migrated data with business users. Ensure data privacy and security compliance throughout the migration process. Document issues, risks, and resolutions related to data quality and migration.
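For illustration, a minimal sketch of the kind of profiling and reconciliation check described above, using SQLite so it runs self-contained. The table and column names (legacy_policies, migrated_policies, policy_id, premium) are hypothetical stand-ins for the actual legacy and target systems.

```python
# Hedged sketch: row-count reconciliation, a null-rate profile, and a source-vs-target gap check.
# All object names are illustrative; a real migration would run against the actual databases.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_policies  (policy_id TEXT, premium REAL);
    CREATE TABLE migrated_policies(policy_id TEXT, premium REAL);
    INSERT INTO legacy_policies   VALUES ('P001', 1200.0), ('P002', NULL), ('P003', 850.0);
    INSERT INTO migrated_policies VALUES ('P001', 1200.0), ('P003', 850.0);
""")

# Row-count reconciliation between source and target.
counts = conn.execute("""
    SELECT (SELECT COUNT(*) FROM legacy_policies)   AS source_rows,
           (SELECT COUNT(*) FROM migrated_policies) AS target_rows
""").fetchone()
print("source vs target rows:", counts)

# Simple completeness profile: null rate on a mandatory field in the source.
null_rate = conn.execute("""
    SELECT 1.0 * SUM(CASE WHEN premium IS NULL THEN 1 ELSE 0 END) / COUNT(*)
    FROM legacy_policies
""").fetchone()[0]
print("premium null rate in legacy data:", round(null_rate, 2))

# Records present in the source but missing from the target (reconciliation gap).
missing = conn.execute("""
    SELECT l.policy_id
    FROM legacy_policies l
    LEFT JOIN migrated_policies m ON m.policy_id = l.policy_id
    WHERE m.policy_id IS NULL
""").fetchall()
print("unreconciled policy ids:", missing)
```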
Posted 1 month ago
9.0 - 14.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Broad expertise in financial crime (FinCrime) monitoring, AML, KYC, sanctions screening, payment screening, fraud, etc. Proven risk and regulatory experience in financial services gained through management consulting, banking, or other relevant industry practitioner or regulatory roles. End-to-end project implementation and cross-functional stakeholder management experience, including agile project delivery. Seasoned business analyst with experience in requirements analysis, requirements management and documentation; exposure to tools like JIRA and Confluence; hands-on with SQL queries. Bachelor's degree from a reputable institute; Master's degree preferably in a quantitative field (Business, Data Science, Statistics, Computer Science, etc.). Comfortable with ideation, solution design and development of thought leadership materials and documents to support practice development efforts. Exposure to leading vendor products like Actimize and Fircosoft is a plus. Experience in Data Science, Analytics, AI/ML, Gen AI, Data Management, Data Architectures, Data Governance, platforms and applications is a plus. Exposure to consultative sales, business development, pre-sales, RFP and proposal development, and client management experience is a plus.
Posted 1 month ago
12.0 - 17.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Design and develop conceptual, logical, and physical data models for enterprise and application-level databases. Translate business requirements into well-structured data models that support analytics, reporting, and operational systems. Define and maintain data standards, naming conventions, and metadata for consistency across systems. Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes. Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements. Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships. Support data governance initiatives including data lineage, quality, and cataloging. Review and validate data models with business and technical stakeholders. Provide guidance on normalization, denormalization, and performance tuning of database designs. Ensure models comply with organizational data policies, security, and regulatory requirements.
Posted 1 month ago
5.0 - 7.0 years
7 - 11 Lacs
Pune
Work from Office
Educational Qualification: Bachelor of Technology, Bachelor of Engineering. Service Line: Enterprise Package Application Services. Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess the current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management. Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible. Technical and Professional Requirements: Must have a bachelor's or equivalent degree with a minimum of 5 years of experience. Should adhere to Data Architecture, Modelling and Coding guidelines. Should understand functional requirements. Preparation of Design documents and/or Technical Documents. Should have experience in: HANA Modelling - calculation views, stored procedures, scalar and table functions, performance tuning techniques; XS Development - XS OData, XSJS services, debugging; DS - end-to-end job development including transformation, DS scripting, and consuming external services. Mandatory Skills: SAP Native HANA, Implementation, Configuration, SAFe Agile Methodology. Preferred Skills: SAP HANA XS / Native HANA.
Posted 1 month ago
2.0 - 3.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Technology, Bachelor of Engineering. Service Line: Enterprise Package Application Services. Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose the root cause of such issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate. Ability to assess the current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains. Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible. Technical and Professional Requirements: Must have a bachelor's or equivalent degree with a minimum of 2 years of experience. Should adhere to Data Architecture, Modelling and Coding guidelines. Should understand functional requirements. Preparation of Design documents and/or Technical Documents. Should have experience in: HANA Modelling - calculation views, stored procedures, scalar and table functions, performance tuning techniques; XS Development - XS OData, XSJS services, debugging; DS - end-to-end job development including transformation, DS scripting, and consuming external services. Mandatory Skills: SAP Native HANA, Implementation, Configuration, SAFe Agile Methodology. Preferred Skills: SAP HANA XS / Native HANA.
Posted 1 month ago
9.0 - 14.0 years
12 - 16 Lacs
Pune
Work from Office
Skills required: Strong SQL (minimum 6-7 years of experience), data warehouse, ETL. The Data and Client Platform Tech project provides all data-related services to internal and external clients of the SST business. The Ingestion team is responsible for getting and ingesting data into the Datalake. This is a global team with development teams in Shanghai, Pune, Dublin and Tampa. The Ingestion team uses Big Data technologies like Impala, Hive, Spark and HDFS, and cloud technologies such as Snowflake for cloud data storage. Responsibilities: You will gain an understanding of the complex domain model and define the logical and physical data model for the Securities Services business. You will also constantly improve the ingestion, storage and performance processes by analyzing them and automating them wherever possible. You will be responsible for defining standards and best practices for the team in the areas of code standards, unit testing, continuous integration, and release management. You will be responsible for improving the performance of queries from lake tables/views. You will be working with a wide variety of stakeholders - source systems, business sponsors, product owners, scrum masters, enterprise architects - and must possess excellent communication skills to articulate challenging technical details to various classes of people. You will be working in Agile Scrum and complete all assigned tasks/JIRAs as per sprint timelines and standards. Qualifications: 5-8 years of relevant experience in data development, ETL, data ingestion and performance optimization. Strong SQL skills are essential; experience writing complex queries spanning multiple tables is required. Knowledge of Big Data technologies (Impala, Hive, Spark) is nice to have. Working knowledge of performance tuning of database queries - understanding the inner workings of the query optimizer, query plans, indexes, partitions, etc. Experience in systems analysis and programming of software applications in SQL and other Big Data query languages. Working knowledge of data modelling and dimensional modelling tools and techniques. Knowledge of working with high-volume data ingestion and high-volume historic data processing is required. Exposure to a scripting language like shell scripting or Python is required. Working knowledge of consulting/project management techniques and methods. Knowledge of working in Agile Scrum teams and processes. Experience in data quality, data governance, DataOps and the latest data management techniques is a plus. Education: Bachelor's degree/University degree or equivalent experience.
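As an illustration of the partition-aware ingestion and query-tuning work this role describes, here is a minimal PySpark sketch. The paths and column names are hypothetical, and it assumes a local Spark session rather than the team's actual Hadoop/Snowflake environment.

```python
# Hedged sketch: write a partitioned lake table, then run a query whose partition filter
# can be pruned; the explain() call is how you would inspect the physical plan.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

# Stand-in for a day's ingested trade data (illustrative schema).
trades = spark.createDataFrame(
    [("2024-01-01", "ACCT1", 100.0), ("2024-01-02", "ACCT2", 250.0)],
    ["trade_date", "account_id", "amount"],
)

# Partitioning the lake table by trade_date lets downstream queries prune partitions.
trades.write.mode("overwrite").partitionBy("trade_date").parquet("/tmp/lake/trades")

# A filter on the partition column is pushed down, so only one partition is scanned.
daily = (
    spark.read.parquet("/tmp/lake/trades")
    .where(F.col("trade_date") == "2024-01-02")
    .groupBy("account_id")
    .agg(F.sum("amount").alias("total_amount"))
)
daily.explain()   # inspect the physical plan to confirm partition pruning
daily.show()
```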
Posted 1 month ago
12.0 - 15.0 years
16 - 20 Lacs
Hyderabad
Work from Office
We are seeking an experienced Enterprise Architect to lead solutioning and architectural direction for mission-critical technology programs in the Healthcare and Life Sciences domain. The ideal candidate will have a deep technical background in integration architecture, web-scale application development, and data architecture, particularly in designing solutions for member/patient and provider engagement. You will work with solution leaders and client stakeholders to define target architectures, create scalable systems and lead integration of digital front-end systems with core clinical, claims, and data platforms. Your ability to balance business needs with technical depth will drive digital transformation across payer, provider, and pharmaceutical clients. Key Responsibilities: Lead the architecture, design, and implementation of enterprise-grade web applications for healthcare member and provider use cases. Design and govern scalable architectures using front-end engineering and cloud-native architectures. Define integration blueprints that connect digital systems with core platforms (EHR, claims, CRM, Veeva, etc.) via REST, FHIR, HL7, or API gateways. Architect microservices-based solutions with appropriate separation of concerns, scalability, and maintainability. Define and guide data architecture decisions, including real-time integration, event-driven design, and use of cloud-native data stores. Assess legacy modernization paths, define transformation roadmaps, and ensure alignment to enterprise standards. Collaborate with cross-functional engineering and DevOps teams to ensure architectural alignment through delivery. Provide architecture leadership in RFP responses, client pitches, and delivery planning sessions. Required Experience & Skills: 12-15 years of IT experience, with at least 5 years in enterprise architecture or lead solution architecture roles. Deep experience in scalable front-end and backend development: Frontend - React.js, Redux, Webpack, responsive UI; Backend - Java, Spring Boot, REST APIs. Strong understanding of integration architecture, including: RESTful services, FHIR/HL7, event-driven design (Kafka or similar); API gateways, integration middleware (Mulesoft, Boomi, Apigee, etc.). Familiarity with healthcare interoperability standards (FHIR, HL7, EDI 270/271, X12). Experience with data architecture patterns: operational data stores, data lakes, near-real-time streaming, data APIs. Working knowledge of cloud platforms (AWS, Azure, GCP) and container-based architectures (Docker, Kubernetes). Strong grasp of enterprise security, compliance, and high-availability architectures for regulated domains like healthcare. Excellent communication skills with proven ability to engage client technology leaders (CTOs, VPs, Directors). Preferred Skills (Nice to Have): TOGAF, AWS/Azure architecture certification. Experience with Identity and Access Management (IAM), SSO, and OAuth2 implementations. Knowledge of healthcare platforms such as Epic, Cerner, or payer core systems (Facets, QNXT, HealthEdge).
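To illustrate the FHIR-based REST integration pattern this role references, here is a hedged sketch of a Patient search against a FHIR R4 endpoint. The base URL points at the public HAPI FHIR test server purely as a stand-in; a real payer/provider gateway, its authentication, and its resource profiles would differ.

```python
# Hedged sketch: search FHIR Patient resources over REST and read the returned Bundle.
import requests

FHIR_BASE = "https://hapi.fhir.org/baseR4"  # assumption: replace with the actual FHIR gateway


def search_patients(family_name: str, count: int = 5) -> list[dict]:
    """Search Patient resources by family name and return the raw FHIR resources."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient",
        params={"family": family_name, "_count": count},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # a FHIR Bundle of search results
    return [entry["resource"] for entry in bundle.get("entry", [])]


if __name__ == "__main__":
    for patient in search_patients("Smith"):
        print(patient.get("id"), patient.get("name", [{}])[0].get("family"))
```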
Posted 1 month ago
8.0 - 13.0 years
5 - 10 Lacs
Pune
Work from Office
Data Engineer - Position Summary: The Data Engineer is responsible for building and maintaining data pipelines, ensuring the smooth operation of data systems, and optimizing workflows to meet business requirements. This role will support data integration and processing for various applications. Minimum Qualifications: 6 years overall IT experience with a minimum of 4 years of work experience in the tech skills below. Tech Skills: Proficient in Python scripting and PySpark for data processing tasks. Strong SQL capabilities with hands-on experience managing big data using ETL tools like Informatica. Experience with the AWS cloud platform and its data services including S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS and EventBridge. Skilled in BASH shell scripting. Understanding of data lakehouse architecture, particularly with the Iceberg format, is a plus. Preferred: Experience with Kafka and Mulesoft API. Understanding of healthcare data systems is a plus. Experience in Agile methodologies. Strong analytical and problem-solving skills. Effective communication and teamwork abilities. Responsibilities: Develop and maintain data pipelines and ETL processes to manage large-scale datasets. Collaborate to design and test data architectures to align with business needs. Implement and optimize data models for efficient querying and reporting. Assist in the development and maintenance of data quality checks and monitoring processes. Support the creation of data solutions that enable analytical capabilities. Contribute to aligning data architecture with overall organizational solutions.
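As a small illustration of one pipeline step from the responsibilities above, here is a hedged sketch that transforms a local extract and lands it in S3 for downstream loading (for example into Redshift). The bucket name, file paths, and the claim_amount column are hypothetical placeholders.

```python
# Hedged sketch: a toy transform step followed by an S3 upload via boto3.
import csv

import boto3


def transform(in_path: str, out_path: str) -> None:
    """Toy transformation: keep only rows with a positive claim amount."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if float(row["claim_amount"]) > 0:
                writer.writerow(row)


def load_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload the transformed file to the curated zone of the data lake."""
    boto3.client("s3").upload_file(local_path, bucket, key)


if __name__ == "__main__":
    transform("claims_extract.csv", "claims_clean.csv")
    load_to_s3("claims_clean.csv", "example-data-lake-bucket", "curated/claims/claims_clean.csv")
```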
Posted 1 month ago
5.0 - 10.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Technical Skills - Microsoft Purview Expertise (Required): Unified Data Catalog - experience setting up and configuring the catalog, managing collections, classifications, glossary terms, and metadata curation. Data Quality (DQ) - implementing DQ rules, defining metrics (accuracy, completeness, consistency), and using quality scorecards and reports. Data Map and Scans - ability to configure sources, schedule scans, manage ingestion, and troubleshoot scan issues. Data Insights and Lineage - experience visualizing data lineage and interpreting catalog insights. Azure Platform Knowledge (Desirable): Azure Data Factory, Azure Synapse Analytics, Microsoft Fabric including OneLake. Experience: 3 to 5+ years in data governance or data platform projects, ideally with enterprise clients. 2+ years implementing Microsoft Purview or similar tools (Collibra, Informatica, Alation). Hands-on experience configuring and implementing Microsoft Purview Unified Catalog and Data Quality. Experience onboarding multiple data sources (on-prem, cloud). Background in data management, data architecture, or business intelligence is highly beneficial. Certifications (Desirable): Microsoft Certified Azure Data Engineer Associate, Microsoft Certified Azure Solutions Architect Expert.
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4+ years of experience in data modelling and data architecture. Proficiency in data modelling tools (Erwin, IBM InfoSphere Data Architect) and database management systems. Familiarity with different data models like relational, dimensional and NoSQL databases. Understanding of business processes and how data supports business decision making. Strong understanding of database design principles, data warehousing concepts, and data governance practices. Preferred technical and professional experience: Excellent analytical and problem-solving skills with a keen attention to detail. Ability to work collaboratively in a team environment and manage multiple projects simultaneously. Knowledge of programming languages such as SQL.
Posted 1 month ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 4+ years of experience in data modelling and data architecture. Proficiency in data modelling tools (ERwin, IBM InfoSphere Data Architect) and database management systems. Familiarity with different data models like relational, dimensional and NoSQL databases. Understanding of business processes and how data supports business decision making. Strong understanding of database design principles, data warehousing concepts, and data governance practices. Preferred technical and professional experience: Excellent analytical and problem-solving skills with a keen attention to detail. Ability to work collaboratively in a team environment and manage multiple projects simultaneously. Knowledge of programming languages such as SQL.
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Modeller JD: We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices. Data Modelling: Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards. Create conceptual, logical, and physical data models that support the bank's strategic objectives. Ensure data models are optimized for performance, security, and scalability to support business operations and analytics. Collaboration with Data Architect: Work closely with the Data Architect to establish the overall data architecture strategy and framework. Contribute to the definition of data model structures within a data mesh environment. Data Quality and Governance: Ensure data quality and integrity in the data models by implementing best practices in data governance. Assist in the establishment of data management policies and standards. Conduct regular data audits and reviews to ensure data accuracy and consistency across systems. Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools. Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j Graph. Data Warehousing Technologies: Snowflake, Teradata, or similar. ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar. Big Data Technologies: Hadoop, Spark (optional but preferred). Cloud Technologies: Experience with data modelling on cloud platforms - Microsoft Azure (Synapse, Data Factory).
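For illustration, a hedged sketch of a heavily simplified physical model for the CASA/Loan domain described above, expressed with SQLAlchemy so the entities and relationships are explicit in code. Entity names and attributes are illustrative only, not the bank's actual model.

```python
# Hedged sketch: a toy customer / CASA account / loan account model and its DDL.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class Customer(Base):
    __tablename__ = "customer"
    customer_id = Column(Integer, primary_key=True)
    full_name = Column(String(200), nullable=False)
    kyc_status = Column(String(20), nullable=False)  # e.g. VERIFIED / PENDING
    accounts = relationship("CasaAccount", back_populates="customer")
    loans = relationship("LoanAccount", back_populates="customer")


class CasaAccount(Base):
    __tablename__ = "casa_account"
    account_id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customer.customer_id"), nullable=False)
    account_type = Column(String(10), nullable=False)  # CURRENT or SAVINGS
    balance = Column(Numeric(18, 2), nullable=False, default=0)
    customer = relationship("Customer", back_populates="accounts")


class LoanAccount(Base):
    __tablename__ = "loan_account"
    loan_id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customer.customer_id"), nullable=False)
    principal = Column(Numeric(18, 2), nullable=False)
    disbursement_date = Column(Date, nullable=False)
    customer = relationship("Customer", back_populates="loans")


if __name__ == "__main__":
    # Materialise the physical schema in a throwaway SQLite database for inspection.
    Base.metadata.create_all(create_engine("sqlite:///:memory:"))
```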
Posted 1 month ago
6.0 - 11.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Graduate with a minimum of 6+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Gurugram
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
5.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Develop, test and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management and associated technologies. Communicate risks and ensure understanding of these risks. Graduate with a minimum of 5+ years of related experience required. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Great expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Preferred technical and professional experience: Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
As a Senior Backend/Lead Development Engineer you will be involved in developing automation solutions to provision and manage any infrastructure across your organization. As a developer you will be leveraging the capabilities of Terraform and cloud offerings to drive infrastructure-as-code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team, collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions, maintain high standards of software quality within the team by establishing good practices and habits, and focus on growing capabilities to support and enhance the experience of the offering. Required education: Bachelor's Degree. Required technical and professional expertise: * 10+ years of software development experience with z/OS or z/OS sub-systems. * 8+ years of professional experience developing with Golang, Python and Ruby. * Hands-on experience with z/OS system programming or administration. * Experience with Terraform key features like infrastructure as code, change automation, and auto scaling. * Experience working with a cloud provider such as AWS, Azure or GCP, with a focus on scalability, resilience and security. * Cloud-native mindset and solid understanding of DevOps principles in a cloud environment. * Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging and tracing for high reliability and performance. * Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform). * Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions. * Demonstrated ability to tackle complex technical challenges and deliver innovative solutions. * Excellent communication and collaboration skills, with a focus on customer satisfaction and team success. * Strong analytical, debugging and problem-solving skills to analyse issues and defects reported by customer-facing and test teams. * Proficient in source control management tools (GitHub, ) and with Agile lifecycle management tools. * Soft skills: strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback.
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Ab Initio. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality. Your role will be pivotal in driving innovation and efficiency within the team, while also maintaining open lines of communication with stakeholders to keep them informed of progress and developments. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge sharing sessions to enhance team capabilities. Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: Must Have Skills: Proficiency in Ab Initio. Strong understanding of data integration and ETL processes. Experience with performance tuning and optimization of applications. Familiarity with data warehousing concepts and methodologies. Ability to troubleshoot and resolve application issues effectively. Additional Information: The candidate should have minimum 5 years of experience in Ab Initio. This position is based at our Pune office. A 15 years full time education is required. Qualification: 15 years full time education.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Microsoft Azure Data Services. Good to have skills: Microsoft Azure Databricks. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. Roles & Responsibilities: Data Integration: Develop and implement data pipelines using Azure Data Factory, Azure Fabric, and MDMF (Metadata Management Framework) to ingest, transform, and store data from various sources. Data Modeling: Maintain data models, ensuring data quality and consistency across different databases and systems. Database Management: Manage Azure SQL Databases and other storage solutions to optimize performance and scalability. ETL Processes: Design and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and availability for analytics and reporting. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide actionable insights. Documentation: Maintain clear documentation of data architectures, data flows, and pipeline processes. Professional & Technical Skills: Proficiency in Azure services such as Azure Data Factory, Azure Fabric, MDMF, and Azure SQL Database. Good knowledge of SQL and experience with programming languages such as Python and PySpark. Familiarity with data modeling techniques and data warehousing concepts. Experience with cloud architecture and data architecture best practices. Understanding of data governance and security principles. Excellent problem-solving skills and attention to detail. Strong communication skills for collaborating with technical and non-technical stakeholders. Additional Information: The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification: 15 years full time education.
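As an illustration of one ingestion step that the Azure pipelines above would orchestrate, here is a hedged sketch that writes a small transformed dataset into a curated container in Azure Blob Storage. The connection string, container name, and blob path are placeholders, not project values.

```python
# Hedged sketch: serialize rows to CSV in memory and upload them to Azure Blob Storage.
import csv
import io

from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"  # assumption: supplied via config or Key Vault


def write_curated_csv(rows: list[dict], container: str, blob_name: str) -> None:
    """Serialize rows to CSV in memory and upload them as a single blob."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)

    service = BlobServiceClient.from_connection_string(CONN_STR)
    service.get_container_client(container).upload_blob(
        name=blob_name, data=buffer.getvalue(), overwrite=True
    )


if __name__ == "__main__":
    sample = [{"customer_id": "C1", "segment": "RETAIL"}, {"customer_id": "C2", "segment": "SME"}]
    write_curated_csv(sample, container="curated", blob_name="customers/customers.csv")
```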
Posted 1 month ago
3.0 - 7.0 years
9 - 14 Lacs
Mumbai
Work from Office
As a Consultant, you are responsible for developing application designs and providing regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Configure DataStax Cassandra as per the requirements of the project solution. Design the database system specific to Cassandra in consultation with the data modelers, data architects and ETL specialists as well as the microservices/functional specialists, thereby producing an effective Cassandra database system according to the solution and the client's needs and specifications. Interface with functional and data teams to ensure the integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database. The role will also need to ensure that the Cassandra database is performing and error free; this will involve troubleshooting errors and performance issues, resolving them, and planning for further database improvement. Ensure the database documentation and operation manual is up to date and usable. Preferred technical and professional experience: Expertise, experience and deep knowledge in the configuration, design and troubleshooting of NoSQL server software and related products on cloud, specifically DataStax Cassandra. Knowledge/experience in other NoSQL/cloud databases. Installs, configures and upgrades RDBMS or NoSQL server software and related products on cloud.
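For illustration, a hedged sketch of working against a Cassandra cluster with the DataStax Python driver: define a query-driven table and read/write a row. The contact point, keyspace, and table design are illustrative placeholders, not the project's actual model.

```python
# Hedged sketch: connect, create a keyspace/table, insert and query with the DataStax driver.
from decimal import Decimal

from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # assumption: a locally reachable Cassandra node
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("demo")

# Cassandra tables are modelled around the query: here, "orders by customer, newest first".
session.execute("""
    CREATE TABLE IF NOT EXISTS orders_by_customer (
        customer_id text,
        order_id    timeuuid,
        amount      decimal,
        PRIMARY KEY ((customer_id), order_id)
    ) WITH CLUSTERING ORDER BY (order_id DESC)
""")

session.execute(
    "INSERT INTO orders_by_customer (customer_id, order_id, amount) VALUES (%s, now(), %s)",
    ("CUST-1", Decimal("99.50")),
)
for row in session.execute(
    "SELECT order_id, amount FROM orders_by_customer WHERE customer_id = %s", ("CUST-1",)
):
    print(row.order_id, row.amount)

cluster.shutdown()
```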
Posted 1 month ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: SAP S/4HANA for Product Compliance. Good to have skills: NA. Minimum 12 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Lead, you will lead end-to-end SAP EHS Global Label Management (GLM) implementations within S/4HANA Product Compliance projects. You'll be responsible for project delivery, stakeholder engagement, and ensuring regulatory alignment. Roles & Responsibilities: Manage full-cycle implementation of SAP GLM. Define labelling strategies and oversee WWI template delivery. Coordinate cross-functional teams across product safety, compliance, and regulatory domains. Collaborate with clients and business users to gather requirements and translate them into effective EHS solutions. Configure and maintain the SAP EHS Product Safety module, including specifications, phrase management, and data architecture. Design and validate WWI report templates and guide ABAP developers with symbol logic, layout, and enhancements. Implement and support SAP GLM (Global Label Management) including label determination, print control, and output conditions. Professional & Technical Skills: Must Have Skills: Proficiency in SAP EH&S GLM with end-to-end implementation experience. Deep expertise in SAP GLM, label determination logic, and print control setup. Strong knowledge of S/4HANA Product Compliance architecture. Excellent communication and team management skills. 8+ years in SAP EHS with 2+ full-cycle GLM implementations. Additional Information: This position is based at our Hyderabad office. A 15 years full time education is required. Qualification: 15 years full time education.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills: Snowflake Data Warehouse. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate training sessions for junior team members to enhance their understanding of data modeling. Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness. Professional & Technical Skills: Must Have Skills: Proficiency in Snowflake Data Warehouse. Strong understanding of data modeling concepts and best practices. Experience with ETL processes and data integration techniques. Familiarity with data governance and data quality frameworks. Ability to communicate complex data concepts to non-technical stakeholders. Additional Information: The candidate should have minimum 5 years of experience in Snowflake Data Warehouse. This position is based in Pune. A 15 years full time education is required. Qualification: 15 years full time education.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Data Modeler. Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills: Snowflake Data Warehouse. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate training sessions for junior team members to enhance their understanding of data modeling. Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness. Professional & Technical Skills: Must Have Skills: Proficiency in Snowflake Data Warehouse. Strong understanding of data modeling concepts and best practices. Experience with ETL processes and data integration techniques. Familiarity with data governance and data quality frameworks. Ability to communicate complex data concepts to non-technical stakeholders. Additional Information: The candidate should have minimum 5 years of experience in Snowflake Data Warehouse. This position is based in Pune. A 15 years full time education is required. Qualification: 15 years full time education.
Posted 1 month ago
6.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Roles and Responsibilities: Design and implement data models, data flows, and data pipelines to support business intelligence and analytics. Develop and maintain large-scale data warehouses and data lakes using technologies such as Hadoop, Spark, and NoSQL databases. Collaborate with cross-functional teams to identify business requirements and develop solutions that meet those needs. Ensure data quality, integrity, and security by implementing data validation, testing, and monitoring processes. Stay up-to-date with industry trends and emerging technologies to continuously improve the organization's data architecture capabilities. Provide technical leadership and guidance on data architecture best practices to junior team members. Job Requirements: Strong understanding of data modeling, data warehousing, and ETL processes. Experience with big data technologies such as Hadoop, Spark, and NoSQL databases. Excellent problem-solving skills and ability to analyze complex business problems and develop creative solutions. Strong communication and collaboration skills to work effectively with stakeholders at all levels. Ability to design and implement scalable, secure, and efficient data architectures. Experience working in an agile environment with continuous integration and delivery.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging knowledge of market drivers and competition to effectively anticipate trends and opportunities. Besides, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, take the lead towards raising the performance bar, build capability, and bring out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes. Associate Program Manager - Roles and responsibilities: Understand clients' requirements and provide effective and efficient solutions in Snowflake. Understand data transformation and translation requirements and which tools to leverage to get the job done. Ability to do Proof of Concepts (POCs) in areas that need R&D on cloud technologies. Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling. Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions. Perform data quality testing and assurance as a part of designing, building and implementing scalable data solutions in SQL. Technical and Functional Skills: Master's/Bachelor's degree in Engineering, Analytics, or a related field. Total 7+ years of experience, with relevant ~4+ years of hands-on experience with Snowflake utilities: SnowSQL, SnowPipe, Time Travel, Replication, Zero-Copy Cloning. Strong working knowledge of Python. Understanding of data transformation and translation requirements and which tools to leverage to get the job done. Understanding of data pipelines and modern ways of automating data pipelines using cloud-based tooling; testing and clearly documenting implementations so others can easily understand the requirements, implementation, and test conditions. In-depth understanding of data warehouses and ETL tools. Perform data quality testing and assurance as a part of designing, building and implementing scalable data solutions in SQL. Experience with Snowflake APIs is mandatory. The candidate must have strong knowledge of scheduling and monitoring using Airflow DAGs. Strong experience in writing SQL queries, joins, stored procedures, and user-defined functions. Should have sound knowledge of data architecture and design. Should have hands-on experience in developing Python scripts for data manipulation. Snowflake SnowPro Core certification. Developing scripts using Unix, Python, etc.
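To illustrate the Airflow-scheduled Snowflake work this role describes, here is a hedged sketch of a small DAG with one task that runs a query through the Snowflake Python connector. The connection parameters, schedule, table names, and the query itself are illustrative placeholders.

```python
# Hedged sketch: an Airflow DAG whose single task refreshes a summary table in Snowflake.
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_daily_summary() -> None:
    """Run a simple aggregation in Snowflake (stand-in for a real transformation)."""
    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        conn.cursor().execute(
            "CREATE OR REPLACE TABLE daily_summary AS "
            "SELECT trade_date, SUM(amount) AS total FROM trades GROUP BY trade_date"
        )
    finally:
        conn.close()


with DAG(
    dag_id="snowflake_daily_summary",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="refresh_daily_summary", python_callable=refresh_daily_summary)
```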
Posted 1 month ago