6.0 - 8.0 years
8 - 10 Lacs
Gurugram
Work from Office
Global Data Steward Role
We are looking for a highly skilled and experienced Global Data Steward to join our team at AXALTA COATING SYSTEMS INDIA PRIVATE LIMITED. The ideal candidate will have 6-8 years of experience in data stewardship.

Roles and Responsibilities:
- Develop and implement effective data stewardship strategies to ensure data quality and integrity.
- Collaborate with cross-functional teams to identify and prioritize data requirements.
- Design and maintain scalable and secure data architectures to support business growth.
- Ensure compliance with regulatory requirements and industry standards.
- Provide expert guidance on data management best practices to stakeholders.
- Analyze and resolve complex data-related issues to improve operational efficiency.

Job Requirements:
- Strong understanding of data stewardship principles and practices.
- Experience with data governance frameworks and regulations.
- Proficiency in data modeling, warehousing, and analytics tools.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced environment with multiple priorities.
- Strong problem-solving skills with attention to detail.
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
ECMS Req # / Demand ID: 519826
Number of Openings: 1
Duration of project: 12 Months
Years of experience: Total 3-5 years; Relevant 4+
Detailed job description / skill set: Backend Developer (4+ years) with strong expertise in PySpark and SQL technologies to develop and maintain high-performance big data architecture. Should have hands-on experience in Hive, Impala, and Airflow, and project experience in agile methodologies.
Mandatory skills: Python, Big Data (Spark), and good communication skills
Vendor proposed rate (as per ECMS system): 8000 INR/day
Work location: Any Infosys DC
Hybrid/remote/WFO: Hybrid
BGV pre/post onboarding: Pre-onboarding - Final BGV
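Illustrative of the SQL-side work such a backend role involves, here is a minimal, hedged sketch of a grouped aggregation. The standard-library sqlite3 module stands in for the Hive/Impala SQL layer the posting names, and the table and column names are invented for the example:

```python
import sqlite3

# Sketch only: sqlite3 stands in for Hive/Impala, and the "events" table
# with (event_date, amount) columns is an invented example schema.
def daily_totals(rows):
    """Load raw (date, amount) events and aggregate per day via SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (event_date TEXT, amount REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
    cur = conn.execute(
        "SELECT event_date, SUM(amount) FROM events "
        "GROUP BY event_date ORDER BY event_date"
    )
    result = cur.fetchall()
    conn.close()
    return result

print(daily_totals([("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", 7.5)]))
# [('2024-01-01', 15.0), ('2024-01-02', 7.5)]
```

In a real PySpark pipeline the same shape of query would run through `spark.sql` over a Hive table rather than an in-memory database.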
Posted 2 months ago
8.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Job Description
We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, and Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization.

Key Responsibilities:
- Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, and EMR, and Databricks Notebooks, Jobs, and Workflows.
- Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability.
- Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data.
- Implement distributed data processing solutions using Apache Spark / PySpark for large-scale data transformation.
- Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured.
- Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem.
- Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure.
- Conduct code reviews, define coding standards, and promote engineering excellence across the team.
- Mentor and guide junior data engineers, fostering a culture of technical growth and innovation.

Qualifications and Requirements:
- 8+ years of experience in data engineering with proven leadership in managing data projects and teams.
- Expertise in Python
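The handling of semi-structured data mentioned in the ETL/ELT responsibilities can be illustrated with a small, hedged sketch that flattens nested JSON records into flat column names before loading; the field names are invented for the example:

```python
import json

# Hedged sketch: flattening semi-structured JSON into flat rows, a common
# pre-load step in ETL/ELT pipelines. Field names here are illustrative.
def flatten(record, parent_key="", sep="."):
    """Recursively flatten nested dicts into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

raw = json.loads('{"id": 1, "user": {"name": "a", "geo": {"city": "Hyderabad"}}}')
print(flatten(raw))
# {'id': 1, 'user.name': 'a', 'user.geo.city': 'Hyderabad'}
```

Frameworks like Spark offer built-in equivalents (e.g. schema inference over JSON), but the flattening step itself is the same idea.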
Posted 2 months ago
4.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Not Applicable Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Summary
We are in search of a Senior Python Developer to build customer-centric applications that prioritize functionality and efficiency. The role will involve active participation in all stages of the software development lifecycle.

Responsibilities:
- Developing high-quality, scalable software solutions
- Designing individual modules in a project
- Ensuring code quality, security, and performance through rigorous testing and review
- Optimizing code to enhance efficiency and maintainability
- Staying up to date on industry trends, emerging technologies, and best practices
- Taking ownership of projects from conception to delivery, ensuring successful outcomes

Required skills and experience:
- Python Programming: Strong proficiency in Python programming and a proven track record of successful projects.
- Java Programming: Understanding of Java programming.
- Databases: Expertise in relational SQL databases, ability to design and model for project modules, developing and optimizing stored procedures.
- Web Development: Experience with Python and web frameworks like Django and Flask.
- API Development: Experience with designing and implementing RESTful APIs.
- Cloud Platforms: Familiarity with cloud platforms and containerization technologies (e.g., AWS, Docker, and Kubernetes).
- DevOps: Experience with CI/CD pipelines and tools.
- Software Engineering: Understanding of SDLC principles and methodologies.
- Problem-solving: Ability to analyze problems and find solutions.
- Communication: Strong communication and collaboration skills to work with cross-functional teams.

Mandatory skill sets: Python, SQL, Python Programming, Web Development
Preferred skill sets: Python, SQL, Python Programming, Web Development
Years of experience required: 4 to 8 years
Education qualification: Graduate Engineer or Management Graduate
Degrees/Field of Study required: Bachelor of Engineering, Bachelor in Business Administration
Required Skills: Python (Programming Language), Structured Query Language (SQL)
Additional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, and more
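The RESTful API design this role asks for can be illustrated with a hedged, framework-free sketch of route registration and dispatch; in practice this would be Django or Flask as the posting mentions, and the routes and payloads below are invented:

```python
# Minimal sketch of RESTful routing without a framework. A real service
# would use Flask/Django; these routes and handlers are invented examples.
ROUTES = {}

def route(method, path):
    """Decorator that registers a handler for a (method, path) pair."""
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/items")
def list_items():
    return 200, ["pen", "book"]

@route("POST", "/items")
def create_item():
    return 201, {"created": True}

def dispatch(method, path):
    """Look up and invoke the handler, returning (status, body)."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, {"error": "not found"}
    return handler()

print(dispatch("GET", "/items"))  # (200, ['pen', 'book'])
```

The decorator-based registration mirrors how Flask's `@app.route` associates URL rules with view functions.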
Posted 2 months ago
1.0 - 4.0 years
9 - 13 Lacs
Bengaluru
Work from Office
We are looking for a dynamic, self-starter Senior BIE for the IES Shopping Analytics and Science Team (AST) to guide the Amazon Bazaar program in India with analytics and data-driven insights. You will be working in one of the world's largest and most complex data warehouse environments. You must have a track record of churning out insights and making actionable recommendations to audiences of varying technical aptitude that directly impact organizational strategic decisions and priorities. Being able to thrive in an ambiguous, fast-moving environment and prioritize work is essential, as is a mind for innovation and learning through rapidly evolving and new technologies. This role provides an opportunity to develop original ideas, approaches, and solutions in a competitive and ever-changing business climate.

Responsibilities:
- Conduct deep-dive analyses of business problem statements and formulate conclusions and recommendations to leadership
- Share written recommendations and insights with key stakeholders that will help shape organizational strategic decisions and priorities
- Contribute to the design, implementation, and delivery of BI solutions for complex and ambiguous problems
- Simplify and automate reporting, audits, and other data-driven activities
- Partner with other BIEs to enhance data infrastructure, data availability, and broad access to customer insights
- Develop and drive best practices in data integrity, consistency, analysis, validations, and documentation
- Learn new technology and techniques to meaningfully support internal stakeholders and process innovation

About the team
The IES Shopping Analytics and Science Team (AST) has a vision to embed a data culture deeply in our IES Shopping Experience organization, fostering invention through insights, and building a robust data architecture to support business needs. We spin the insights flywheel by growing a pool of bar-raisers and diverse data professionals, which empowers us to continuously enhance our data capabilities, holistically covering the disciplines of Data Engineering, Business Intelligence, Analytics, and Machine Learning.

Qualifications:
- 10+ years of professional or military experience
- 8+ years of SQL experience
- Experience programming to extract, transform and clean large (multi-TB) data sets
- Experience with the theory and practice of design of experiments and statistical analysis of results
- Experience in scripting for automation (e.g. Python) and advanced SQL skills
- Experience with the theory and practice of information retrieval, data science, machine learning and data mining
- Experience working directly with business stakeholders to translate between data and business needs
- Experience managing, analyzing and communicating results to senior leadership
- Experience working as a BIE in a technology company
- Experience with AWS technologies
- Experience using cloud storage and computing technologies such as AWS Redshift, S3, Hadoop, etc.
Posted 2 months ago
6.0 - 9.0 years
8 - 11 Lacs
Chennai
Work from Office
Job Title: Data Modeller - GCP
Experience: 6-9 Years
Work Type: On-site
Work Location: Chennai (Work from Client Office - Mandatory)

Job Description
We are seeking a skilled Data Modeller with strong experience in data modelling for OLTP and OLAP systems, particularly within Google Cloud Platform (GCP). The ideal candidate will be hands-on with designing efficient, scalable data architectures and have a solid grasp of performance tuning and cloud-based databases.

Key Responsibilities:
- Design and implement conceptual, logical, and physical data models for OLTP and OLAP systems
- Apply best practices in data indexing, partitioning, and sharding for optimized performance
- Use data modelling tools (preferably DBSchema) to support and document database design
- Ensure data architecture supports near real-time reporting and application performance
- Collaborate with cross-functional teams to translate business requirements into data structures
- Work with GCP database technologies like AlloyDB, CloudSQL, and BigQuery
- Validate and improve database performance metrics through continuous optimization

Must-Have Skills:
- GCP: AlloyDB, CloudSQL, BigQuery
- Strong hands-on experience with data modelling tools (DBSchema preferred)
- Expertise in OLTP and OLAP data models, indexing, partitioning, and data sharding
- Deep understanding of database performance tuning and system architecture

Good to Have:
- Functional knowledge of the mutual fund industry
- Exposure to data governance and security best practices in the cloud
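Data sharding, one of the must-have skills above, can be sketched as a stable hash-based shard assignment; the shard count and key format below are assumptions made for illustration only:

```python
import hashlib

# Illustrative sketch of hash-based sharding: the same key always lands on
# the same shard, so lookups and writes agree on placement. The shard count
# and customer-key format are invented for the example.
def shard_for(key, num_shards=4):
    """Return a stable shard index in [0, num_shards) for a given key."""
    digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

assignments = {k: shard_for(k) for k in ("cust-001", "cust-002", "cust-003")}
print(assignments)
```

Range- or list-based partitioning (as BigQuery and CloudSQL offer natively) trades this uniform spread for locality on the partition key; which to use depends on the query patterns.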
Posted 2 months ago
4.0 - 9.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 4+ years of experience in data modeling and data architecture
- Proficiency in data modeling tools (Erwin, IBM InfoSphere Data Architect) and database management systems
- Familiarity with different data models like relational, dimensional and NoSQL databases
- Understanding of business processes and how data supports business decision making
- Strong understanding of database design principles, data warehousing concepts, and data governance practices

Preferred technical and professional experience:
- Excellent analytical and problem-solving skills with a keen attention to detail
- Ability to work collaboratively in a team environment and manage multiple projects simultaneously
- Knowledge of programming languages such as SQL
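The cleanse-and-integrate responsibility described above can be sketched as a small reusable step: trim and normalise strings, drop rows missing required fields, and deduplicate. The field names are invented for the example:

```python
# Hedged sketch of a reusable data-cleansing step. The "id"/"email" fields
# are invented; a real pipeline would parameterise rules per data source.
def cleanse(rows, required=("id", "email")):
    """Normalise strings, drop incomplete records, and deduplicate."""
    seen, out = set(), []
    for row in rows:
        cleaned = {k: v.strip().lower() if isinstance(v, str) else v
                   for k, v in row.items()}
        if any(not cleaned.get(field) for field in required):
            continue  # skip records missing a required field
        key = tuple(cleaned[field] for field in required)
        if key in seen:
            continue  # skip duplicates after normalisation
        seen.add(key)
        out.append(cleaned)
    return out

rows = [{"id": 1, "email": " A@X.COM "},
        {"id": 1, "email": "a@x.com"},   # duplicate once normalised
        {"id": 2, "email": ""}]          # incomplete record
print(cleanse(rows))  # [{'id': 1, 'email': 'a@x.com'}]
```

Packaging such steps as pure functions is what makes them reusable across batch and streaming paths.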
Posted 2 months ago
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Job Description
We are seeking a highly skilled and experienced Data Architect to design, implement, and manage the data infrastructure. As a Data Architect, you will play a key role in shaping the data strategy, ensuring data is accessible, reliable, and secure across the organization. You will work closely with business stakeholders, data engineers, and analysts to develop scalable data solutions that support business intelligence, analytics, and operational needs.

Key Responsibilities:
- Design and implement effective database solutions (on-prem / cloud) and data models to store and retrieve data for various applications within the FinCrime domain
- Develop and maintain robust data architecture strategies aligned with business objectives
- Define data standards, frameworks, and governance practices to ensure data quality and integrity
- Collaborate with data engineers, software developers, and business stakeholders to integrate data systems and optimize data pipelines
- Evaluate and recommend tools and technologies for data management, warehousing, and processing
- Create and maintain documentation related to data models, architecture diagrams, and processes
- Ensure data security and compliance with relevant regulations (e.g., GDPR, HIPAA, CCPA)
- Participate in capacity planning and growth forecasting for the organization's data infrastructure
- Through various POCs, assess and compare multiple tooling options and deliver use-cases based on an MVP model as per expectations

Requirements
Experience:
- 10+ years of experience in data architecture, data engineering, or related roles
- Proven experience with relational and NoSQL databases
- Experience with FinCrime domain applications and reporting
- Strong experience with ETL tools, data warehousing, and data lake solutions
- Familiarity with other data technologies such as Spark, Kafka, and Snowflake

Skills:
- Strong analytical and problem-solving skills
- Proficiency in data modelling tools (e.g., ER/Studio, Erwin)
- Excellent understanding of database management systems and data security
- Knowledge of data governance, metadata management, and data lineage
- Strong communication and interpersonal skills to collaborate across teams
- Subject matter expertise within FinCrime

Preferred Qualifications
Posted 2 months ago
2.0 - 6.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About The Job
Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients' daily lives, wherever they live, and enabling them to enjoy a healthier life. As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI) with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health.

The R&D Data Office serves as a cornerstone to this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We sit in partnership with Business and Digital, and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain. As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in the development of sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The R&D Data Modeler develops conceptual and logical data models for initiatives, programs, and cross-R&D capabilities. This role is critical for the creation of data models and ontologies, and the upkeep of models even beyond the conclusion of a project. Data modelers will apply and assist in the definition and governance of data modelling and design standards, tools, best practices, and related development of any R&D data capability.

Main Responsibilities:
- Engage in data management and analytics projects; understand, advise, and execute on data flow optimization (e.g., data capture, integration and use across R&D)
- Understand the data-related needs for various cross-R&D capabilities (e.g., data catalog, master data management, etc.) and associated initiatives
- Design conceptual and logical data models and data dictionaries/ontologies to cater to R&D business needs and functional requirements; lead validation of physical data models
- Interact with business, R&D Digital, and other data collaborators to translate needs into data solutions
- Understand market trends for data modelling tools and metadata management capabilities; provide input into the selection of tools and any necessary migration into the company's environment
- Understand data governance policies, standards and procedures for R&D data
- Serve as point of contact for data integration topics within solutions (e.g., technology), from source systems to data consumers; define process, tools, and testing requirements
- Maintain modelling and naming standards, and data harmonization, metadata management, and source-to-target data mapping documentation
- Evaluate and influence projects while serving as "voice of business"; map systems/interfaces for data management, set standards for the future state, and close gaps from the current to the future state
- Serve as technical and data consultant for R&D initiatives with major platform/technology implementations
- Maintain documentation and act as an expert on data definitions, data standards, data flows, legacy data structures/hierarchies, common data models, data harmonization, etc. for R&D functions
- Educate and guide R&D teams on standards and information management principles, methodologies, and best practices

Deliverables:
- Conduct requirements gathering from business analysts, data scientists, and other stakeholders
- Formulate strategies and query optimizations to enhance data retrieval speed
- Develop complex and scalable data models aligned to the organization's long-term strategic goals
- Formulate data governance frameworks, policies, and standards
- Establish best practices for data modeling, ensuring interoperability among systems, applications and data sources

About you
Experience: 5+ years of experience in business data management, information architecture, technology or another related field

Functional skills:
- Demonstrated ability to understand end-to-end data use and business needs
- Knowledge of R&D data and data domains (e.g., across research, clinical, regulatory, etc.)
- Experience with creating and applying data modelling best practices and naming conventions
- Strong analytical problem-solving skills
- Demonstrated strong attention to detail, quality, time management and customer focus
- Excellent written and oral communication skills
- Strong networking, influencing and negotiating skills and superior problem-solving skills
- Demonstrated willingness to make decisions and to take responsibility for them
- Excellent interpersonal skills (team player)

Technical skills:
- Experience with data management practices and technologies (e.g., Collibra, Informatica, etc.)
- Familiarity with databases (relational, dimensional, NoSQL, etc.) and concepts of data integrity
- Strong knowledge of data architecture (e.g., data lake, data virtualization, hubs, etc.) and modelling (e.g., 3NF, etc.) is required
- Experience with big data infrastructures (e.g., Hadoop, NoSQL, etc.)
- Experience with SDLC and pharma R&D platforms; experience with requirements gathering, system design, and validation/quality/compliance requirements
- Experience managing technology and/or data warehouse projects
- Familiarity with relational databases and entity-relationship data modelling
- Experience with hierarchical data models from conceptualization to database optimization

Education: Bachelor's in Computer Science, Engineering, Mathematics, Statistics, or a related field; Master's preferred
Languages: English

Pursue Progress. Discover Extraordinary.
Progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. You can be one of those people, chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi!
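A conceptual or logical model of the kind this role maintains can also be kept as a machine-readable data dictionary. A minimal sketch, with invented entities and attributes (not an actual Sanofi model):

```python
from dataclasses import dataclass, field

# Hedged sketch: a logical data model expressed as a tiny data dictionary.
# The "ClinicalStudy" entity and its attributes are invented examples.
@dataclass
class Attribute:
    name: str
    dtype: str
    nullable: bool = True

@dataclass
class Entity:
    name: str
    attributes: list = field(default_factory=list)

    def attribute_names(self):
        return [a.name for a in self.attributes]

study = Entity("ClinicalStudy", [
    Attribute("study_id", "string", nullable=False),
    Attribute("phase", "string"),
    Attribute("start_date", "date"),
])
print(study.attribute_names())  # ['study_id', 'phase', 'start_date']
```

Keeping the model as data rather than diagrams is what allows validation, lineage mapping, and source-to-target documentation to be generated rather than hand-maintained.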
Posted 2 months ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Job title: R&D Data Modeling Manager Associate
Location: Hyderabad

Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions, in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients' daily lives, wherever they live, and enabling them to enjoy a healthier life. As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI) with a strong commitment to developing advanced data standards to increase reusability and interoperability and thus accelerate impact on global health.

The R&D Data Office serves as a cornerstone of this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We partner with Business and Digital and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain. As an integral team member, you will be responsible for defining how R&D's structured, semi-structured and unstructured data will be stored, consumed, integrated/shared and reported by different end users such as scientists, clinicians, and more. You will also be pivotal in developing sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The primary responsibility of this position is to support semantic integration and data harmonization across pharmaceutical R&D functions. In this role, you will design and implement ontologies and controlled vocabularies that enable interoperability of scientific, clinical, and operational data. Your work will be critical in accelerating discovery, improving data reuse, and enhancing insights across the drug development lifecycle.

Main Responsibilities:
- Develop, maintain, and govern ontologies and semantic models for key pharmaceutical domains, including preclinical, clinical, regulatory, and translational research
- Design and implement controlled vocabularies and taxonomies to standardize terminology across experimental data, clinical trials, biomarkers, compounds, and regulatory documentation
- Collaborate with cross-functional teams including chemists, biologists, pharmacologists, data scientists, and IT architects to align semantic models with scientific workflows and data standards
- Map internal data sources to public ontologies and standards to ensure FAIR (Findable, Accessible, Interoperable, Reusable) data principles
- Leverage semantic web technologies and ontology tools to build knowledge representation frameworks
- Participate in ontology alignment, reasoning, and validation processes to ensure quality and logical consistency
- Document semantic assets, relationships, and governance policies to support internal education and external compliance

Deliverables:
- Domain-specific ontologies representing concepts such as drug discovery (e.g., compounds, targets, assays), preclinical and clinical studies, biomarkers, adverse events, pharmacokinetics/dynamics, mechanisms of action, and disease models, built using OWL/RDF and aligned with public standards
- Controlled vocabularies and taxonomies for experimental conditions, cell lines, compound classes, endpoints, clinical trial protocols, etc.
- Semantic data models supporting the integration of heterogeneous data sources (e.g., lab systems, clinical trial data, external databases)
- Knowledge graphs or knowledge maps for semantic integration of structured data from internal R&D systems
- Mappings to public ontologies, standards, and external knowledge bases such as CDISC, MedDRA, LOINC, UMLS, SNOMED CT, RxNorm, UniProt, DrugBank, PubChem, and NCBI
- Ontology documentation and governance artifacts, including ontology scope, design rationale, versioning documentation, and usage guidelines for internal stakeholders
- Validation reports and consistency checks, including outputs from reasoners or SHACL validation to ensure logical coherence, and change impact assessments when modifying existing ontologies
- Training and stakeholder support materials: slide decks, workshops, and tutorials on using ontologies in data annotation, integration, and search; support for application developers embedding semantic layers

About You
Experience: 5+ years of experience in ontology engineering, data management, data analysis, data architecture, or another related field
- Proven experience in ontology development within the biomedical or pharmaceutical domain
- Experience working with biomedical ontologies and standards (e.g., GO, BAO, EFO, ChEBI, NCBI Taxonomy, NCI Thesaurus, etc.)
- Familiarity with controlled vocabulary curation and knowledge graph construction
- Demonstrated ability to understand end-to-end data use and business needs
- Knowledge and/or experience of Pharma R&D or life sciences data and data domains
- Understanding of FAIR data principles, data governance, and metadata management
- Strong analytical problem-solving skills
- Demonstrated strong attention to detail, quality, time management and customer focus
- Excellent written and oral communication skills
- Strong networking, influencing, and negotiating skills and superior problem-solving skills
- Demonstrated willingness to make decisions and to take responsibility for them
- Excellent interpersonal skills (team player)
- Knowledge and experience in ontology engineering and maintenance are required
- Knowledge and experience with OWL, RDF, SKOS, and SPARQL
- Familiarity with ontology engineering tools (e.g., Protégé, CENtree, TopBraid Composer, PoolParty)
- Familiarity with ontology engineering methodologies (e.g., NeOn, METHONTOLOGY, Uschold and King, Grüninger and Fox, etc.)
- Knowledge and experience in data modeling are highly desired
- Experience with pharma R&D platforms, requirements gathering, system design, and validation/quality/compliance requirements
- Experience with hierarchical data models from conceptualization to implementation

Education: Bachelor's in Computer Science, Information Science, Knowledge Engineering, or a related field; Master's or higher preferred
Languages: English
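The OWL/RDF and SPARQL work described above can be illustrated with a pure-Python sketch of triple storage and pattern matching. Real work would use rdflib, Protégé, or a triple store; the terms below are invented examples, not a real ontology:

```python
# Hedged sketch of the RDF triple model: facts as (subject, predicate,
# object), queried by pattern matching as SPARQL does. Terms are invented.
TRIPLES = {
    ("aspirin", "rdf:type", "Compound"),
    ("aspirin", "inhibits", "COX-1"),
    ("COX-1", "rdf:type", "Target"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard,
    like an unbound variable in a SPARQL basic graph pattern."""
    return sorted(t for t in TRIPLES
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

print(match(p="rdf:type"))
# [('COX-1', 'rdf:type', 'Target'), ('aspirin', 'rdf:type', 'Compound')]
```

A SPARQL query such as `SELECT ?s WHERE { ?s rdf:type :Compound }` is the same wildcard match with variable binding added on top.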
Posted 2 months ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking an experienced ETL Data Engineer with expertise in Informatica Intelligent Cloud Services (IICS) and Informatica PowerCenter to support our ongoing and upcoming projects. The ideal candidate will be responsible for designing, developing, and maintaining data integration processes using both IICS and PowerCenter. Proficiency in Oracle is essential, including hands-on experience in building, optimizing, and managing data solutions on the platform. The candidate should have the ability to handle tasks independently, demonstrating strong problem-solving skills and initiative in managing data integration projects. This role involves close collaboration with business stakeholders, data architects, and cross-functional teams to deliver effective data solutions that align with business objectives.

Who you are:
Education: Bachelor's in Computer Science / IT or similar
Mandatory skills: ETL Data Engineer, IICS, Informatica PowerCenter
Nice to have: Unix
Posted 2 months ago
8.0 - 13.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Sr Data Engineer in the Digital & Data team you will work hands-on to deliver and maintain the pipelines required by the business functions to derive value from their data For this, you will bring data from a varied landscape of source systems into our cloud-based analytics stack and implement necessary cleaning and pre-processing steps in close collaboration with our business customers Furthermore, you will work closely together with our teams to ensure that all data assets are governed according to the FAIR principles To keep the engineering team scalable, you and your peers will create reusable components, libraries, and infrastructure that will be used to accelerate the pace with which future use-cases can be delivered You will be part of a team dedicated to delivering state-of-the-art solutions for enabling data analytics use cases across the Healthcare sector of a leading, global Science & Technology company As such, you will have the unique opportunity to gain insight into our diverse business functions allowing you to expand your skills in various technical, scientific, and business domains Working in a project-based way covering a multitude of data domains and technological stacks, you will be able to significantly develop your skills and experience as a Data Engineer Who you are BE/M.Sc./PhD in Computer Science or related field and 8+ years of work experience in a relevant capacity Experience in working with cloud environments such as, Hadoop, AWS, GCP, and Azure. Experience with enforcing security controls and best practices to protect sensitive data within AWS data pipelines, including encryption, access controls, and auditing mechanisms. Agile mindset, a spirit of initiative, and desire to work hands-on together with your team Interest in solving challenging technical problems and developing the future data architecture that will enable the implementation of innovative data analytics use-cases Experience in leading small to medium-sized team. 
Experience in creating architectures for ETL processes for batch as well as streaming ingestion. Knowledge of designing and validating software stacks for GxP-relevant contexts as well as working with PII data. Familiarity with the data domains covering the Pharma value chain (e.g. research, clinical, regulatory, manufacturing, supply chain, and commercial). Strong, hands-on experience in working with Python, PySpark & R codebases; proficiency in additional programming languages (e.g. C/C++, Rust, TypeScript, Java) is expected. Experience working with Apache Spark and the Hadoop ecosystem. Experience working with heterogeneous compute environments and multi-platform setups. Basic knowledge of Statistics and Machine Learning algorithms is favorable. This is the respective role description: The ability to easily find, access, and analyze data across an organization is key for every modern business to be able to efficiently make decisions, optimize processes, and create new business models. The Data Architect plays a key role in unlocking this potential by defining and implementing a harmonized data architecture for Healthcare.
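The cleaning and pre-processing work this role describes can be sketched in plain Python. This is a minimal illustration only, not tied to any actual system at the hiring company; the record shape and field names are invented for the example:

```python
def clean_records(records):
    """Drop incomplete rows and normalize fields before loading.

    `records` is a list of dicts from a hypothetical source extract;
    the field names are illustrative only.
    """
    cleaned = []
    for rec in records:
        # Skip rows missing the mandatory identifier
        if not rec.get("sample_id"):
            continue
        cleaned.append({
            "sample_id": rec["sample_id"].strip().upper(),
            # Coerce free-text numeric fields, defaulting to None when absent
            "result_value": float(rec["result"]) if rec.get("result") else None,
        })
    return cleaned

raw = [
    {"sample_id": " ab-001 ", "result": "3.14"},
    {"sample_id": "", "result": "9.99"},   # dropped: no identifier
    {"sample_id": "AB-002"},               # kept: missing result becomes None
]
print(clean_records(raw))
```

In a real pipeline the same logic would typically run as a PySpark transformation rather than a Python loop.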
Posted 2 months ago
15.0 - 20.0 years
25 - 32 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Role & responsibilities 15+ years' experience in handling projects ab initio. He/She must have strong technical experience with Microsoft technologies: .NET, MS SQL Server, TFS, Windows Server, BizTalk, etc. The candidate should have strength in technology, domain, and application development, and possess leadership qualities to lead a team of minimum 40-50 professionals. Responsibility Areas: Provide a leadership role in the areas of advanced data techniques, including data quality, data governance, data modeling, data access, data integration, data visualization, data discovery, database design and implementation. Lead the overall strategy and roadmap for data architecture. Partner with the project organization, solution architecture, and engineering to ensure best use of standards for the key data use cases / patterns and tech standards. Analyze the Information Technology landscape to identify gaps and recommend improvements. Create and maintain the Enterprise Data Model at the Conceptual, Logical and Physical levels. Act as steward of Enterprise Metadata Architecture & Standards and Data Lifecycle Management, including data quality, data conversion, and data security technologies. Define and achieve the strategy roadmap for the enterprise data, including data modeling, implementation and data management for our enterprise data warehouse and advanced data analytics systems. Develop and document enterprise data standards and provide technical oversight on projects to ensure compliance through the adoption and promotion of industry standards / best-practice guiding principles aligned with Gartner, TOGAF, Forrester and the like. Create architectural technology and business roadmaps that result in stronger business/IT alignment and drive adoption and usage of technology across the enterprise. Align the portfolio of projects to the roadmaps and reference architecture. Define and enforce architecture principles, standards, metrics and policies.
Provide leadership in architecture, design and build of complex applications and perform architectural design reviews. Manage the development of transition plans for moving from the current to the future state environment across the application portfolio. Collaborate with both IT and business to influence decisions in technology investments. Evaluate data models and physical databases for variances and discrepancies. Validate business data objects for accuracy and completeness. Analyze data-related system integration challenges and propose appropriate solutions. Support System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces. Support modifications to existing software to improve efficiency and performance. Professional Qualification: B.Tech/BE/MCA/M.Tech/ME/PhD in Computer Science/Information Technology (IT) and related fields, or equivalent, with a consistently good academic record. Preferred Professional Qualification/Certification: PMP or equivalent, CGEIT, ITIL (Foundation), PM tools, Microsoft certifications (MS SQL, BizTalk, .NET). Interested candidates share your resume at parul@mounttalent.com / parul.s@mounttalent.com.
Posted 2 months ago
7.0 - 12.0 years
15 - 25 Lacs
Chennai
Work from Office
Years of Experience 7+ Years Purpose •The candidate is responsible for designing, creating, deploying, and maintaining an organization's data architecture. •To ensure that the organization's data assets are managed effectively and efficiently, and that they are used to support the organization's goals and objectives. •Responsible for ensuring that the organization's data is secure, and that appropriate data governance policies and procedures are in place to protect the organization's data assets. Key Responsibilities Responsibilities will include but will not be restricted to: •Responsible for designing and implementing a data architecture that supports the organization's business goals and objectives. •Developing data models, defining data standards and guidelines, and establishing processes for data integration, migration, and management. •Create and maintain data dictionaries, which are a comprehensive set of data definitions and metadata that provide context and understanding of the organization's data assets. •Ensure that the data is accurate, consistent, and reliable across the organization. This includes establishing data quality metrics and monitoring data quality on an ongoing basis. •Ensure the organization's data is secure, and that appropriate data governance policies and procedures are in place to protect the organization's data assets. •Work closely with other IT professionals, including database administrators, data analysts, and developers, to ensure that the organization's data architecture is integrated and aligned with other IT systems and applications. •Stay up to date with new technologies and trends in data management and architecture and evaluate their potential impact on the organization's data architecture. •Communicate with stakeholders across the organization to understand their data needs and ensure that the organization's data architecture is aligned with the organization's strategic goals and objectives.
Technical requirements •Bachelor's or Master's degree in Computer Science or a related field. •Certifications in Database Management are preferred. •Expertise in data modeling and design, including conceptual, logical, and physical data models, and must be able to translate business requirements into data models. •Proficient in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes. •Expertise in ETL processes, including data extraction, transformation, and loading, and must be able to design and implement data integration processes. •Experience with data analysis and reporting tools and techniques, and must be able to design and implement data analysis and reporting processes. •Familiar with industry-standard data architecture frameworks, such as TOGAF and Zachman, and must be able to apply them to the organization's data architecture. •Familiar with cloud computing technologies, including public and private clouds, and must be able to design and implement data architectures that leverage cloud computing. Qualitative Requirements •Able to effectively communicate complex technical concepts to both technical and non-technical stakeholders. •Strong analytical and problem-solving skills. •Must be able to inspire and motivate their team to achieve organizational goals. The following skills are good to have but not necessary: Databricks, Snowflake, Redshift, Data Mesh, Medallion, Lambda
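Data quality monitoring of the kind this role calls for usually starts with simple metrics such as completeness. A minimal sketch in plain Python (the table and column names are hypothetical):

```python
def completeness(rows, column):
    """Share of rows with a non-null, non-empty value in `column`."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows)

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},   # present but empty: counted as missing
    {"id": 3},                # key absent entirely
]
print(completeness(customers, "email"))  # 1 of 3 rows populated
```

In practice such metrics would be computed per table in the warehouse on a schedule and tracked against agreed thresholds.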
Posted 2 months ago
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
About The Role This is an Internal document. Job Title: Senior Data Engineer As a Senior Data Engineer, you will play a key role in designing and implementing data solutions at Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality and scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives. Responsibilities 1. Data Architecture and Design: a. Design and develop scalable, high-performance data architecture and data models. b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions. c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects. d. Define and enforce data engineering best practices, standards, and guidelines. 2. Data Pipeline Development & Maintenance: a. Develop and maintain robust and scalable data pipelines for data ingestion, transformation, and loading for real-time and batch use-cases. b. Implement ETL processes to integrate data from various sources into data storage systems. c. Optimise data pipelines for performance, scalability, and reliability. i. Identify and resolve performance bottlenecks in data pipelines and analytical systems. ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions. iii. Optimise database performance, including query tuning, indexing, and partitioning strategies. d. Implement real-time and batch data processing solutions. 3. Data Quality and Governance: a. Implement data quality frameworks and processes to ensure high data integrity and consistency. b. Design and enforce data management policies and standards. c. Develop and maintain documentation, data dictionaries, and metadata repositories.
d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies. 4. ML Models Deployment & Management (a plus): a. Responsible for designing, developing, and maintaining the infrastructure and processes necessary for deploying and managing machine learning models in production environments. b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes. c. Optimise model performance and latency for real-time inference in consumer applications. d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment. e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues. f. Implement monitoring and logging solutions to track model performance, data drift, and system health. 5. Team Leadership and Mentorship: a. Lead data engineering projects, providing technical guidance and expertise to team members. i. Conduct code reviews and ensure adherence to coding standards and best practices. b. Mentor and coach junior data engineers, fostering their professional growth and development. c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes. d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team. i. Participate in the evaluation and selection of data engineering tools and technologies. Qualifications: 1. 3-5 years' experience with a Bachelor's Degree in Computer Science, Engineering, Technology or a related field required. 2. Good understanding of streaming technologies like Kafka, Spark Streaming. 3.
Experience with Enterprise Business Intelligence Platform/Data platform sizing, tuning, optimization and system landscape integration in large-scale, enterprise deployments. 4. Proficiency in one programming language, preferably Java, Scala or Python. 5. Good knowledge of Agile, SDLC/CI-CD practices and tools. 6. Must have proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming. Must have in-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs. 7. Proven experience in development of conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse) and OLAP database solutions. 8. Good understanding of distributed systems. 9. Experience working extensively in a multi-petabyte DW environment. 10. Experience in engineering large-scale systems in a product environment.
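One recurring task in the streaming ingestion work described above is de-duplicating events by key, since Kafka-style delivery is often at-least-once. A toy Python version of the idea (real pipelines would do this in Spark Streaming with watermarks; the event shape here is assumed):

```python
def dedupe_events(events, key="event_id"):
    """Keep only the first occurrence of each event key, preserving order."""
    seen = set()
    unique = []
    for ev in events:
        k = ev[key]
        if k in seen:
            continue  # duplicate delivery (at-least-once semantics)
        seen.add(k)
        unique.append(ev)
    return unique

stream = [{"event_id": "e1"}, {"event_id": "e2"}, {"event_id": "e1"}]
print(dedupe_events(stream))
```

A production version would bound the `seen` state with a time window or watermark so memory does not grow without limit.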
Posted 2 months ago
12.0 - 15.0 years
45 - 55 Lacs
Bengaluru
Work from Office
Join us as a Solution Designer Take on a varied role, where you'll own the end-to-end high-level business design for a project, programme or initiative. You'll be working with a range of stakeholders to identify investment priorities, define opportunities and shape journeys to meet our strategic goals. This is a chance to shape the future of our business and gain great exposure across the bank in the process. We're offering this role at vice president level. What you'll do As a Solution Designer, you'll engage with relevant stakeholders as a single point of contact for design aspects. You'll be representing the design function at governance forums and working with enterprise architects to make sure standards and principles are adhered to. You'll also analyse requirements into coherent end-to-end designs, taking the business architecture into account. Other duties include: Translating requirements into a series of transition-state designs and an executable roadmap. Partnering with technology and data to develop a data product roadmap to support customer and reference data outcomes. Documenting the relevant design in accordance with standard methods. Designing systems and processes supporting data quality issue management across customer and reference data, optimising for data quality remediation where possible. The skills you'll need You'll already have a background in solution design and a minimum of ten years' experience of using industry-standard models and tools. Alongside good communication skills, you'll also need the ability to lead and collaborate with both internal and external teams. We'll also want to see: Knowledge of cloud data practices and data architecture. A broad understanding of data lakehouse solutions like SageMaker in implementing effective data management practices. Creative skills to design solutions to support the bank-wide simplification programme for customer and reference data. Hours 45 Job Posting Closing Date: 01/07/2025
Posted 2 months ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
We are seeking a seasoned Data Engineering Manager with 8+ years of experience to lead and grow our data engineering capabilities. This role demands strong hands-on expertise in Python, SQL, Spark, and advanced proficiency in AWS and Databricks. As a technical leader, you will be responsible for architecting and optimizing scalable data solutions that enable analytics, data science, and business intelligence across the organization. Key Responsibilities: Lead the design, development, and optimization of scalable and secure data pipelines using AWS services such as Glue, S3, Lambda, EMR, and Databricks Notebooks, Jobs, and Workflows. Oversee the development and maintenance of data lakes on AWS Databricks, ensuring performance and scalability. Build and manage robust ETL/ELT workflows using Python and SQL, handling both structured and semi-structured data. Implement distributed data processing solutions using Apache Spark / PySpark for large-scale data transformation. Collaborate with cross-functional teams including data scientists, analysts, and product managers to ensure data is accurate, accessible, and well-structured. Enforce best practices for data quality, governance, security, and compliance across the entire data ecosystem. Monitor system performance, troubleshoot issues, and drive continuous improvements in data infrastructure. Conduct code reviews, define coding standards, and promote engineering excellence across the team. Mentor and guide junior data engineers, fostering a culture of technical growth and innovation. Requirements 8+ years of experience in data engineering with proven leadership in managing data projects and teams. Expertise in Python, SQL, Spark (PySpark),
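Handling semi-structured data, as the ETL/ELT duties above describe, often means flattening nested JSON before loading it into a warehouse table. A small illustration in plain Python (the record shape is invented; production pipelines would typically do this in Glue or Databricks):

```python
def flatten(record, parent="", sep="."):
    """Flatten nested dicts into a single-level dict with dotted keys."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name, sep))  # recurse into nested objects
        else:
            flat[name] = value
    return flat

event = {"order_id": 7, "customer": {"id": 42, "address": {"city": "Hyderabad"}}}
print(flatten(event))
```

The dotted keys map naturally onto column names, which is why this pattern shows up so often ahead of a tabular load.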
Posted 2 months ago
12.0 - 17.0 years
15 - 20 Lacs
Gurugram, Bengaluru
Work from Office
Join us as a Solution Architect This is an opportunity for an experienced Solution Architect to help us define the high-level technical architecture and design for your assigned scope, providing solutions that deliver great business outcomes and meet our longer-term strategy. You'll define and communicate a shared technical and architectural vision of end-to-end designs that may span multiple platforms and domains. Take on this exciting new challenge and hone your technical capabilities while advancing your career and building your network across the bank. We're offering this role at vice president level. What you'll do We'll look to you to influence and promote the collaboration across platform and domain teams on the solution delivery. Partnering with platform and domain teams, you'll elaborate the solution and its interfaces, validating technology assumptions, evaluating implementation alternatives, and creating the continuous delivery pipeline. You'll also provide analysis of options and deliver end-to-end solution designs using the relevant building blocks, as well as producing designs for features that allow frequent incremental delivery of customer value. On top of this, you'll be: Owning the technical design issues and driving resolution through the iteration of the technical solution design. Participating in activities to shape requirements, validating designs and prototypes to deliver change that aligns with the target architecture. Promoting adaptive design practices to drive collaboration of feature teams around a common technical vision using continuous feedback. Making recommendations of potential impacts to existing and prospective customers of the latest technology and customer trends. The skills you'll need As a Solution Architect, you'll bring expert knowledge of application architecture, and in business data or infrastructure architecture, with working knowledge of industry architecture frameworks such as TOGAF or ArchiMate.
You'll also need an understanding of Agile and contemporary methodologies, with at least 12 years' experience of working in Agile teams. On top of this, you'll bring: Experience of data engineering and designing solutions that involve complex data supply chains and platforms. A background in delivering solutions that securely span a complex infrastructure domain. Experience in AWS and diagramming using tools such as Draw.io and MS Visio. Knowledge and understanding of the key concepts in data management and data architecture. The ability to communicate complex technical concepts clearly to peers and leadership-level colleagues. Hours 45 Job Posting Closing Date: 01/07/2025
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Project description We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform. Responsibilities Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system. Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models. Ensure alignment of data models with Avaloq's object model and industry best practices. Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG). Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms). Provide expert input on data governance, metadata management, and model documentation. Contribute to change requests, upgrades, and data migration projects involving Avaloq. Collaborate with cross-functional teams to drive data consistency, reusability, and scalability. Review and validate existing data models, identify gaps or optimisation opportunities. Ensure data models meet performance, security, and privacy requirements. Skills Must have Proven experience (5+ years) in data modelling or data architecture, preferably within financial services. 3+ years of hands-on experience with Avaloq Core Banking Platform, especially its data structures and object model. Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar). Proficient in SQL and data manipulation in Avaloq environments. 
Knowledge of banking products, client lifecycle data, and regulatory data requirements. Familiarity with data governance, data quality, and master data management concepts. Experience working in Agile or hybrid project delivery environments. Nice to have Exposure to Avaloq Scripting or parameterisation is a strong plus. Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms. Understanding of data privacy regulations (GDPR, FINMA, etc.). Certification in Avaloq or relevant financial data management domains is advantageous. Other Languages English: C1 Advanced Location - Pune, Bangalore, Hyderabad, Chennai, Noida
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Should have 4+ yrs of MDM development & implementation experience: Reltio MDM development or any other MDM development (preferably Reltio), plus SQL. Must have core-skills experience with Informatica MDM. Hands-on experience with Informatica MDM hub configurations, data modelling, data mappings, and data validation. Well-versed with best-practice-driven design and development, including match rule tuning; strong ability to understand, document, and communicate technical architectures, standards, and toolsets. Knowledge of setting up security for applications. Providing data architecture solutions, interpreting business requirements, and converting them into technical requirements. Defining use cases and testing scenarios. Collaborating with source systems data stewards, system owners, and technical personnel for data governance. Our Commitment to Diversity & Inclusion: Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance.
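Match rule tuning in MDM tools like Reltio or Informatica comes down to scoring candidate record pairs against a threshold. A rough standalone illustration using Python's `difflib` (the fields, weights, and thresholds are hypothetical and not any vendor's actual API):

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Average string similarity across name and city; 1.0 means identical."""
    name = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    city = SequenceMatcher(None, a["city"].lower(), b["city"].lower()).ratio()
    return (name + city) / 2

rec1 = {"name": "Acme Corp", "city": "Pune"}
rec2 = {"name": "ACME Corp.", "city": "Pune"}
rec3 = {"name": "Globex", "city": "Delhi"}

# A tunable threshold then decides merge vs. manual review vs. no-match
print(match_score(rec1, rec2))  # high: likely the same party
print(match_score(rec1, rec3))  # low: different parties
```

Tuning in practice means adjusting field weights and thresholds against a labelled sample of true and false matches.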
Posted 2 months ago
5.0 - 10.0 years
2 - 6 Lacs
Pune
Work from Office
Job Title: Support Specialist - Eagle Platform (Portfolio Management) Location: Riyadh, Saudi Arabia Type: Full-time / Contract Industry: Banking / Investment Management / FinTech Experience Required: 5+ years We are seeking a highly skilled Support Specialist with hands-on experience working on BNY Mellon's Eagle Investment Systems, particularly the Eagle STAR, PACE, and ACCESS modules used for portfolio accounting, data management, and performance reporting. The ideal candidate will have supported the platform in banking or asset management environments, preferably with experience at Bank of America, BNY Mellon, or institutions using Eagle for middle- and back-office operations. Key Responsibilities: Provide day-to-day technical and functional support for the Eagle Platform, including the STAR, PACE, and Performance modules. Troubleshoot and resolve user issues related to portfolio accounting, performance calculation, and reporting. Act as a liaison between business users and technical teams for change requests, data corrections, and custom reports. Monitor batch jobs, data feeds (security, pricing, transaction data), and system interfaces. Work closely with front-office, middle-office, and operations teams to ensure accurate data processing and reporting. Manage SLA-driven incident resolution and maintain support documentation. Support data migrations, upgrades, and new release rollouts of Eagle components. Engage in root cause analysis and implement preventive measures. Required Skills and Experience: 5+ years of experience in financial systems support, with a strong focus on Eagle Investment Systems. Strong knowledge of portfolio management processes, NAV calculations, and financial instruments (equities, fixed income, derivatives). Prior work experience at Bank of America, BNY Mellon, or with asset managers using Eagle is highly preferred. Proficient in SQL, ETL tools, and understanding of data architecture in financial environments. Familiarity with
upstream/downstream systems such as Bloomberg, Aladdin, or CRD is a plus. Strong analytical skills and attention to detail. Excellent communication skills in English (Arabic is a plus). Preferred Qualifications: Bachelor's degree in Computer Science, Finance, or a related field. ITIL Foundation or similar certification in service management. Prior experience working in a banking or asset management firm in the GCC is a bonus.
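For context on the NAV knowledge this role asks for: at its core, NAV per share is just net assets over shares outstanding. A deliberately simplified sketch (real NAV processing in a system like Eagle also involves accruals, fees, and pricing feeds):

```python
def nav_per_share(assets, liabilities, shares_outstanding):
    """Net Asset Value per share = (assets - liabilities) / shares outstanding."""
    if shares_outstanding <= 0:
        raise ValueError("shares_outstanding must be positive")
    return (assets - liabilities) / shares_outstanding

# Toy fund: 10.5M in assets, 0.5M in liabilities, 1M shares
print(nav_per_share(10_500_000, 500_000, 1_000_000))  # -> 10.0
```

Support work on the platform often means reconciling a NAV like this against upstream pricing and transaction feeds when the numbers disagree.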
Posted 2 months ago
12.0 - 17.0 years
10 - 13 Lacs
Pune
Work from Office
You will provide effective leadership by influence in a manner that is consistent with the Roche values and leadership commitments. You will seek to inspire and influence teams to create transformative solutions, ensuring that Roche products are recognised as being the best in the industry and maintain our #1 ranking in the future. Reporting to the Technology Architecture Subchapter Lead in the Architecture, Technology and Standards Office, you will primarily partner with the rest of Engineering across Roche to deliver customer-centric solutions. You will be responsible for: Technically leading a large-scale product portfolio or a specific clinical domain, defining the technology strategy to sustainably deliver a product line. Acting as a subject matter expert for your domain/area, mentoring and guiding the team. Developing and delivering new designs, including identifying and assessing technology options (build, buy, partner). Owning the common architecture roadmap and onboarding, common assets oversight, and Toolkit and Integration APIs. Influencing and engaging internal customers across Roche and driving fast and consistent adoption of the reference architecture. Publicizing and developing a collaboration model for reference architecture adoption at all levels of the organization. Owning standardization, tools, and app development environments. Proactively taking on improvement initiatives and leading process improvements (Agile, QMS alignment). Scaling strategically to take on multiple projects, aligning across the Roche portfolio, building and driving a technology roadmap, and managing cross-functional, internal (Roche) and external stakeholders. Your profile BS degree or equivalent in a directly related discipline (CS, Eng, etc.)
12+ years of previous software development/architecture experience. Have successfully built, deployed, and supported an enterprise-scale (web) application in the cloud (in a leadership role). Deep healthcare experience in at least one healthcare domain, with breadth across several. Experience in the role of an architect, leading a technical team, designing large software systems, and operating at the engineering management team level. Hands-on software development experience. Quick learner with the ability to understand complex workflows and develop and validate innovative solutions to solve difficult problems. Good communicator, able to talk with stakeholders and customers to explain technology, with the proven ability to take insights from customers and translate them into technical deliverables. Proven ability to establish and articulate a vision, set goals, develop and execute strategies, and track and measure results. Proven experience leading software teams through collaborative technological innovation in an agile environment or continuous improvement efforts that have yielded tangible results and/or positive impact for patients or business stakeholders. Highly developed people-influencing skills; demonstrated success in establishing a high-performing environment, with an excellent reputation attracting the best talent and the commitment to developing and inspiring them. Proven ability to create and sustain strong collaborative relationships and networks with diverse stakeholders across a complex global organization. Familiarity with technological trends and their relevance to the healthcare industry. A passionate and decisive business leader, demonstrating courage, vision and drive to achieve results at the forefront of innovative technological changes. Locations You will be based in Pune, India. At the Company's discretion, an exception to the location requirement could be made under extraordinary circumstances.
As this position is a global role, international business travel will be required depending upon the business location of the successful candidate and ongoing business project activities. Roche is strongly committed to a diverse and inclusive workplace. We strive to build teams that represent a range of backgrounds, perspectives, and skills. Embracing diversity enables us to create a great place to work and to innovate for patients. Roche is an equal opportunity employer.
Posted 2 months ago
3.0 - 7.0 years
6 - 15 Lacs
Pune, Chennai
Hybrid
Company Description: Volante is on the leading edge of Financial Services technology; if you are interested in being on an innovative, fast-moving team that leverages the very best in cloud technology, our team may be right for you. By joining the product team at Volante, you will have an opportunity to shape the future of payments technology, with a focus on payment intelligence. We are a financial technology business that provides a market-leading, cloud-native Payments Processing Platform to banks and financial institutions globally. Education Criteria: • B.E, MSc, M.E/MS in Computer Science or a similar major. • Relevant certification courses from a reputed organization. • Experience of 3+ years as a Data Engineer. Responsibilities: • Design and development of scalable solutions and payment analytics, unlocking operational and business insight. • Own data modeling, building ETL pipelines and enabling data-driven metrics. • Build and optimize data models for our application needs. • Design & develop data pipelines and workflows that integrate data sources (structured, unstructured data) across the payment landscape. • Assess the customer's data infrastructure landscape (payment ancillary systems including Sanctions, Fraud, AML) across cloud environments like AWS and Azure as well as on-prem, for deployment design. • Lead the enterprise application data architecture design, framework & services, and identify and enable the services for the SaaS environment in Azure and AWS. • Implement customizations and data processing required to transform customer datasets for processing in our analytics framework/BI models. • Monitor data processing and machine learning workflows to ensure customer data is successfully processed by our BI models, debugging and resolving any issues faced along the way. • Optimize query, warehouse, and data lake costs. • Review and provide feedback on the Data Architecture Design Document/HLD for our SaaS application. • Cross-team collaboration to successfully
integrate all aspects of the Volante PaaS solution • Mentor to the development team Skills: • 3+ years of data engineering experience data collection, preprocessing, ETL processes and Analytics • Proficiency in data engineering Architecture, Metadata management, Analytics, reporting and database administration • Strong in SQL/NoSQL, Python, JSON and data warehousing/data lake , orchestration, analytical tools • ETL or pipeline design & implementation of large data • Experience with data technologies, frameworks like Databricks, Synapse, Kafka, Spark, Elasticsearch • Knowledge of SCD, CDC, core data warehousing to develop a cost-effective, secure data collection, storage, and distribution of data for SaaS application • Experience in application deployment in AWS or Azure w/container, Kubernetes • Strong problem-solving skills and passion for building data at scale Job Description Engineering Skills (Desirable): • Knowledge of data visualization tools like Tableau • ETL Orchestration tools like Airflow and visualization tools like Grafana • Prior experience in Banking or Payments domain Location: India (Pune or Chennai)
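The ETL responsibilities above (ingest JSON payment records, validate and transform them, load them into a queryable store) can be sketched in miniature. This is a minimal, hedged illustration only, not Volante's actual pipeline: the event fields, the `transform` helper, and the SQLite target are all hypothetical stand-ins for a production source and warehouse.

```python
import json
import sqlite3

# Hypothetical raw payment events, as they might arrive from an upstream system.
raw_events = [
    '{"txn_id": "T1", "amount": "120.50", "currency": "USD", "status": "SETTLED"}',
    '{"txn_id": "T2", "amount": "75.00",  "currency": "EUR", "status": "PENDING"}',
    '{"txn_id": "T3", "amount": "bad",    "currency": "USD", "status": "SETTLED"}',
]

def transform(line):
    """Parse one JSON event and coerce types; return None for unusable records."""
    rec = json.loads(line)
    try:
        rec["amount"] = float(rec["amount"])
    except ValueError:
        return None  # a real pipeline would route this to a dead-letter store
    return rec

# Load the clean records into an in-memory table standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE payments (txn_id TEXT, amount REAL, currency TEXT, status TEXT)"
)
clean = [r for r in map(transform, raw_events) if r is not None]
conn.executemany(
    "INSERT INTO payments VALUES (:txn_id, :amount, :currency, :status)", clean
)

# A downstream analytics query over the loaded data.
settled_total = conn.execute(
    "SELECT SUM(amount) FROM payments WHERE status = 'SETTLED'"
).fetchone()[0]
print(settled_total)  # 120.5 — only the valid settled record survives
```

The same extract/validate/load shape carries over to Spark or Databricks jobs; only the execution engine and sink change.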
Posted 2 months ago
9.0 - 13.0 years
32 - 40 Lacs
Ahmedabad
Remote
About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.

Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India

Key Responsibilities:
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with business objectives and the technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Required Skills:
Helping project teams with solution architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.
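The Performance Tuning responsibility above (tuning, indexing, query optimization) can be demonstrated concretely with `EXPLAIN QUERY PLAN`. This is a small sketch using SQLite so it is self-contained; the table name, column names, and index name are hypothetical, and the same before/after-index check applies in PostgreSQL or MySQL via their own `EXPLAIN` output.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
# Populate enough rows that the planner's choice matters.
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the query plan as a single string for inspection."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

before = plan(query)   # without an index: a full scan of the orders table
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # with the index: a search via idx_orders_customer

print(before)
print(after)
```

Reading the plan before and after adding an index is the basic loop of query optimization: confirm the scan, add the index, confirm the planner actually uses it.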
Posted 2 months ago
2.0 - 6.0 years
0 - 1 Lacs
Pune
Work from Office
As Lead Data Engineer, you'll design and manage scalable ETL pipelines and clean, structured data flows for real-time retail analytics. You'll work closely with ML engineers and business teams to deliver high-quality, ML-ready datasets.

Responsibilities:
Develop and optimize large-scale ETL pipelines
Design schema-aware data flows and dashboard-ready datasets
Manage data pipelines on AWS (S3, Glue, Redshift)
Work with transactional and retail data for real-time insights
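"Dashboard-ready datasets" as described above usually means rolling raw transactions up to one row per reporting key. A minimal sketch of that transform, assuming hypothetical store/day/amount fields (in practice this step would run in Glue or Spark over data in S3 rather than over an in-memory list):

```python
from collections import defaultdict
from datetime import date

# Hypothetical transactional retail records (store, sale date, amount).
sales = [
    {"store": "PUNE-01", "day": date(2024, 5, 1), "amount": 250.0},
    {"store": "PUNE-01", "day": date(2024, 5, 1), "amount": 100.0},
    {"store": "PUNE-02", "day": date(2024, 5, 1), "amount": 80.0},
]

def daily_store_metrics(rows):
    """Roll transactions up to one dashboard-ready row per (store, day)."""
    agg = defaultdict(lambda: {"revenue": 0.0, "orders": 0})
    for r in rows:
        key = (r["store"], r["day"].isoformat())
        agg[key]["revenue"] += r["amount"]
        agg[key]["orders"] += 1
    return dict(agg)

metrics = daily_store_metrics(sales)
print(metrics[("PUNE-01", "2024-05-01")])  # {'revenue': 350.0, 'orders': 2}
```

The equivalent in a warehouse like Redshift is a `GROUP BY store, day` materialization; keeping the aggregation schema-aware and pre-computed is what makes the dataset dashboard-ready.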
Posted 2 months ago