
2367 Data Architecture Jobs - Page 42

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Do you want to help solve the world's most pressing challenges? Feeding the world's growing population and slowing climate change are two of the world's greatest challenges, and AGCO is a part of the solution! Join us to make your contribution. AGCO is looking to hire candidates for the position of Senior Manager, AI & Data Systems Architecture.

We are seeking an experienced and innovative Senior Manager, AI & Data Systems Architecture to lead the design, creation, and evolution of system architectures for AI, analytics, and data systems within our organization. The ideal candidate will have extensive experience delivering scalable, high-performance data and AI architectures across cloud platforms such as AWS, Google Cloud Platform, and Databricks, with a proven ability to align technology solutions with business goals. This individual will collaborate with cross-functional teams, including data engineers, data scientists, and other IT professionals, to create architectures that support cutting-edge AI and data initiatives, driving efficiency, scalability, and innovation.

Your Impact

Architecture Leadership: Lead the end-to-end architecture for AI and data systems, ensuring cost-effective scalability, performance, and security across cloud and on-premises environments. The goal is to build and support a modern data stack.

AI & Data Systems: Design, implement, and manage data infrastructure and AI platforms, including but not limited to AWS, Azure, Google Cloud Platform, Databricks, and other key data tools. Lead the data model approach for all data products and solutions.

Cloud Expertise: Champion cloud adoption strategies, optimizing data pipelines, analytics workloads, AI/ML model deployment, endpoint creation, and app integration.

System Evolution: Drive the continuous improvement and evolution of data and AI architectures to meet emerging business needs, technological advancements, and industry trends.

Collaboration & Leadership: Work closely with delivery teams, data engineers, data scientists, software engineers, and IT operations to implement comprehensive data architectures that support AI and analytics initiatives focused on continuous improvement.

Strategic Vision: Partner with business and technology stakeholders to understand long-term goals, translating them into architectural frameworks and roadmaps that drive business value.

Governance & Best Practices: Ensure best practices in data governance, security, and compliance, overseeing the implementation of standards across AI and data systems.

Performance Optimization: Identify opportunities to optimize the performance, cost-efficiency, and operational effectiveness of AI and data systems, including ETL, ELT, and data pipeline creation and evolution, and optimization of AI resource models.

Functional Knowledge

Experience: 10+ years of experience in data architecture, AI systems, or cloud infrastructure, with at least 3-5 years in a leadership role. Proven experience driving solutions from ideation to delivery and support.

Cloud Expertise: Deep hands-on experience with cloud platforms like AWS, Google Cloud Platform (GCP), and Databricks. Familiarity with other data and AI platforms is a plus.

CRM Expertise: Hands-on experience with key CRM systems like Salesforce and the AI systems inside those solutions (e.g., Einstein).

AI & Analytics Systems: Proven experience designing architectures for AI, machine learning, analytics, and large-scale data processing systems.

Technical Knowledge: Expertise in data architecture, including data lakes, data warehouses, real-time data streaming, and batch processing frameworks.

Cross-Platform Knowledge: Solid understanding of containerization (Docker, Kubernetes), infrastructure as code (Terraform, CloudFormation), and big data ecosystems (Spark, Hadoop). Experience in applying Agile methodologies, including Scrum, Kanban, or SAFe. Experience in top reporting solutions, preferably Tableau, which is one of our cornerstone reporting solutions.

Leadership: Strong leadership and communication skills, with the ability to drive architecture initiatives in a collaborative and fast-paced environment. Excellent problem-solving skills and a proactive mindset.

Education: Bachelor's degree in Computer Science, Data Science, or a related field. A Master's degree or relevant certifications (e.g., AWS Certified Solutions Architect) is preferred.

Business Expertise

Experience in industries such as manufacturing, agriculture, or supply chain, particularly in AI and data use cases. Familiarity with regulatory requirements related to data governance and security. Experience with emerging technologies like edge computing, IoT, and AI/ML automation tools.

Your Experience And Qualifications

Excellent communication and interpersonal skills, capable of interacting with multiple levels of IT and business management/leadership. Hands-on experience with SAP HANA, SAP Data Services, or similar data storage, warehousing, and/or ETL solutions. 10+ years of progressive IT experience. Experience creating data models, querying data, and mapping business and technical processes. Successfully influences diverse groups and teams in a complex, ambiguous, and rapidly changing environment to deliver value-added solutions. Builds effective working relationships with the business to ensure business requirements are accurately captured, agreed, and accepted. Adaptable to new technologies/practices and acts as a change agent within teams.

Your Benefits

GLOBAL DIVERSITY: Diversity means many things to us: different brands, cultures, nationalities, genders, generations, even variety in our roles. You make us unique!

ENTERPRISING SPIRIT: Every role adds value. We're committed to helping you develop and grow to realize your potential.

POSITIVE IMPACT: Make it personal and help us feed the world.

INNOVATIVE TECHNOLOGIES: You can combine your love for technology with manufacturing excellence and work alongside teams of people worldwide who share your enthusiasm.

MAKE THE MOST OF YOU: Benefits include health care and wellness plans and flexible and virtual work options.

Your Workplace

AGCO is Great Place to Work Certified and has been recognized for delivering an exceptional employee experience and a positive workplace culture. We value inclusion and recognize the innovation a diverse workforce delivers to our farmers. Through our recruiting, we are committed to building a team that includes a variety of experiences, backgrounds, cultures, and perspectives. Join us as we bring agriculture into the future and apply now!

Please note that this job posting is not designed to cover or contain a comprehensive listing of all required activities, duties, responsibilities, or benefits and may change at any time with or without notice. AGCO is proud to be an Equal Opportunity Employer.
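The ETL/ELT pipeline work this posting describes typically follows an extract-transform-load pattern. As a rough, self-contained sketch of that batch pattern (the table, field names, and sample data below are invented for illustration, not taken from the posting):

```python
import sqlite3

# Toy batch ETL: extract raw records, transform (dedupe + normalise), and
# load them into a warehouse table. All names here are illustrative only.
RAW_ROWS = [
    {"machine_id": "M-1", "hours": "120", "region": " emea "},
    {"machine_id": "M-2", "hours": "95",  "region": "APAC"},
    {"machine_id": "M-2", "hours": "95",  "region": "APAC"},  # duplicate row
]

def transform(rows):
    """Deduplicate, coerce types, and normalise the region field."""
    seen, clean = set(), []
    for r in rows:
        key = (r["machine_id"], r["hours"])
        if key in seen or not r["machine_id"]:
            continue  # drop duplicates and incomplete records
        seen.add(key)
        clean.append((r["machine_id"], int(r["hours"]), r["region"].strip().upper()))
    return clean

def load(conn, rows):
    """Load the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS usage (machine_id TEXT, hours INT, region TEXT)")
    conn.executemany("INSERT INTO usage VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(conn, transform(RAW_ROWS))
    print(conn.execute("SELECT COUNT(*), SUM(hours) FROM usage").fetchone())  # (2, 215)
```

In a production stack the same shape would be implemented in Glue, Databricks, or similar, with the dedupe and normalisation rules driven by the data governance standards the role owns.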

Posted 2 months ago

Apply

1.0 - 3.0 years

16 - 19 Lacs

Bengaluru

Work from Office

About The Position

Chevron invites applications for the role of Cloud Engineer - Data Hosting within our team in India. This position supports Chevron's data hosting environment by delivering modern digital data hosting capabilities in a cost-competitive, reliable, and secure manner. This position will provide broad exposure to the application of technology to enable the business, with many opportunities for growth and professional development for the candidate.

Key Responsibilities

Design, implement, and manage scalable and secure data hosting solutions on Azure. Develop and maintain data architectures, including data models, data warehouses, and data lakes. Refine data storage and extraction procedures to enhance performance and cost-effectiveness. Uphold stringent data security measures and ensure adherence to relevant industry standards and regulatory requirements. Collaborate with data scientists, analysts, and other stakeholders to understand and address their data needs. Monitor and troubleshoot data hosting environments to ensure high availability and reliability. Streamline data workflows and operations through the automation capabilities of Azure Data Factory and comparable technologies. Design, develop, and deploy modular cloud-based systems. Develop and maintain cloud solutions in accordance with best practices.

Required Qualifications

Must have a bachelor's degree in computer science, engineering, or a related discipline. 0-5 years' experience. At least 2 years of experience in data hosting for both on-premises and Azure environments. Microsoft AZ-900 Certification. Proficient in utilizing Azure data services, including Azure SQL Database, Azure Data Lake Storage, and Azure Data Factory. In-depth understanding of cloud infrastructure, encompassing virtual networks, storage solutions, and compute resources within Azure. Extensive hands-on experience with Azure services such as Azure SQL Database, Azure Blob Storage, Azure Data Lake, and Azure Synapse Analytics. Well-versed in on-premises storage systems from vendors like NetApp, Dell, and others. Skilled proficiency in scripting and automation tools like Ansible, PowerShell, Python, and Azure CLI for automation and management tasks. Comprehensive knowledge of Azure security best practices, including identity and access management, encryption, and compliance standards.

Preferred Qualifications

Demonstrated proficiency in architecting, deploying, and managing secure and scalable data hosting solutions on the Azure platform. Extensive experience in developing and maintaining robust data architectures, including data models, data warehouses, and data lakes, utilizing Azure services. Expertise in optimizing data storage and retrieval processes for superior performance and cost efficiency within Azure environments. In-depth knowledge of data security protocols and compliance with industry standards and regulations, with a focus on Azure cloud compliance. Proven ability to collaborate effectively with data scientists, analysts, and other stakeholders to address their data needs using Azure's capabilities. Strong track record of monitoring and troubleshooting Azure data hosting environments to ensure high availability and system reliability. Skilled in automating data workflows and processes using Azure Data Factory and other Azure-based automation tools. Experience in designing, developing, and deploying modular, cloud-based systems, with a particular emphasis on Azure solutions. Commitment to maintaining cloud solutions in alignment with Azure best practices and continuous integration of Azure's latest updates and features. Possession of Azure certifications, such as the Azure Data Engineer Associate or Azure Database Administrator Associate, with a preference for candidates holding the Azure Solutions Architect Expert certification or equivalent advanced credentials.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, the work hours for employees will be aligned to support business requirements. The standard work week will be Monday to Friday. Working hours are 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
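The "cost-effectiveness" responsibility above usually comes down to tiering rules in the spirit of Azure Blob Storage's hot/cool/archive access tiers. A minimal sketch of such a rule, assuming invented 30/180-day thresholds (these are illustrative policy choices, not Azure's billing rules):

```python
from datetime import date, timedelta

def pick_tier(last_accessed: date, today: date) -> str:
    """Choose a storage tier from the age of the last access."""
    age = (today - last_accessed).days
    if age <= 30:
        return "hot"       # frequently read: optimise for access cost
    if age <= 180:
        return "cool"      # infrequent reads: cheaper storage, dearer access
    return "archive"       # rarely read: cheapest storage, slow rehydration

if __name__ == "__main__":
    today = date(2024, 6, 1)
    for days in (7, 90, 400):
        print(days, pick_tier(today - timedelta(days=days), today))
```

In practice Azure can apply such rules itself via blob lifecycle management policies; a sketch like this is how an engineer might model the cost trade-off before writing the policy.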

Posted 2 months ago

Apply

2.0 - 6.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Title: Data Governance & Management Associate
Location: Bangalore, India

Role Description

The Compliance and Anti-Financial Crime (CAFC) Data Office is responsible for Data Governance and Management across key functions including AFC, Compliance, and Legal. The team supports these functions in establishing and improving data governance to achieve critical business outcomes such as effective control operation, regulatory compliance, and operational efficiency. The CAFC Data Governance and Management team implements Deutsche Bank's Enterprise Data Management Framework, focusing on controls, culture, and capabilities, to drive improved data quality, reduce audit and regulatory findings, and strengthen controls. As a member of the Divisional Data Office, the role holder will support both Run-the-Bank and Change-the-Bank initiatives, with a particular focus on Financial Crime Risk Assessment (FCRA) data collation, processing, testing, and automation.

Your key responsibilities

Document and maintain existing and new processes; respond to internal and external audit queries and communicate updates clearly to both technical and non-technical audiences. Independently manage the FCRA data collection process, including data collection template generation, quality checks, and stakeholder escalation. Execute data cleansing and transformation tasks to prepare data for analysis. Perform variance analysis and develop a deep understanding of the underlying data sources used in Financial Crime Risk Assessment. Document data quality findings and recommendations for improvement, feeding into the technology requirements. Work with data architects and developers to design and build FCRA risk data metrics. Investigate and analyse data issues related to quality, lineage, controls, and authoritative source identification. To ensure new data sources align with Deutsche Bank's Data Governance standards, maintain metadata in Collibra, visualize data lineage in Solidatus, and ensure certification and control coverage. Automate manual data processes using tools such as Python, SQL, Power Query, and MS Excel to improve efficiency and reduce operational risk. Translate complex technical issues into simple, actionable insights for business stakeholders, demonstrating strong communication and stakeholder management skills.

Your skills and experience

6+ years of experience in data management within financial services, with a strong understanding of data risks and controls. Familiarity with industry-standard frameworks such as DCAM or DAMA (certification preferred). Hands-on experience with data cataloguing using Collibra, data lineage documentation using Solidatus, and data control assessment and monitoring. Proficiency in Python, SQL, and Power Query/Excel for data analysis and automation. Strong communication skills with the ability to explain technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively across global teams.
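The quality checks and variance analysis this role performs on FCRA data can be sketched in a few lines of Python. The field names, sample rows, and thresholds below are invented for the sketch; a real check would run against the governed data sources named above:

```python
def completeness(rows, required):
    """Fraction of rows in which every required field is present and non-empty."""
    ok = sum(1 for r in rows if all(r.get(f) not in (None, "") for f in required))
    return ok / len(rows) if rows else 1.0

def variance_pct(current: float, prior: float) -> float:
    """Period-over-period change, as a percentage of the prior value."""
    if prior == 0:
        return float("inf") if current else 0.0
    return 100.0 * (current - prior) / prior

# Illustrative records: the second fails the completeness check.
rows = [
    {"client_id": "C1", "risk_score": 0.4},
    {"client_id": "C2", "risk_score": None},
]
print(completeness(rows, ["client_id", "risk_score"]))  # 0.5
print(variance_pct(current=120.0, prior=100.0))         # 20.0
```

Findings like a completeness score below a tolerance, or an unexplained period-over-period variance, are exactly what would be documented and escalated per the responsibilities above.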

Posted 2 months ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Step into the role of a Senior Data Engineer. At Barclays, innovation isn't just encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with: Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success. Strong understanding of AWS and distributed computing paradigms; ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks. Ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies. Hands-on programming experience in Python and PySpark. Understanding of DevOps pipelines using Jenkins and GitLab; strength in data modelling and data architecture concepts; well versed in project management tools and Agile methodology. Sound knowledge of data governance principles and tools (Alation/Glue Data Quality, mesh); capable of suggesting solution architecture for diverse technology applications.

Additional relevant skills given below are highly valued: Experience working in the financial services industry and in various settlements and sub-ledger functions like PNS, stock record and settlements, and PNL. Knowledge of BPS, IMPACT, and Gloss products from Broadridge, and of creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.

Vice President Expectations: To contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. An individual contributor will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
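The real-time ingestion skills named in postings like this (Kafka plus Spark Streaming) rest on the micro-batch idea: events are bucketed into fixed time windows and aggregated per key. A dependency-free imitation of that windowing, with an invented event shape and a 60-second window chosen for the sketch:

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # illustrative window size, not a Spark default

def window_counts(events):
    """events: iterable of (epoch_seconds, key) pairs.
    Returns {(window_start, key): count}, mimicking a keyed windowed count."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "trade"), (30, "trade"), (61, "trade"), (61, "settle")]
print(window_counts(events))
# {(0, 'trade'): 2, (60, 'trade'): 1, (60, 'settle'): 1}
```

Spark Streaming applies the same grouping continuously over an unbounded stream and handles late data and state; the toy version only shows the core bucketing arithmetic.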

Posted 2 months ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Job Summary

We are seeking a skilled Escalation Engineer with expertise in NetApp ONTAP, data center operations, and storage concepts. The ideal candidate will possess a robust technical background in data storage, coupled with extensive experience in providing technical support and leading teams in resolving complex issues. This role requires a deep understanding of product sustainability and engineering cycles, and a commitment to delivering exceptional customer service.

Job Requirements

Serve as a subject matter expert in NetApp ONTAP and related storage technologies. Lead and coordinate resolution efforts for escalated technical issues, collaborating closely with cross-functional teams. Provide advanced troubleshooting and problem-solving expertise to address complex customer issues. Conduct in-depth analysis of customer environments to identify root causes and develop effective solutions. Actively participate in product sustainability initiatives, including product lifecycle management and engineering cycles. Mentor and guide junior team members, fostering a culture of continuous learning and development. Communicate effectively with customers, internal stakeholders, and management, both verbally and in writing. Document technical solutions, best practices, and knowledge base articles to enhance team efficiency and customer satisfaction.

Education & Requirements

Bachelor's degree in Computer Science, Information Technology, or a related field. Extensive experience of 10+ years in technical support as a Senior Engineer/Principal Engineer handling escalations, preferably in a storage or data center environment. In-depth knowledge of NetApp ONTAP and storage concepts such as SAN, NAS, RAID, and replication. Strong understanding of data center architectures, virtualization technologies, and cloud platforms. Proven track record of leading teams in resolving technical escalations and driving issue resolution. Excellent collaboration skills with the ability to work effectively in a cross-functional team environment. Exceptional verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Demonstrated ability to prioritize and manage multiple tasks in a fast-paced environment. Relevant certifications such as NetApp Certified Implementation Engineer (NCIE) or equivalent are a plus.

At NetApp, we embrace a hybrid working environment designed to strengthen connection, collaboration, and culture for all employees. This means that most roles will have some level of in-office and/or in-person expectations, which will be shared during the recruitment process.

Equal Opportunity Employer: NetApp is firmly committed to Equal Employment Opportunity (EEO) and to compliance with all laws that prohibit employment discrimination based on age, race, color, gender, sexual orientation, gender identity, national origin, religion, disability or genetic information, pregnancy, and any protected classification.

Why NetApp? We are all about helping customers turn challenges into business opportunity. It starts with bringing new thinking to age-old problems, like how to use data most effectively to run better, but also to innovate. We tailor our approach to the customer's unique needs with a combination of fresh thinking and proven approaches. We enable a healthy work-life balance. Our volunteer time off program is best in class, offering employees 40 hours of paid time off each year to volunteer with their favourite organizations. We provide comprehensive benefits, including health care, life and accident plans, emotional support resources for you and your family, legal services, and financial savings programs to help you plan for your future. We support professional and personal growth through educational assistance and provide access to various discounts and perks to enhance your overall quality of life. If you want to help us build knowledge and solve big problems, let's talk.
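The RAID concepts this posting expects reduce to simple capacity arithmetic: mirroring halves raw capacity, single parity costs one disk's worth, double parity costs two. A sketch of that standard textbook math, with example disk counts chosen for illustration:

```python
def usable_tb(disks: int, disk_tb: float, raid: str) -> float:
    """Usable capacity for a few common RAID levels (textbook formulas)."""
    if raid == "raid10":          # mirrored pairs: half the raw capacity
        return disks * disk_tb / 2
    if raid == "raid5":           # single parity: lose one disk's worth
        return (disks - 1) * disk_tb
    if raid == "raid6":           # double parity: lose two disks' worth
        return (disks - 2) * disk_tb
    raise ValueError(f"unknown RAID level: {raid}")

for level in ("raid10", "raid5", "raid6"):
    print(level, usable_tb(disks=12, disk_tb=4.0, raid=level))
```

Real ONTAP aggregates add spares, WAFL reserve, and right-sizing on top of this, so actual usable space is lower than the raw formula suggests.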

Posted 2 months ago

Apply

3.0 - 8.0 years

11 - 30 Lacs

Gurugram

Work from Office

Mulesoft Architect at N Consulting Ltd | Jobs at N Consulting Ltd | 11 LPA to 30 LPA

Hi Jobseeker, we are hiring a MuleSoft Architect for our MNC client.
Location: Gurgaon
Interview Mode: Virtual
Experience: 7 to 15 years
Notice Period: only immediate to 30 days

The JD is below. Key responsibilities of this role include: Support the design and evolution of scalable and reusable systems architecture across products and services. Collaborate with engineering, DevOps, product, compliance, and security teams to ensure solutions are robust and compliant. Translate business requirements into architectural designs that scale and adapt to multi-cloud environments and complex data landscapes. Act as a technical authority and guide for engineering teams, enforcing best practices in integration, data architecture, and performance. Provide technical direction on MuleSoft integrations and ensure a robust API strategy and lifecycle management. Ensure architectural alignment with regulatory requirements. Document architectural standards, patterns, and decision-making rationales. Evaluate emerging technologies and their fit with the business and technical ecosystem.

The Requirements

Critical: 3+ years of experience as a Solutions Architect in a startup or scale-up environment. Proven experience in designing systems to support massive scale, high availability, and modularity. Strong backend engineering foundation with fluency in distributed systems, APIs, and service-oriented architectures, using asynchronous patterns. Hands-on experience with MuleSoft, including API-led connectivity, integration patterns, and the Anypoint Platform. Deep familiarity with multi-cloud infrastructures (min. AWS & Azure) and cloud-native architectural principles. Deep familiarity with designing complex database integrations (specifically MongoDB). Demonstrated success in environments requiring strict regulatory compliance. Ability to manage the architectural landscape solo, making informed decisions and justifying trade-offs effectively. Experience with complex database integrations, covering both relational and non-relational databases at large scale.

Preferred: Practical experience with DevSecOps principles and CI/CD pipelines. Familiarity with containerisation (Kubernetes) and microservices patterns. Strong stakeholder communication and documentation skills. Experience mentoring or guiding development teams on architecture patterns and decisions. Comfort working in agile, cross-functional teams. Strong problem-solving skills with a pragmatic approach to design and implementation.

Role-Specific Tools & Technologies
Core Tools: MuleSoft, REST APIs, MongoDB.
Cloud & Infrastructure: AWS, Azure, Terraform, Kubernetes.

Success In This Role Looks Like: Delivery of robust, scalable architecture that supports rapid product ideation and delivery, without compromising performance or compliance. High developer velocity due to reusable and well-documented architectural patterns. Seamless integration between systems and data sources across a multi-cloud environment. Recognition as an architectural authority within the organisation, driving strategic technical decisions confidently.

Interested candidates, please share your resume to
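The API-led connectivity pattern this role centres on splits integrations into three layers: system APIs unlock backend records, process APIs compose and normalise them, and experience APIs reshape the result per channel. A pure-Python imitation of that layering (the record shapes and names are invented for the sketch; in MuleSoft these would be separate APIs on the Anypoint Platform):

```python
def system_api_fetch_customer(customer_id: str) -> dict:
    """System layer: expose a backend record as-is (stubbed here)."""
    return {"id": customer_id, "first": "Ada", "last": "Lovelace", "segment": "B2B"}

def process_api_enrich(record: dict) -> dict:
    """Process layer: compose and normalise data independently of any channel."""
    return {**record, "full_name": f"{record['first']} {record['last']}"}

def experience_api_mobile_view(record: dict) -> dict:
    """Experience layer: reshape the same data for one consuming channel."""
    return {"id": record["id"], "name": record["full_name"]}

print(experience_api_mobile_view(process_api_enrich(system_api_fetch_customer("C-42"))))
# {'id': 'C-42', 'name': 'Ada Lovelace'}
```

The payoff of the layering is reuse: a new web or partner channel adds only another experience-layer function, while the system and process layers stay untouched.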

Posted 2 months ago

Apply

10.0 - 20.0 years

40 - 50 Lacs

Bengaluru

Work from Office

10+ years of database experience: MS SQL, HBase, Cassandra, MongoDB, etc. Knowledge of Ceph is mandatory. Experience as a software developer or data architect, or in a data management role. Experience working with, and expert-level knowledge of, at least two database technologies like MySQL, Postgres, MS SQL Server, Vertica, Snowflake, DynamoDB, MongoDB, DocumentDB, MapR, Cassandra, etc. Understanding of various relational and non-relational database technologies along with their benefits, downsides, and best use cases. Willingness and commitment to learn other database, automation, and cloud technologies. Proficiency in automation. Experience working with databases in public clouds, preferably AWS. Strong analytical skills. Ability to perform system monitoring and address various issues in the system. Ability to do performance tuning and database development.

Creating and maintaining high-performance, stable data architectures for the Onto Innovation product suite. Research new database methods and technologies to fully utilize platform features. Work with customers, development squads, and product management to identify and document use case scenarios. Lead on all data solution aspects, including setting data standards and providing your deep technical expertise to development teams for best practices, systems, and architectures. Design data architectures and solutions that are highly available and meet disaster recovery requirements. Design effective data store solutions that account for effective capacity planning, data tiering, and data life-cycle management. Design automated deployment procedures for on-premises and cloud. Work with QA to create detailed test plans and the needed tooling to validate targeted performance at production data volumes for new and existing database systems. Participate in code reviews and create engineering and cross-functional practices. Work with cross-functional teams to document technical and business viability and ensure end-to-end delivery. Work as a member of the cross-platform Database Services team. Lead data tier architecture and design processes; participate in system and application architecture. Help with performance tuning and with database development of performance-critical code. Architect and implement HA/DR/backup/maintenance strategies. Collaborate with development and business teams. Be responsible for meeting various SLAs for the multiple database platforms used. Troubleshoot and address various issues in the systems. Learn other database platforms and technologies.
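The capacity-planning duty above has simple arithmetic at its core for a replicated cluster like Ceph: usable logical space is raw space divided by the replication factor, minus headroom kept below the near-full threshold. The replication factor of 3 and the 0.85 fill ratio below are common defaults assumed for illustration; a given cluster's settings may differ:

```python
def usable_capacity_tb(raw_tb: float, replicas: int = 3, fill_ratio: float = 0.85) -> float:
    """Usable logical capacity once replication and near-full headroom are paid for."""
    return raw_tb * fill_ratio / replicas

def raw_needed_tb(usable_tb: float, replicas: int = 3, fill_ratio: float = 0.85) -> float:
    """Inverse: raw capacity to provision for a target usable capacity."""
    return usable_tb * replicas / fill_ratio

print(round(usable_capacity_tb(raw_tb=300.0), 2))  # 85.0
print(round(raw_needed_tb(usable_tb=100.0), 2))
```

Erasure-coded pools change the overhead factor (k/(k+m) instead of 1/replicas), but the planning shape is the same.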

Posted 2 months ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role: Security Delivery Practitioner
Project Role Description: Assist in defining requirements, designing and building security components, and testing efforts.
Must-have skills: Informatica PowerCenter
Good-to-have skills: Python (Programming Language)
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Security Delivery Practitioner, you will assist in defining requirements, designing and building security components, and testing efforts. Your typical day will involve collaborating with various teams to ensure that security measures are effectively integrated into the project lifecycle. You will engage in discussions to understand security needs, contribute to the design of security frameworks, and participate in testing to validate the effectiveness of security solutions. Your role will be pivotal in ensuring that security considerations are embedded in all aspects of project delivery, fostering a culture of security awareness and compliance within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training sessions to enhance team knowledge of security practices.
- Monitor and evaluate the effectiveness of security measures implemented across projects.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica PowerCenter.
- Good-To-Have Skills: Experience with Python (Programming Language).
- Strong understanding of data integration and ETL processes.
- Experience with data quality and governance frameworks.
- Familiarity with security compliance standards and best practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based in Mumbai.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For TM 2023 by Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.
Description: We are seeking a highly skilled and motivated Data Cloud Architect to join our Product and Technology team. As a Data Cloud Architect, you will play a key role in designing and implementing our cloud-based data architecture, ensuring scalability, reliability, and optimal performance for our data-intensive applications. Your expertise in cloud technologies, data architecture, and data engineering will drive the success of our data initiatives.
Responsibilities:
- Collaborate with cross-functional teams, including data engineers, data leads, product owners, and stakeholders, to understand business requirements and data needs.
- Design and implement end-to-end data solutions on cloud platforms, ensuring high availability, scalability, and security.
- Architect delta lakes, data lakes, data warehouses, and streaming data solutions in the cloud.
- Evaluate and select appropriate cloud services and technologies to support data storage, processing, and analytics.
- Develop and maintain cloud-based data architecture patterns and best practices.
- Design and optimize data pipelines, ETL processes, and data integration workflows.
- Implement data security and privacy measures in compliance with industry standards.
- Collaborate with DevOps teams to deploy and manage data-related infrastructure on the cloud.
- Stay up-to-date with emerging cloud technologies and trends to ensure the organization remains at the forefront of data capabilities.
- Provide technical leadership and mentorship to data engineering teams.
Qualifications:
- Bachelor's degree in computer science, engineering, or a related field (or equivalent experience).
- 10 years of experience as a Data Architect, Cloud Architect, or in a similar role.
- Expertise in cloud platforms such as Azure.
- Strong understanding of data architecture concepts and best practices.
- Proficiency in data modeling, ETL processes, and data integration techniques.
- Experience with big data technologies and frameworks (e.g., Hadoop, Spark).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with data warehousing solutions (e.g., Redshift, Snowflake).
- Strong knowledge of security practices for data in the cloud.
- Excellent problem-solving and troubleshooting skills.
- Effective communication and collaboration skills.
- Ability to lead and mentor technical teams.
Additional Preferred Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, or a related field.
- Relevant cloud certifications (e.g., Azure Solutions Architect) and data-related certifications.
- Experience with real-time data streaming technologies (e.g., Apache Kafka).
- Knowledge of machine learning and AI concepts in relation to cloud-based data solutions.
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care.
We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.

Posted 2 months ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Architect
Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure that there is cohesive integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to gather requirements and provide insights that will shape the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
- Must have skills: AWS and Python.
Qualification: 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Mumbai

Work from Office

Project Role : Data Platform Architect
Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure that there is cohesive integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to gather requirements and provide insights that will shape the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
- Must have skills: AWS and Python.
Qualification: 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office

Project Role : Data Platform Architect
Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure that there is cohesive integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to gather requirements and provide insights that will shape the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
- Must have skills: AWS and Python.
Qualification: 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Navi Mumbai

Work from Office

Project Role : Data Platform Architect
Project Role Description : Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Microsoft Azure Data Services
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during the implementation process. You will engage in discussions with stakeholders to align the data architecture with business objectives, ensuring that the data platform meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Data Services.
- Good To Have Skills: Experience with data governance frameworks.
- Strong understanding of data modeling techniques.
- Familiarity with cloud-based data storage solutions.
- Experience in implementing data integration strategies.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Mumbai office.
- A 15 years full time education is required.
Qualification: 15 years full time education

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse, Functional Testing
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and dbt job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
d. Structured communication: written, verbal, and presentational.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM.
Qualification: 15 years full time education
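For readers unfamiliar with the dbt-style modeling this posting asks for: dbt models are layered SELECT statements, typically staging views built from raw sources and mart views built from staging. As a tool-agnostic illustration only (this is not dbt itself; the table names and data are invented), the same staging-to-mart pattern can be sketched with Python's stdlib sqlite3:

```python
import sqlite3

# Hypothetical sketch of the staging -> mart layering dbt formalizes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Source" table, the kind dbt would reference with {{ source(...) }}.
cur.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "complete"), (2, 75.5, "complete"), (3, 30.0, "cancelled")],
)

# Staging model: filter and select, like a stg_orders.sql dbt model.
cur.execute("""
    CREATE VIEW stg_orders AS
    SELECT order_id, amount FROM raw_orders WHERE status = 'complete'
""")

# Mart model: aggregate over staging, like a fct_revenue.sql dbt model.
cur.execute("""
    CREATE VIEW fct_revenue AS
    SELECT COUNT(*) AS order_count, SUM(amount) AS total_revenue FROM stg_orders
""")

order_count, total_revenue = cur.execute(
    "SELECT order_count, total_revenue FROM fct_revenue"
).fetchone()
print(order_count, total_revenue)  # 2 195.5
```

In dbt proper, each view here would live in its own .sql model file, with `{{ ref('stg_orders') }}` declaring the dependency that dbt uses to build and schedule the model DAG.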

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse, Manual Testing
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and dbt job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
d. Structured communication: written, verbal, and presentational.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM.
Qualification: 15 years full time education

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Snowflake Data Warehouse
Good to have skills : Data Engineering
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities.
Technical Experience:
a. Strong experience working as a Snowflake on Cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience in working on client proposals (RFPs), estimation, POCs, and POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and dbt job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.
Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM.
Qualification: 15 years full time education

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Work from Office

Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. You will collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data platform blueprint and design.
- Experience with data integration and data modeling.
- Hands-on experience with data platform components.
- Knowledge of data platform security and governance.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Qualification: 15 years full time education

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Project Role : Data Platform Engineer
Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be responsible for analyzing requirements and translating them into effective data solutions, ensuring that the data platform meets the needs of various stakeholders. Additionally, you will participate in team meetings to share insights and contribute to the overall strategy of the data platform.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices.
- Experience with data modeling and database design.
- Familiarity with cloud-based data solutions and architectures.
- Knowledge of data governance and data quality principles.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.
Qualification: 15 years full time education

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer
We're looking for a candidate who has a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks
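In practice, "DevOps pipelines within a data engineering context" often means automated data-quality gates that run in CI before a batch is loaded. A minimal, hypothetical sketch of such a gate (the function name and validation rules are invented for illustration):

```python
def validate_batch(rows, required_keys=("id", "amount")):
    """Return a list of human-readable problems found in a batch of records.

    In a CI pipeline this would run as a test step: a non-empty result
    fails the build before the bad batch reaches the warehouse.
    """
    problems = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: every required field must be present and non-null.
        for key in required_keys:
            if row.get(key) is None:
                problems.append(f"row {i}: missing {key}")
        # Rule 2: primary keys must be unique within the batch.
        if row.get("id") in seen_ids:
            problems.append(f"row {i}: duplicate id {row['id']}")
        seen_ids.add(row.get("id"))
    return problems

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},   # duplicate id
    {"id": 2, "amount": None},  # missing amount
]
issues = validate_batch(batch)
print(issues)  # ['row 1: duplicate id 1', 'row 2: missing amount']
```

The same checks could equally be expressed as dbt tests or Great Expectations suites; the point is that quality rules are versioned and executed automatically on every pipeline run.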

Posted 2 months ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune

Work from Office

Snowflake Data Engineer
For this role, we're looking for a candidate who has a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks

Posted 2 months ago

Apply

8.0 - 10.0 years

20 - 35 Lacs

Greater Noida

Work from Office

We are seeking a seasoned Informatica CDGC expert to work with the Informatica team and lead the implementation and optimization of Informatica Cloud Data Governance and Catalog solutions. The ideal candidate will establish best practices, drive data governance initiatives, and mentor a team of data professionals to ensure a scalable and efficient governance framework aligned with business objectives.
Roles and Responsibilities:
- Lead the end-to-end implementation of Informatica Cloud Data Governance and Catalog (CDGC) solutions, ensuring timely and high-quality delivery.
- Design, configure, and deploy data governance frameworks using Informatica CDGC, aligned with organizational standards and compliance requirements.
- Develop and implement best practices for metadata management, data lineage, data quality, and stewardship within the Informatica CDGC environment.
- Collaborate with cross-functional teams, including data architects, engineers, analysts, and business stakeholders, to drive data governance adoption.
- Provide expert guidance on data governance policies, workflows, and tool utilization to maximize the value of Informatica CDGC.
- Mentor and coach team members on the technical and governance aspects of Informatica CDGC, fostering skill development and knowledge sharing.
- Troubleshoot and resolve complex technical issues related to Informatica CDGC deployment and integrations.
- Stay current with Informatica CDGC product updates, industry trends, and data governance best practices to continuously enhance governance capabilities.
- Create and maintain documentation, including architecture diagrams, configuration guides, and training materials.
- Support audit and compliance activities related to data governance and metadata management.
- Proven experience working with Informatica Data Governance and Catalog tools, preferably Cloud Data Governance and Catalog (CDGC).
- Strong understanding of data governance concepts, metadata management, data lineage, and data quality principles.
- Hands-on experience implementing and configuring Informatica CDGC solutions in enterprise environments.
- Proficiency with ETL/ELT processes, metadata integration, and data cataloging.
- Solid knowledge of data management frameworks and regulatory compliance (e.g., GDPR, CCPA).
- Excellent problem-solving and analytical skills, with the ability to mentor and lead a team.
- Strong communication skills, with experience working across technical and business stakeholders.
- Ability to create and deliver training sessions, workshops, and detailed technical documentation.

Posted 2 months ago

Apply

5.0 - 10.0 years

18 - 22 Lacs

Pune

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make an outstanding addition to our vibrant team.

Siemens Mobility is an independently managed company of Siemens AG. Its core business includes rail vehicles, rail automation and electrification solutions, turnkey systems, intelligent road traffic technology and related services. The Information Technology (IT) department has global responsibility for the internal IT of Siemens Mobility. Its goal is to provide a robust and efficient IT landscape derived from business and market demands.

Your personality and individuality make the difference. In our team, we increase business performance and point the way into the digital age. Is that exactly your thing? Then live your passion in a cross-location team in which you can actively craft the future of our company. You open up new possibilities for our customers with your competence. Connected with this is an exciting career path that leads you to ever new projects and solutions in the field of IT for Siemens Mobility. We are looking for an Enterprise Architect.

You'll make a difference by
- As a member of our global Enterprise Architecture team at SMO IT, you play an important role in the digital transition, establishing the enterprise architecture function as an enabling partner that helps the business achieve its goals and positions IT as an enabler.
- As part of a global team, you will be responsible for enterprise architecture management (including business-IT alignment and analysis of the application portfolio) and derive IT strategies from business requirements.
- Lead IT transformation programs, with a passion for optimizing the application portfolio to support business objectives and improve operational efficiency.
- Ensure interoperability of different applications within the enterprise according to modern integration patterns and domain-driven design.
- Provide architectural guidance and governance for new solutions and services, and ensure their compliance with the existing architectural landscape.
- Define intentional architecture as a purposeful set of statements, models, and decisions that represent some future architectural state, and derive the roadmap to achieve this state.
- Shape the target operating model on cloud infrastructures together with the cloud competence center.
- Apply and contribute to governance frameworks that ensure compliance, security, and alignment with enterprise architecture principles.
- Engage with key collaborators to establish IT as an enabler and partner for value creation, and ensure that enterprise architecture initiatives meet business needs.
- Shift from experience-based to fact-based decision-making by means of the enterprise repository.

You'll win us over by
- Bachelor's or master's degree in computer science, equivalent experience, or comparable education with corresponding additional skills.
- Proven experience in enterprise architecture, with a focus on IT transformation programs in an enterprise context.
- Proven skills in the field of Enterprise Architecture Management; relevant certifications (e.g., TOGAF, PMP) are a plus.
- Experience with LeanIX for EAM is a plus.
- Sound knowledge of one or more aspects of Business Architecture, Application Architecture, Data Architecture, or Technology Architecture.
- Experience in architecture design and modeling.
- Excellent communication and interpersonal skills, with the ability to actively engage and influence senior leadership and key collaborators.
- Experience in leading IT projects, including budget management and resource allocation, with the ability to align IT initiatives with business objectives.
- Ability to manage multiple priorities in a fast-paced, dynamic environment and adapt to changing business needs.
- Understanding of business and technical architecture concepts for sophisticated IT architectures.
- Knowledge of IT governance, risk management, and emerging technologies; familiarity with application portfolio management tools and techniques is a plus.
- Experience in one of our technology domains (IoT, AI, cloud native), ideally in combination with domain expertise in logistics, manufacturing, or other subject areas in the railway industry.

Join us and be yourself! We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and build a better tomorrow with us. Make your mark in our exciting world at Siemens.

This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting the shape of things to come. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and inspiration and help us shape tomorrow. Find out more about Siemens careers at & more about mobility at https://new.siemens.com/global/en/products/mobility.html

Posted 2 months ago

Apply

10.0 - 15.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Do you want to help create the future of healthcare? Our name, Siemens Healthineers, was selected to honor our people who dedicate their energy and passion to this cause. It reflects their pioneering spirit combined with our long history of engineering in the ever-evolving healthcare industry. We offer you a flexible and dynamic environment with opportunities to go beyond your comfort zone in order to grow personally and professionally. Sound interesting? Then come and join our global team as an Enterprise Architect (f/m/d) in IT to design the enterprise architecture for a large business unit or the entire company, and to be responsible for the application landscape as well as for the technologies and development tools used.

Your tasks and responsibilities:
- You will be responsible for enterprise architecture management (including business-IT alignment and analysis of the application portfolio) of a large business unit or process domain and derive IT strategies from business requirements, ensuring alignment with the overall enterprise architecture.
- You will drive the architecture roadmap and the application and data architecture for the business unit, with a focus on security, scalability, and reliability of the IT landscape.
- You will prepare decisions on the use of new technologies and platforms.
- You will model IT architecture and processes and promote consistent design, planning and implementation of IT solutions.
- You will be responsible for coordinating communication with all important decision makers and relevant stakeholders and advise them on the development of the IT landscape.
- You will drive the composition of the IT landscape and balance organizational needs with enterprise architecture decisions and objectives.
- You will identify digitalization opportunities and synergies within the system landscape and represent system interrelationships holistically.

To find out more about the specific business, have a look at https://www.siemens-healthineers.com/products-services

Your qualifications and experience:
- You have a degree in computer science, industrial engineering or a comparable qualification.
- You have 10+ years of experience in global IT organizations, ideally in a mix of operational and architecture roles.
- You have 5+ years of experience as a solution, application or enterprise architect.
- Based on your very good understanding of complex IT processes and your openness to new technologies, you have acquired in-depth knowledge of software development, application management, enterprise architecture, enterprise architecture methodologies, governance structures and frameworks (e.g. TOGAF).
- You also have deep technological expertise and several years of experience in complex technology landscapes.
- You have functional or IT implementation experience across all key IT functions, with a focus on PLM, SCM, Order-to-Cash and Accounting.
- You have in-depth knowledge in at least one business domain such as CRM/Sales, R&D/PLM or SCM.
- You have experience in business process analysis and modelling.
- You bring several years of proven experience working with different business process management models in Enterprise Architecture tools such as LeanIX or BizzDesign.
- Further, you have a very good understanding of the interrelationships between functional business and technical IT structures.

Your attributes and skills:
- For working with specialist departments at home and abroad, we require very good English language skills, both spoken and written. Ideally you also have very good German language skills.
- You are an organizational talent and impress with good communication and presentation skills at very different levels of the organizational hierarchy.
- You are a team player with a high level of social competence who can operate confidently in a global environment.
- We don't compromise on quality: you work in a results- and quality-oriented manner with high commitment, and possess good analytical and conceptual skills.
- You are flexible in thought and action, with a quick grasp and constructive assertiveness.

Our global team: Siemens Healthineers is a leading global medical technology company. 50,000 dedicated colleagues in over 70 countries are driven to shape the future of healthcare. An estimated 5 million patients across the globe benefit every day from our innovative technologies and services in the areas of diagnostic and therapeutic imaging, laboratory diagnostics and molecular medicine, as well as digital health and enterprise services.

Our culture: Our culture embraces different perspectives, open debate, and the will to challenge convention. Change is a constant aspect of our work. We aspire to lead the change in our industry rather than just react to it. That's why we invite you to take on new challenges, test your ideas, and celebrate success. Check our Careers Site at https://www.siemens-healthineers.com/de/careers

As an equal opportunity employer, we welcome applications from individuals with disabilities.

Posted 2 months ago

Apply

9.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality and value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Design, develop, and maintain scalable data pipelines on Databricks using PySpark
- Collaborate with data analysts and scientists to understand data requirements and deliver solutions
- Optimize and troubleshoot existing data pipelines for performance and reliability
- Ensure data quality and integrity across various data sources
- Implement data security and compliance best practices
- Monitor data pipeline performance and conduct necessary maintenance and updates
- Document data pipeline processes and technical specifications

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Mysore, Kolkata, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional Requirements:
- Must have 9+ years of experience in data engineering
- Proficiency with Databricks and Apache Spark
- Strong SQL skills and experience with relational databases
- Experience with big data technologies (e.g., Hadoop, Kafka)
- Knowledge of data warehousing concepts and ETL processes
- Experience with CI/CD tools, particularly Jenkins
- Excellent problem-solving and analytical skills
- Solid understanding of big data fundamentals and experience with Apache Spark
- Familiarity with cloud platforms (e.g., AWS, Azure)
- Experience with version control systems (e.g., Bitbucket)
- Understanding of DevOps principles and tools (e.g., CI/CD, Jenkins)
- Databricks certification is a plus

Preferred Skills: Technology-Big Data-Big Data - ALL; Technology-Cloud Integration-Azure Data Factory (ADF); Technology-Cloud Platform-AWS Data Analytics-AWS Data Exchange

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Architecture Principles
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Architecture Principles.
- Strong understanding of application development methodologies.
- Experience with database design and management.
- Familiarity with cloud computing concepts and services.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Architecture Principles.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies