3.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will develop solutions that enhance business operations and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions to meet business needs.
- Troubleshoot and debug applications to ensure optimal performance.
- Stay updated on industry trends and technologies to suggest improvements and enhancements.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and tools.
- Hands-on experience in developing and maintaining data pipelines.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will perform maintenance, enhancements, and/or development work in a dynamic environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze, design, and develop software solutions.
- Implement best practices in software development and ensure code quality.
- Participate in code reviews and provide constructive feedback to team members.
- Contribute to the continuous improvement of development processes and methodologies.
- Stay updated with the latest technologies and trends in software development.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Master Data Management & Architecture.
- Strong understanding of data modeling and data architecture.
- Experience with SAP data migration tools and techniques.
- Hands-on experience in SAP data governance and data quality management.
- Knowledge of SAP integration technologies and platforms.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP Master Data Management & Architecture.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Noida
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Adobe Experience Platform (AEP)
Good-to-have skills: Java Enterprise Edition
Minimum 12 year(s) of experience is required.
Educational Qualification: Minimum 15 years of continuous education

Key Responsibilities:
- Lead AEP-based project deliveries as architect for experience transformation, leveraging AEP with integrations to other platforms or legacy systems for industry-specific use cases such as Retail, Banking, etc.
- Solution Design: Collaborate with stakeholders to understand business requirements and translate them into scalable and efficient Adobe Experience Platform CDP solutions. Design data architecture, integration patterns, and workflows to optimize the collection, storage, and activation of customer data.

Technical Experience:
- Adobe AEP expertise, with more than 2 years of experience leading AEP-based implementations.
- Extensive experience as a technical architect or solution architect, with a focus on customer data platforms (CDP).
- Overall knowledge of capturing customer data from different data sources to aggregate and generate customer insights with analytics products.
- Knowledge and experience of offer decisioning and marketing activations through AJO, Target, etc.
- Experience in data-driven experience transformation.

Professional Attributes:
- Good verbal and written communication skills to connect with customers at varying levels of the organization.
- Ability to operate independently and make decisions with little direct supervision.
- Effective coordination and analytical skills.
- Leadership skills to lead a team of AEP and marketing specialists.

Educational Qualification: Minimum 15 years of continuous education
Additional Info: NA
Posted 1 month ago
15.0 - 20.0 years
18 - 22 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Amazon Web Services (AWS)
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the relevant components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, thereby enhancing the overall functionality and efficiency of the data architecture.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Evaluate and recommend new technologies and tools that can enhance the data platform's performance and scalability.

Professional & Technical Skills:
- Must-have skills: Proficiency in Amazon Web Services (AWS).
- Strong understanding of cloud architecture and data management principles.
- Experience with data modeling and database design.
- Familiarity with data integration tools and ETL processes.
- Knowledge of security best practices in cloud environments.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Amazon Web Services (AWS).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), AWS Glue, AWS Lambda Administration
Minimum 5 year(s) of experience is required.
Educational Qualification: Graduate

Summary: As a Snowflake Data Warehouse Architect, you will be responsible for leading the implementation of Infrastructure Services projects, leveraging our global delivery capability. Your typical day will involve working with Snowflake Data Warehouse, AWS Glue, AWS Lambda Administration, and the Python programming language.

Roles & Responsibilities:
- Lead the design and implementation of Snowflake Data Warehouse solutions for Infrastructure Services projects.
- Collaborate with cross-functional teams to ensure successful delivery of projects, leveraging AWS Glue and AWS Lambda Administration.
- Provide technical guidance and mentorship to junior team members.
- Stay updated with the latest advancements in Snowflake Data Warehouse and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-have skills: Strong experience in Snowflake Data Warehouse.
- Good-to-have skills: Proficiency in the Python programming language, AWS Glue, and AWS Lambda Administration.
- Experience in leading the design and implementation of Snowflake Data Warehouse solutions.
- Strong understanding of data architecture principles and best practices.
- Experience in data modeling, data integration, and data warehousing.
- Experience in performance tuning and optimization of Snowflake Data Warehouse solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly to support business operations. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with organizational goals.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based in Mumbai.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
a. 9+ years of experience in databases and data warehousing, with at least 4+ years on Snowflake.
b. Played a key role in data-related discussions with teams and clients to understand business problems and solutioning requirements.
c. Liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.

Technical Expertise:
a. Strong experience working as a Snowflake-on-cloud Data Architect, with thorough knowledge of the different services.
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines.
c. Good process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
e. Ability to suggest innovative solutions based on new technologies and the latest trends.
f. SnowPro Core certification.
g. Certified in any one cloud.
h. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
i. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
j. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
k. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
l. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
m. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
n. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.

Educational Qualification:
a. MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM.
Posted 1 month ago
15.0 - 20.0 years
9 - 13 Lacs
Navi Mumbai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work seamlessly together to support the organization's data needs and objectives. Your role will require you to analyze existing systems, propose improvements, and implement solutions that align with best practices in data management and governance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Monitor and evaluate the performance of data systems, making recommendations for enhancements and optimizations.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services.
- Good-to-have skills: Experience with data integration tools and techniques.
- Strong understanding of data modeling concepts and practices.
- Familiarity with cloud-based data storage solutions and architectures.
- Experience in implementing data governance frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As part of a Data Transformation programme, you will be part of the Data Marketplace team. In this team you will be responsible for the architecture and design for automating data management compliance validation, monitoring, and reporting through rule-based and AI-driven mechanisms, integrating with metadata repositories and governance tools for real-time policy enforcement. You will also deliver design specifications for real-time metadata integration, enhanced automation, audit logging, monitoring capabilities, and lifecycle management (including version control, decommissioning, and rollback). Experience with the implementation and adaptation of data management and data governance controls around Data Product implementations, preferably on AWS, is preferred; experience with AI is appreciated. Example skills: Data Architecture, Data Marketplace, Data Governance, Data Engineering, AWS DataZone, AWS SageMaker Unified Studio.
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: Proficiency in AWS Glue.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data warehousing concepts and best practices.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue.
- This position is based in Pune.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality frameworks.
- Ability to analyze and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 month ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: Microsoft Azure Analytics Services, Microsoft SQL Server
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the application development process
- Ensure successful project delivery

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services
- Good-to-have skills: Experience with Microsoft Azure Analytics Services
- Strong understanding of cloud-based data services
- Experience with Microsoft SQL Server
- Knowledge of data architecture and design principles

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 1 month ago
2.0 - 3.0 years
4 - 5 Lacs
Bengaluru
Work from Office
Req ID: 331754

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Sr. AlloyDB Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. You will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Competencies:
- Apply Disaster Recovery Knowledge
- Apply Information Analysis and Solution Generation Knowledge
- Apply Information Systems Knowledge
- Apply Internal Systems Knowledge
- IT - Design/Develop Application Solutions
- IT - Knowledge of Emerging Technology
- IT - Problem Management/Planning
- Technical Problem Solving and Analytical Processes
- Technical Writing

Job Requirements:
- Contribute to IS projects; conduct systems and requirements analyses to identify project action items.
- Perform analysis and design; participate in defining and developing technical specifications to meet systems requirements.
- Design and develop moderate to highly complex applications; analyze, design, code, test, correct, and document moderate to highly complex programs to ensure optimal performance and compliance.
- Develop application documentation; develop and maintain system documentation to ensure accuracy and consistency.
- Produce integration builds; define and produce integration builds to create applications.
- Perform maintenance and support; define and administer procedures to monitor systems performance and integrity.
- Support emerging technologies and products; monitor the industry to gain knowledge and understanding of emerging technologies.

Basic qualifications:
- 7+ years of experience in database engineering.
- 2-3 years of hands-on experience with AlloyDB.
1. Database Design and Architecture: Strong in database design with AlloyDB. Review a project's data requirements and create the database design. Plan database capacity and scalability strategies.
2. Performance Tuning and Optimization: Identify and resolve performance bottlenecks in AlloyDB databases. Optimize queries, indexing, and storage utilization.
3. Cloud Integration: Implement and manage AlloyDB in Google Cloud environments. Integrate AlloyDB with other GCP services (e.g., BigQuery, Pub/Sub, Dataflow).
4. Data Migration: Experience migrating data involving PostgreSQL. Knowledgeable about data migration strategies pertaining to AlloyDB.
5. Data Analytics: Hands-on experience with BigQuery.

Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and non-technical skills.
- Team Player: You want to see everyone on the team succeed and are willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and non-technical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
*This position is not open to employer sponsorship.*
#Launchjobs #LaunchEngineering
Posted 1 month ago
15.0 - 20.0 years
22 - 30 Lacs
Gurugram
Work from Office
We are seeking an experienced and visionary Data Architect with over 15 years of experience to lead the design and implementation of scalable, secure, and high-performing data architectures. The ideal candidate should have a deep understanding of cloud-native architectures, enterprise data platforms, and end-to-end data lifecycle management. You will work closely with business, engineering, and product teams to craft robust data solutions that drive business intelligence, analytics, and AI initiatives.

Key Responsibilities:
- Design and implement enterprise-grade data architectures using cloud platforms (e.g., AWS, Azure, GCP).
- Lead the definition of data architecture standards, guidelines, and best practices.
- Architect scalable data solutions, including data lakes, data warehouses, and real-time streaming platforms.
- Collaborate with data engineers, analysts, and data scientists to understand data requirements and deliver optimal solutions.
- Oversee data modeling activities, including conceptual, logical, and physical data models.
- Ensure data security, privacy, and compliance with applicable regulations (e.g., GDPR, HIPAA).
- Define and implement data governance strategies in collaboration with stakeholders.
- Evaluate and recommend data-related tools and technologies.
- Provide architectural guidance and mentorship to data engineering teams.
- Participate in client discussions, pre-sales, and proposal building (if in a consulting environment).

Required Skills & Qualifications:
- 15+ years of experience in data architecture, data engineering, or database development.
- Strong experience architecting data solutions on at least one major cloud platform (AWS, Azure, or GCP).
- Deep understanding of data management principles, data modeling, ETL/ELT pipelines, and data warehousing.
- Hands-on experience with modern data platforms and tools (e.g., Snowflake, Databricks, BigQuery, Redshift, Synapse, Apache Spark).
- Proficiency with programming languages such as Python, SQL, or Java.
- Familiarity with real-time data processing frameworks like Kafka, Kinesis, or Azure Event Hubs.
- Experience implementing data governance, data cataloging, and data quality frameworks.
- Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus.
- Excellent problem-solving, communication, and stakeholder management skills.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Cloud Architect or Data Architect certification (AWS/Azure/GCP) is a strong plus.

Preferred Certifications:
- AWS Certified Solutions Architect - Professional
- Microsoft Certified: Azure Solutions Architect Expert
- Google Cloud Professional Data Engineer
- TOGAF or equivalent architecture frameworks

What We Offer:
- A collaborative and inclusive work environment
- Opportunity to work on cutting-edge data and AI projects
- Flexible work options
- Competitive compensation and benefits package

EXPERIENCE: 16-18 Years
SKILLS: Primary Skill: Data Architecture; Sub Skill(s): Data Architecture; Additional Skill(s): Data Architecture
Posted 1 month ago
15.0 - 20.0 years
25 - 30 Lacs
Pune
Work from Office
Role Purpose: This is a critical role within the ERP architecture function, focusing on overall master data including Business Partner (customer and vendor data), finance, material, etc. As the global custodian of enterprise master data architecture, the role holder will drive data integrity and consistency across the organization. They will collaborate closely with senior stakeholders, IT, integration teams, and the Digital Transformation group to assess and manage changes to master data architecture. This role requires a strategic mindset to anticipate the impact of data modifications on various business processes and systems. The successful candidate will play a crucial part in ensuring sound enterprise data architecture, data quality, and compliance, ultimately supporting informed decision-making and business growth. Accountabilities: Lead the data architecture design across multiple SAP products and instances along with non-SAP systems. Define data governance strategies, master data management (MDM) standards, and metadata standards, and provide leadership for the execution of programs using these documented strategies. Architect data migration strategies for SAP S/4HANA projects and major programs, including ETL, data validation, and reconciliation. Develop data models (conceptual, logical, and physical) aligned with business and technical requirements. Collaborate with functional and technical teams to ensure master data flows across SAP and non-SAP platforms. Establish data quality frameworks and monitoring practices, and drive their adoption with the owning teams. Lead the orchestration of data for usage in AI frameworks and visualisation platforms. Work with the integration team to implement and optimize integration solutions that connect SAP and non-SAP systems and ensure seamless master data flow. Experience evaluating new technologies to improve master data quality and architecture. Experience with compliance regulations such as SOX and GDPR.
Experience with MDM/MDG implementations as a data/enterprise architect. Familiarity with new MDG cloud and AI-based services. Ability to lead and influence stakeholders to drive data-related initiatives. Exposure to data frameworks such as TOGAF, DAMA, and Zachman. Identify existing master data tools (including third-party tools like DnB and Address Doctor) and exploit them to their full potential. Ability to translate complex business requirements into designs. Experience managing business escalations. Exposure to mergers and acquisitions, ensuring the data strategy is aligned with Syngenta architecture. Commitment to staying current with evolving technologies and industry best practices. Willingness to acquire new skills and knowledge as required by this role. Self-motivated, with the ability to work independently and take initiative. Required Capabilities / Skills: Familiarity with data architecture frameworks and best practices. Strong communication skills and the ability to explain complex concepts to non-technical stakeholders. Must be able to collaborate with cross-functional teams to solve complex problems and develop innovative solutions. Must have good troubleshooting skills and be able to help others with their work. Familiarity with the latest generative AI models. Good understanding of data integration technologies. Good understanding of API-first architecture. Good understanding of enterprise data projects (data warehouses, master data management, etc.) with hands-on data analysis. Experience with Agile and DevOps processes. Must be able to deal with ambiguity and adapt to changing business needs. A good team player, self-motivated and committed, with an ability to work under pressure and meet tight deadlines. Ability to work independently, with knowledge of all aspects of the software development lifecycle. Experience Requirements: 18-20 years of experience managing master data processes and architecture.
Proven track record in complex multinational environments. Demonstrated ability to implement and optimize data governance strategies. Good understanding of the data architecture discipline, processes, concepts, and analytics best practices. Understanding of data governance, security, privacy, and compliance requirements for an enterprise. Strong knowledge of industry best practices around modern data architecture. A background in SAP/ERP, SAP MDG, BODS, etc. is preferable.
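Data quality frameworks for master data of this kind typically begin with duplicate detection across Business Partner records. A minimal sketch in Python — the record layout, IDs, and similarity threshold are illustrative assumptions, not any specific tool's matching logic:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] between two case- and whitespace-normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(records, threshold=0.85):
    """Flag candidate duplicate pairs among business-partner names."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = similarity(records[i]["name"], records[j]["name"])
            if score >= threshold:
                pairs.append((records[i]["id"], records[j]["id"], round(score, 2)))
    return pairs

# Hypothetical Business Partner master records.
partners = [
    {"id": "BP001", "name": "Acme Industries Ltd"},
    {"id": "BP002", "name": "ACME Industries Ltd."},
    {"id": "BP003", "name": "Globex Corporation"},
]
print(find_duplicates(partners))
```

Real MDM tools layer address standardization, phonetic matching, and survivorship rules on top of this kind of pairwise scoring; the threshold here is a stand-in for a tuned match rule.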
Posted 1 month ago
14.0 - 20.0 years
45 - 50 Lacs
Bengaluru
Work from Office
Job Description: Job Title: Data Technical Lead. As the Data Management Platform (DMP) Technical Lead, you will be responsible for embedding a world-class product development and engineering culture and organization. You will work with development, architecture, and operations as well as platform teams to ensure we are delivering a best-in-class technology solution. You will work closely with the Business Platform Owner to ensure an integrated end-to-end view across people and technology for the Business Platform. You will also champion the platform, working with stakeholders across the enterprise to ensure we are developing the right solutions. In parallel, you will focus on building a high-performing team that will thrive in a fast-paced, continuous-delivery engineering environment. The role involves architecting, designing, and delivering solutions in a tool stack including Informatica MDM SaaS, Informatica Data Quality, Collibra Data Governance, and other data tools. Key responsibilities: Shape technical strategy (e.g., build-vs-buy decisions, technical road-mapping) in collaboration with architects. Evaluate and identify appropriate technology stacks, platforms, and vendors, including web application frameworks and cloud providers, for solution development. Attend team ceremonies as required, in particular feature refinement and cross-team iteration reviews/demos. Drive the resolution of technical impediments. Own the success of foundational enablers. Champion research and innovation. Lead scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning, and OKR writing. Collaborate with the Platform Owner in the writing and prioritization of technical capabilities and enablers. Present platform delivery metrics, OKR health, and platform finance status to executive audiences. Collaborate with other Technical Leads. Create and maintain the technical roadmap for in-scope products and services at the platform/portfolio level. Key Experience: B.E./B.Tech or equivalent engineering professional. A Master's degree or equivalent experience in Marketing, Business, or Finance is an added advantage. 10+ years
of experience in technical architecture, solution design, and platform engineering. Strong experience in MDM, Data Quality, and Data Governance practices, including tool stacks such as Informatica MDM SaaS, Informatica Data Quality, and Collibra, is a plus. Good experience with major cloud platforms and data tools in the cloud, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools. Good experience in ETL and BI solution development and tool stack; Informatica ETL experience is a plus. Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts. Familiarity with agile methodologies and data factory operations processes, including tools like Confluence, Jira, and Miro. Strong knowledge of industry standards and regulations: a data platform owner should have knowledge of industry standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR. Proven experience working in financial services, preferably the insurance space. Experience in senior engineering and technology roles, working with teams to build and deliver digital products. Experience in providing guidance and insight to establish governance processes, direction, and control, to ensure objectives are achieved and risks are managed appropriately for product development. A leader with a track record of onboarding and developing engineering and product teams. Experience as a technology leader who has defined and implemented technical strategies within complex organizations and is able to influence and contribute to the higher-level engineering strategy. Has insight into the newest technologies and trends and is an expert in product development, with experience in code delivery and management of full-stack technology. Experience in digital capabilities such as DevSecOps, CI/CD, and agile release management. Wide experience and understanding of architecture in terms of solution, data, and integration. Can provide direct day-to-day engineering and technology problem-solving, coaching, direction, and guidance to Technical Leads and Senior Technical Leads within their platform. Strong leadership skills with an ability to influence a diverse group of stakeholders. Ability to influence technical strategy at the BG and enterprise level. Experience working in agile teams with a strong understanding of agile ways of working. Experience managing technical priorities within an evolving product backlog. Understands how to decompose large technical initiatives into actionable technical enablers. Experience in the continuous improvement of software development workflows and pipelines. Proven leadership, with the ability to articulate ideas to both technical and non-technical audiences. Ability to communicate strategy and objectives and align organizations to a common goal. A strong problem solver with the ability to lead the team to push the solution forward. Ability to empower teams and encourage collaboration. Ability to inspire people and teams, building momentum around a vision. A critical thinker with a passion for challenging the status quo to find new solutions and drive out-of-the-box ideas. Believes in a non-hierarchical culture of collaboration, transparency, and trust across the team. An experimental mindset to drive innovation and continuous improvement of the team.
Posted 1 month ago
8.0 - 13.0 years
7 - 14 Lacs
Pune
Hybrid
Key Skills: Application Architecture, Cloud Architecture, Data Architecture, Integration Architecture, Java, SOA, and Microservices-Based Architecture. Roles and Responsibilities: Develop web applications using Java Spring Boot and Angular 2. Create and maintain codebase and database structures. Debug, troubleshoot, and optimize code to improve performance and scalability. Collaborate with other developers to ensure code standards and best practices are followed. Design and create RESTful APIs. Perform unit and integration testing. Develop and deliver high-quality, scalable web applications. Document coding decisions, development processes, and technical specifications. Research and implement new technologies to enhance web applications. Monitor web application security and provide recommendations for improvements. Provide technical support and assistance for existing web applications. Ensure all web applications are compliant with industry standards and best practices. Experience Requirements: 8 to 15 years of software development experience. Proven experience in Java, Spring Boot, and Angular 2. Experience with web application security and performance optimization. Experience working with web development libraries and frameworks. Working knowledge of Agile and Scrum methodologies. Excellent communication and problem-solving skills. Ability to collaborate with cross-functional teams. Ability to work independently and in a team environment. Strong attention to detail and organizational skills. Education: B.E., B.Tech, B. Sc.
Posted 1 month ago
10.0 - 15.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Job Summary: We are seeking an experienced and detail-oriented Lead Data Modeler to design, implement, and manage robust data models that support our enterprise data warehouse and analytics ecosystem. The ideal candidate will have deep expertise in data analysis, dimensional modeling, and database design, with strong hands-on experience using tools like ERwin to create scalable, optimized, and maintainable data structures. Key Responsibilities: Analyze business requirements and translate them into logical and physical data models. Design and develop enterprise-level conceptual, logical, and physical data models for OLTP and OLAP systems. Build and maintain dimensional models (star/snowflake schemas) to support business intelligence and reporting needs. Lead the data modeling and data architecture efforts for large-scale data warehouse and data integration projects. Define data standards, naming conventions, metadata, and data lineage documentation. Work closely with business analysts, data engineers, and application developers to ensure alignment of data structures with business needs. Collaborate with DBAs and ETL developers to implement models in physical databases. Use ERwin (or similar data modeling tools) to create and manage models and reverse-engineer existing structures. Ensure data models are optimized for performance, scalability, and data integrity. Participate in data governance initiatives and data quality improvement projects. Required Skills & Qualifications: 8-12 years of experience in data modeling, data architecture, and database design. Strong experience with dimensional modeling, data warehouse architecture, and data mart development. Strong Proficiency in ERwin Data Modeler (or equivalent tools such as PowerDesigner). Solid understanding of relational databases (e.g., SQL Server, Oracle, PostgreSQL) and data warehouse platforms. Strong SQL skills and ability to analyze and understand complex data sets. 
Experience with data integration, ETL, and data governance principles. Ability to manage multiple priorities in a fast-paced environment and work collaboratively across teams. Strong communication and documentation skills. Preferred Qualifications: Experience in cloud-based data platforms (e.g., Azure Synapse, Snowflake, AWS Redshift). Familiarity with Data Vault modeling or NoSQL modeling is a plus. Experience in Agile/Scrum environments and using version control tools (e.g., Git).
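The star-schema dimensional models described above can be illustrated with a toy sales mart; the table and column names are hypothetical, and sqlite3 stands in for the warehouse platform:

```python
import sqlite3

# Illustrative star schema: one fact table keyed to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT, year INTEGER, month INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku TEXT, category TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER, amount REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-42', 'Widgets')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# Typical BI query: join the fact table to its dimensions, aggregate by category.
row = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    WHERE d.year = 2024
    GROUP BY p.category
""").fetchone()
print(row)  # ('Widgets', 29.97)
```

A snowflake variant would further normalize `dim_product` (e.g. a separate category table); the star form shown here trades some redundancy for simpler, faster BI joins.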
Posted 1 month ago
1.0 - 5.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Join us as a Data Engineer, PySpark. This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative data-driven solutions, using insight to be commercially successful, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support the bank's strategic direction while building your network across the bank. What you'll do: As a Data Engineer, you'll play a key role in driving value for our customers by building data solutions. You'll be carrying out data engineering tasks to build, maintain, test, and optimise a scalable data architecture, as well as carrying out data extractions, transforming data to make it usable to data analysts and scientists, and loading data into data platforms. You'll also be: Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development. Practicing DevOps adoption in the delivery of data engineering, proactively performing root cause analysis and resolving issues. Collaborating closely with core technology and architecture teams in the bank to build data knowledge and data solutions. Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions. Sourcing new data using the most appropriate tooling and integrating it into the overall solution to deliver for our customers. The skills you'll need: To be successful in this role, you'll need a good understanding of data usage and dependencies with wider teams and the end customer, as well as experience of extracting value and features from large-scale data. You'll need at least five years of experience in PySpark, Python, SQL, CI/CD, GitLab, and AWS.
You'll also demonstrate: Experience of ETL technical design, including data quality testing, cleansing and monitoring, and data warehousing and data modelling capabilities. Experience of using programming languages alongside knowledge of data and software engineering fundamentals. Good knowledge of modern code development practices. Strong communication skills with the ability to proactively engage with a wide range of stakeholders. Hours: 45. Job Posting Closing Date: 11/07/2025
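The data quality testing and cleansing mentioned above can be sketched as follows — plain Python stands in for the equivalent PySpark DataFrame transformations, and the record layout and column names are illustrative assumptions:

```python
# Hypothetical raw extract: whitespace in keys, mixed-case emails, bad numerics.
raw_rows = [
    {"customer_id": " 101 ", "email": "A@EXAMPLE.COM", "balance": "250.00"},
    {"customer_id": "102",   "email": None,            "balance": "abc"},
    {"customer_id": " 101 ", "email": "A@EXAMPLE.COM", "balance": "250.00"},  # duplicate
]

def cleanse(row):
    """Trim keys, normalize email case, coerce balance to float (None if invalid)."""
    try:
        balance = float(row["balance"])
    except (TypeError, ValueError):
        balance = None
    return {
        "customer_id": row["customer_id"].strip(),
        "email": row["email"].lower() if row["email"] else None,
        "balance": balance,
    }

cleansed = [cleanse(r) for r in raw_rows]

# Simple quality metrics: null rate per column, and duplicate-key count.
null_rate = {
    col: sum(r[col] is None for r in cleansed) / len(cleansed)
    for col in cleansed[0]
}
seen, duplicates = set(), 0
for r in cleansed:
    key = (r["customer_id"], r["email"])
    duplicates += key in seen
    seen.add(key)
print(null_rate, duplicates)
```

In PySpark the same checks map onto `withColumn`/`cast` for cleansing and aggregations over `isNull` and `dropDuplicates` for the metrics; monitoring then means publishing these numbers per load rather than computing them once.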
Posted 1 month ago
3.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
We have an exciting and rewarding opportunity for you to take your AI/ML career to the next level. As a Senior Software Engineer in the Self Service Enablement team, you will lead the development and innovation of agentic applications using LLM technologies. You will work closely with a talented team to design scalable, resilient applications with strong observability, contributing to JPMorgan Chase's mission of delivering exceptional self-service solutions. Your role will be pivotal in driving innovation and enhancing the user experience, making a difference in the lives of our clients and the wider community. Job Responsibilities: Develop solutions related to data architecture, the ML platform, and GenAI platform architecture; provide tactical solution and design support to the team; and embed with engineering on the execution and implementation of processes and procedures. Serve as a subject matter expert on a wide range of ML techniques and optimizations. Provide in-depth knowledge of distributed ML platform deployment, including training and serving. Create curative solutions using GenAI workflows through advanced proficiency in large language models (LLMs) and related techniques. Build a Generative AI evaluation and feedback loop for GenAI/ML pipelines. Be hands-on with code and design to bring experimental results into production solutions, collaborating with the engineering team. Own end-to-end code development in Python/Java for both proof-of-concept/experimentation and production-ready solutions. Optimize system accuracy and performance by identifying and resolving inefficiencies and bottlenecks, and collaborate with product and engineering teams to deliver tailored, science- and technology-driven solutions. Drive decisions that influence the product design, application functionality, and technical operations and processes.
Required Qualifications, Capabilities, and Skills: Formal training or certification on AI/ML concepts and 3+ years of applied experience. Solid understanding of ML techniques, especially in Natural Language Processing (NLP) and Large Language Models (LLMs). Hands-on experience with machine learning and deep learning methods. Good understanding of deep learning frameworks such as PyTorch or TensorFlow. Experience in advanced applied ML areas such as GPU optimization, finetuning, embedding models, inferencing, prompt engineering, evaluation, and RAG (similarity search). Deep understanding of Large Language Model (LLM) techniques, including Agents, Planning, Reasoning, and other related methods. Practical cloud-native experience, such as AWS. Preferred Qualifications, Capabilities, and Skills: Experience with Ray, MLflow, and/or other distributed training frameworks. In-depth understanding of embedding-based search/ranking, recommender systems, graph techniques, and other advanced methodologies. Experience with building and deploying ML models on cloud platforms such as AWS, and AWS tools like SageMaker. Exposure to agentic frameworks such as LangChain, LangGraph, RASA, Parlant, and Decagon.
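The RAG similarity-search step listed in the qualifications can be sketched with toy vectors standing in for the output of a real embedding model; this is an illustrative outline, not the team's actual implementation, and the document names are hypothetical:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy "embeddings" standing in for vectors from a real embedding model.
documents = {
    "reset password": [0.9, 0.1, 0.0],
    "update billing": [0.1, 0.9, 0.1],
    "close account":  [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Rank documents by cosine similarity to the query embedding, return top k."""
    ranked = sorted(documents, key=lambda d: cosine(query_vec, documents[d]),
                    reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))
```

In a production pipeline the dictionary would be a vector store with approximate nearest-neighbour search, and the retrieved passages would be injected into the LLM prompt — the ranking step itself is exactly this comparison.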
Posted 1 month ago
12.0 - 15.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Req ID: 323745. NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data & AI Technical Solution Architect to join our team in Hyderabad, Telangana (IN-TG), India (IN). Job Duties: The Data & AI Architect is a seasoned expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions on all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective of this role is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure. Key Responsibilities: Ability and experience to hold conversations with the CEO, business owners, and CTO/CDO. Break down intricate business challenges, devise effective solutions, and focus on client needs. Craft high-level, innovative solution approaches for complex business problems. Utilize best practices and creativity to address challenges. Leverage market research, formulate perspectives, and communicate insights to clients. Establish strong client relationships. Interact at appropriate levels to ensure client satisfaction. Knowledge and Attributes: Ability to focus on detail with an understanding of how it impacts the business strategically. Excellent client service orientation. Ability to work in high-pressure situations. Ability to establish and manage processes and practices through collaboration and an understanding of the business. Ability to create new and repeat business for the organization.
Ability to contribute information on relevant vertical markets. Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes, and tools. Minimum Skills Required: Academic Qualifications and Certifications: BE/BTech or equivalent in Information Technology and/or Business Management or a related field. Scaled Agile certification desirable. Relevant consulting and technical certifications preferred, for example TOGAF. Required Experience: 12-15 years of seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment. Very good understanding of Data, AI, GenAI, and Agentic AI. Must have data architecture and solutioning experience, and be capable of end-to-end data architecture and GenAI solution design. Must be able to work on Data & AI RFP responses as a Solution Architect. 10+ years of experience in solution architecting of Data & Analytics, AI/ML, and GenAI as a Technical Architect. Develop cloud-native technical approaches and proposal plans identifying the best-practice solutions that meet the requirements for a successful proposal. Create, edit, and review documents, diagrams, and other artifacts in response to RFPs/RFQs, and contribute to and participate in presentations to customers regarding proposed solutions. Proficient with Snowflake, Databricks, Azure, AWS, GCP cloud, and data engineering & AI tools. Experience with large-scale consulting and program execution engagements in AI and data. Seasoned multi-technology infrastructure design experience. A seasoned, demonstrable level of expertise coupled with consulting and client engagement experience, demonstrating good experience in client needs assessment and change management. Additional Career Level Description: Knowledge and application: Seasoned, experienced professional; has complete knowledge and understanding of the area of specialization. Uses evaluation, judgment, and interpretation to select the right course of action. Problem solving: Works on problems of diverse scope where analysis of information requires evaluation of identifiable factors. Resolves and assesses a wide range of issues in creative ways and suggests variations in approach. Interaction: Enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.
Posted 1 month ago
6.0 - 11.0 years
20 - 25 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Responsibilities Work closely with Stakeholders to understand the requirements and apply analytics and visualization to achieve business objectives Engineer end-to-end solutions including data acquisition, data modeling, table creation and building dashboards and reports. Design and develop Tableau reports and dashboards that will yield actionable insights that present the answers to business questions Code and modify SQL/ETL based on dashboard requirements Analyze unstructured / Semi structured data and derive insights Run ad-hoc analysis for Product and Business Managers using standard query languages and operationalize for repeatable use via Tableau reporting suite Ensure designs support corporate IT strategy, established technical standards, and industry best practices Provide technical guidance to project and is responsible for developing and presenting design artifacts within the business and Development teams Identify project issues/risks and present alternatives to alleviate or resolve Core Competencies Strong quantitative and analytical skills - ability to quickly analyze data to identify key insights and apply them to the business Strong visualization design and development experience with Tableau (and other Business Intelligence tools like PowerBI) Experience leading analysis, architecture, design and development of business intelligence solutions and using next generation data platforms Experience developing test strategies for data-centric applications, in Agile methodologies and in diagnosing complex technical issues Strong understanding of architectural standards and software development methodologies, expertise in industry best practices in data architecture and design Excellent communication skills, including ability to present effectively to both business and technical audiences at all levels of the organization Who You are? 
Qualifications: Bachelor's degree in Engineering, Computer Science, or a related field. 6+ years of experience using business intelligence reporting tools, developing data visualizations, and mastery of Tableau for the creation and automation of enterprise-scale dashboards. 6+ years of experience writing advanced SQL, performance tuning of BI queries, data modeling, and data mining from multiple sources (SQL, ETL, data warehousing). Experience performance-tuning Tableau Server dashboards to minimize rendering time. Experience with data preparation/blending and ETL tools such as Alteryx, Talend, or Tableau Prep. Good knowledge of Tableau metadata tables and PostgreSQL server reporting. Experience in any programming language, such as Python or R, would be a plus. Exposure to Snowflake or any cloud data warehouse architecture would be an added advantage. Possess a strong foundation in data analytics and an understanding of and exposure to data science. Strong knowledge of data visualization and data warehouse best practices. Certification in Tableau, Alteryx, or Snowflake would be a plus.
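The advanced SQL called for above often comes down to window functions for ad-hoc analysis. A small sketch, with sqlite3 standing in for the warehouse and a hypothetical orders table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("East", "2024-01", 100.0), ("East", "2024-02", 150.0),
     ("West", "2024-01", 80.0),  ("West", "2024-02", 60.0)],
)

# Month-over-month revenue change per region, via the LAG window function.
rows = conn.execute("""
    SELECT region, month,
           revenue - LAG(revenue) OVER (
               PARTITION BY region ORDER BY month
           ) AS mom_change
    FROM orders
    ORDER BY region, month
""").fetchall()
for r in rows:
    print(r)
```

The first month in each partition has no prior row, so its change is NULL; a query like this is typically the data source behind a Tableau trend dashboard rather than a calculation done in the visualization layer.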
Posted 1 month ago
7.0 - 11.0 years
20 - 25 Lacs
Noida, Kolkata, Pune
Work from Office
Proficient in application, data, and infrastructure architecture disciplines. Advanced knowledge of architecture, design, and business processes. Hands-on experience with AWS. Proficiency in modern programming languages such as Python and Scala. Expertise in Big Data technologies like Hadoop, Spark, and PySpark. Experience with deployment tools for CI/CD, such as Jenkins. Design and develop integration solutions involving Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions. Apply system development lifecycle methodologies, such as Waterfall and Agile. Understand and implement data architecture and modeling practices, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling. Utilize knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions. Work collaboratively in teams to develop meaningful relationships and achieve common goals. Strong analytical skills with deep expertise in SQL. Solid understanding of Big Data concepts, particularly with Spark and PySpark/Scala. Experience with CI/CD using Jenkins. Familiarity with NoSQL databases. Excellent communication skills.
Posted 1 month ago
6.0 - 10.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Specialism: SAP. Management Level: Senior Associate. Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. We are seeking a skilled and experienced Azure DevOps expert to join our dynamic team. As an Azure DevOps expert, you will be responsible for designing, implementing, and maintaining our DevOps infrastructure and processes on the Microsoft Azure platform. You will play a crucial role in optimizing our software development lifecycle, ensuring efficient collaboration, and delivering high-quality products to our customers. Responsibilities: Design and implement Azure DevOps strategies and solutions aligned with company goals. Configure and manage Azure DevOps services, including Azure Pipelines, Azure Repos, Azure Boards, and Azure Artifacts.
Develop and maintain CI/CD pipelines for building, testing, and deploying applications on Azure. Automate infrastructure provisioning, configuration management, and application deployments using Azure DevOps tools and scripting languages. Define and enforce code quality standards, best practices, and security measures within the DevOps environment. Monitor and analyze DevOps performance metrics to identify areas for improvement and optimization. Collaborate with development teams to integrate DevOps practices and tools into their workflows. Stay up to date on the latest Azure DevOps features, best practices, and industry trends. Provide technical guidance and mentorship to team members on Azure DevOps concepts and practices. Mandatory skill sets: CI/CD pipeline setup and management using Azure DevOps; Infrastructure as Code (IaC) using ARM templates, Terraform, or Bicep. Preferred skill sets: Azure Monitor, Application Insights, Log Analytics. Years of experience required: 6-10 years. Education qualification: Bachelor's degree in Computer Science, Engineering, or a related field. Degrees/Field of Study required: Bachelor of Engineering. Required Skills: Microsoft Azure. Additional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}. Travel Requirements Government Clearance Required?
Posted 1 month ago
5.0 - 10.0 years
10 - 11 Lacs
Pune
Work from Office
Job Overview:
We are seeking a skilled Data Solution Architect to design solutions and lead implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Communicate complex ideas and concepts effectively, both verbally and in writing.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Very strong BigQuery design and development skills.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling with any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.
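As one concrete illustration of the BigQuery design skills this role calls for: warehouse fact tables are commonly partitioned by a date column and clustered on a frequently filtered key to limit bytes scanned. The dataset, table, and column names below are hypothetical, not from the posting.

```sql
-- Illustrative BigQuery DDL only; all names are invented for the example.
-- Partitioning by order date plus clustering on customer_id means queries
-- that filter on those columns scan only the relevant partitions/blocks.
CREATE TABLE IF NOT EXISTS sales_dw.fact_orders (
  order_id    STRING    NOT NULL,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id;
```

Choices like the partition column and clustering keys are exactly the kind of design decision the "BigQuery design, development" requirement refers to.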
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
At Modern, we are seeking highly motivated and focused Data Engineers with a proven consulting background to join our Client Technology Team. As a lead data engineer, you will act as a subject matter expert for our Enterprise Data Product Platform, DataOS. In that capacity, you will provide technical expertise in needs identification, data modeling, and data movement, and translate business needs into technical solutions.

Responsibilities:
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models from those needs.
- Work closely with various stakeholders, including business teams, data stewards, and data consumers, to define the data product roadmap and bring it to fruition.
- Design and build scalable, fault-tolerant solutions optimized for distributed computing and large-scale data processing using DataOS.
- Optimize Spark-based workflows for performance, scalability, and cost efficiency.
- Understand and leverage best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges.
- Actively participate with other consultants in problem-solving and approach development.
- Work closely with Business Analysts and Solution Architects to design data models that satisfy business needs and adhere to the Enterprise Architecture.
- Coordinate with Data Architects and Program Managers and participate in recurring meetings.

Requirements:
- Bachelor's degree in computer science, business analytics, or a similar discipline.
- 5+ years of experience in data engineering, architecture, or analytics roles.
- Strong working knowledge of Spark, Python, SQL, and API integration frameworks is a must.
- Working experience with modern data architecture and modeling concepts, including cloud data lakes, data warehouses, and data marts.
- Familiarity with dimensional modeling, star schemas, and real-time/batch ETL pipelines, including experience with data streaming (Kafka).
- Expertise in ensuring data quality and reliability using monitoring and validation techniques.
- Working experience with tools such as DBT, Airflow, or similar is a plus.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and analytical or NoSQL stores (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.

Benefits - Working at Modern:
First and foremost, our value system is sacred to us: HEAT - Humility, Empathy, Accountability, and Transparency. Beyond this, we are fond of individuals who are curious, love solving problems, and can see the larger picture. We love to take a leap of faith on potential. If you believe you haven't had the chance to do your life's best work, Modern is the place for you. Modern embraces competition for great talent. We have been able to get great talent on board owing to the attractive compensation and benefits we provide, in addition to the upside we share with all our employees in the form of ESOPs. Moreover, our ESOP policies are highly employee-friendly, replicating the ethos of some of the best Silicon Valley tech startups. We are committed to making sure our employees create significant value for themselves.
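The data quality and validation expertise this posting asks for can be sketched in plain Python. The function and field names below are illustrative assumptions, not part of DataOS or any specific framework; real pipelines would typically express the same checks in a tool like DBT tests or a Spark job.

```python
# Minimal sketch of a row-level data quality gate: rows missing required
# fields are quarantined instead of flowing downstream. All names here
# (validate_rows, order_id, amount) are invented for the example.
def validate_rows(rows, required_fields):
    """Split rows into (valid, rejected) based on non-null required fields."""
    valid, rejected = [], []
    for row in rows:
        # Treat None and empty string as missing; 0 and 0.0 are valid values.
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected


rows = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": None, "amount": 5.0},
]
valid, rejected = validate_rows(rows, ["order_id", "amount"])
```

Running this splits the two sample rows into one valid record and one rejected record that names its missing field, which is the basic shape of a validation/monitoring check regardless of the engine used.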
Posted 1 month ago