
1814 Data Architecture Jobs - Page 21

JobPe aggregates results for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

19 - 25 Lacs

Hyderabad

Work from Office

Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.

- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile, high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities

- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work-intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root-cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture that promotes automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications

- 5+ years of technology work experience in a large-scale global organization; CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
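
The "self-healing" and "automated issue remediation" duties in this listing typically come down to a retry-with-alerting pattern around each pipeline step. A minimal, generic sketch in Python (the step and alert hook here are hypothetical, not taken from the posting):

```python
import time

def run_with_retries(step, max_attempts=3, base_delay=0.0, on_alert=None):
    """Run a pipeline step, retrying transient failures with backoff.

    `step` is any zero-argument callable; `on_alert` (optional) receives
    the attempt number and exception so a monitoring hook can fire
    before the retry.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            if on_alert:
                on_alert(attempt, exc)  # e.g. push to a monitoring system
            if attempt == max_attempts:
                raise  # retries exhausted: surface for incident response
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff

# Example: a flaky step that succeeds on the third try.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

result = run_with_retries(flaky, max_attempts=5)
```

In a real Azure deployment the retry policy would usually live in ADF activity settings or orchestration code rather than a hand-rolled wrapper; the sketch only illustrates the idea.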

Posted 1 month ago

Apply

8.0 - 12.0 years

2 - 6 Lacs

Bengaluru

Work from Office

Job Title: Hadoop Engineer
Experience: 8-12 Years
Location: Bangalore

- DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools like Jenkins and GitOps to automate and streamline the software development lifecycle.
- Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.
- Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.
- Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.
- Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.
- Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.
- Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.
- Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.

Technical Skills:
- Proficiency in CI/CD tools such as Jenkins and GitOps.
- Strong experience with containerization and orchestration tools like Kubernetes and OpenShift.
- Knowledge of big data technologies such as Hadoop, Spark, and ETL frameworks.
- Proficiency in scripting languages such as Python, Bash, or Groovy.
- Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible.
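
At the core of the CI/CD and orchestration work described above is running dependent stages in the right order. A toy dependency resolver in Python (the stage names are illustrative, not from the posting; real pipelines would be declared in Jenkins or an orchestrator):

```python
def resolve_order(stages):
    """Topologically sort pipeline stages given {stage: [dependencies]}."""
    order, seen, in_progress = [], set(), set()

    def visit(stage):
        if stage in seen:
            return
        if stage in in_progress:
            raise ValueError(f"cycle involving {stage!r}")
        in_progress.add(stage)
        for dep in stages.get(stage, []):
            visit(dep)  # dependencies must run first
        in_progress.discard(stage)
        seen.add(stage)
        order.append(stage)

    for s in stages:
        visit(s)
    return order

# Illustrative stages for a build-and-deploy pipeline.
pipeline = {
    "deploy": ["test", "package"],
    "package": ["build"],
    "test": ["build"],
    "build": [],
}
order = resolve_order(pipeline)
```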

Posted 1 month ago

Apply

10.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office

Job Title: Data Architect
Experience: 10-12 Years
Location: Chennai

- 10-12 years of experience as a Data Architect.
- Strong expertise in streaming data technologies like Apache Kafka, Flink, Spark Streaming, or Kinesis.
- Proficiency in programming languages such as Python, Java, Scala, or Go.
- Experience with big data tools like Hadoop and Hive, and data warehouses such as Snowflake, Redshift, Databricks, and Microsoft Fabric.
- Proficiency in database technologies (SQL, NoSQL, PostgreSQL, MongoDB, DynamoDB, YugabyteDB).
- Should be flexible to work as an individual contributor.
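
A common building block across the streaming technologies listed above (Kafka Streams, Flink, Spark Streaming) is the tumbling-window aggregation. A plain-Python sketch of the concept, with illustrative event names that are not from the posting:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows.

    Mirrors the tumbling-window aggregation offered by streaming engines,
    written in plain Python purely for clarity.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[window_start][key] += 1
    return {w: dict(keys) for w, keys in counts.items()}

# Events as (epoch-second, event-type) pairs.
events = [(0, "click"), (3, "click"), (7, "view"), (11, "click")]
windows = tumbling_window_counts(events, window_seconds=10)
```

Real engines add watermarking and late-event handling on top of this basic grouping; those concerns are omitted here.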

Posted 1 month ago

Apply

10.0 - 15.0 years

4 - 8 Lacs

Noida

Work from Office

Highly skilled and experienced Data Modeler to join the Enterprise Data Modelling team. The candidate will be responsible for creating and maintaining conceptual, logical, and physical data models, ensuring alignment with industry best practices and standards. Working closely with business and functional teams, the Data Modeler will play a pivotal role in standardizing data models at portfolio and domain levels, driving efficiencies and maximizing the value of the client's data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team. The ideal domain experience would be Insurance or Investment Banking.

Roles and Responsibilities:
- Develop comprehensive conceptual, logical, and physical data models for multiple domains within the organization, leveraging industry best practices and standards.
- Collaborate with business and functional teams to understand their data requirements and translate them into effective data models that support their strategic objectives.
- Serve as a subject matter expert in data modeling tools such as erwin Data Modeler, providing guidance and support to other team members and stakeholders.
- Establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives.
- Identify opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud, banking, and AML.
- Provide consulting services to internal groups on data modeling tool usage, administration, and issue resolution, promoting seamless data flow and application connections.
- Develop and deliver training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively.
- Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10 years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment.
- Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models.
- Hands-on experience with data modeling tools such as erwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza.
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
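
The conceptual-to-physical progression this role centers on can be illustrated with a toy translation from a logical column specification to physical DDL. Everything below is an assumption for illustration (tools like erwin generate far richer DDL, and the insurance-flavored entity is invented):

```python
def to_ddl(table, columns):
    """Render a toy physical model (CREATE TABLE) from a logical spec.

    `columns` maps column name -> (type, nullable). This only shows
    the idea of deriving a physical model from a logical one.
    """
    lines = []
    for name, (col_type, nullable) in columns.items():
        null_sql = "" if nullable else " NOT NULL"
        lines.append(f"    {name} {col_type}{null_sql}")
    body = ",\n".join(lines)
    return f"CREATE TABLE {table} (\n{body}\n);"

# Hypothetical insurance-domain entity, not taken from the posting.
ddl = to_ddl("policy", {
    "policy_id": ("INTEGER", False),
    "holder_name": ("VARCHAR(100)", False),
    "lapse_date": ("DATE", True),
})
```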

Posted 1 month ago

Apply

6.0 - 8.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Role Description: As a Data Engineering Lead, you will play a crucial role in overseeing the design, development, and maintenance of our organization's data architecture and infrastructure. You will be responsible for designing and developing the architecture for the data platform, ensuring the efficient and effective processing of large volumes of data and enabling the business to make informed decisions based on reliable, high-quality data. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a proven track record of successfully managing complex data projects.

Responsibilities:
- Data Architecture and Design: Design and implement scalable, efficient data architectures to support the organization's data processing needs. Work closely with cross-functional teams to understand data requirements and ensure that data solutions align with business objectives.
- ETL Development: Oversee the development of robust ETL processes to extract, transform, and load data from various sources into the data warehouse. Ensure data quality and integrity throughout the ETL process, implementing best practices for data cleansing and validation.
- Big Data Technology: Stay abreast of emerging trends and technologies in big data and analytics, and assess their applicability to the organization's data strategy. Implement and optimize big data technologies to process and analyze large datasets efficiently.
- Cloud Integration: Collaborate with the IT infrastructure team to integrate data engineering solutions with cloud platforms, ensuring scalability, security, and performance.
- Performance Monitoring and Optimization: Implement monitoring tools and processes to track the performance of data pipelines, proactively address any issues, and optimize data processing.
- Documentation: Maintain comprehensive documentation for data engineering processes, data models, and system architecture. Ensure that team members follow documentation standards and best practices.
- Collaboration and Communication: Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions that meet those requirements. Communicate effectively with technical and non-technical stakeholders, providing updates on project status, challenges, and opportunities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 6-8 years of professional experience in data engineering.
- In-depth knowledge of data modeling, ETL processes, and data warehousing.
- In-depth knowledge of building a data warehouse using Snowflake.
- Experience in data ingestion, data lakes, data mesh, and data governance.
- Experience in Python programming.
- Strong understanding of big data technologies and frameworks, such as Hadoop, Spark, and Kafka.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with database systems (SQL and NoSQL) and data pipeline orchestration tools.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Proven ability to work collaboratively in a fast-paced, dynamic environment.
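
The "data cleansing and validation" step in the ETL responsibilities above can be sketched as a simple split of incoming rows into clean and rejected sets. A minimal Python illustration (the field names and rejection reasons are assumptions, not from the posting):

```python
def validate_rows(rows, required, key):
    """Split rows into clean vs rejected, mirroring a basic ETL
    cleansing-and-validation stage: rows missing required fields or
    repeating a business key are rejected with a reason."""
    clean, rejected, seen = [], [], set()
    for row in rows:
        if any(row.get(f) in (None, "") for f in required):
            rejected.append((row, "missing required field"))
        elif row[key] in seen:
            rejected.append((row, "duplicate key"))
        else:
            seen.add(row[key])
            clean.append(row)
    return clean, rejected

# Illustrative customer rows.
rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},          # missing email -> rejected
    {"id": 1, "email": "b@x.com"},   # duplicate id -> rejected
]
clean, rejected = validate_rows(rows, required=["id", "email"], key="id")
```

In production this logic would typically run inside the ETL tool or a Spark job, with rejected rows routed to a quarantine table for review.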

Posted 1 month ago

Apply

10.0 - 12.0 years

37 - 40 Lacs

Chennai

Remote

10+ years of experience (5+ years in a lead role). Proven experience in legacy application modernization. Strong expertise in enterprise architecture (data, applications, cloud) and the AWS ecosystem (IAM, VPC, CloudWatch, RDS, Secrets Manager). Knowledge of Camunda for BPM/workflow and AWS API Gateway service architecture. Required candidate profile: Working hours 1:30-10:00 p.m. IST. Lead end-to-end engagement delivery; define modernization roadmaps and architectural decisions; oversee solution design and risk management; CI/CD using GitHub, with SonarQube for code analysis.

Posted 1 month ago

Apply

10.0 - 14.0 years

25 - 30 Lacs

Pune

Work from Office

We are seeking a highly experienced Principal Solution Architect to lead the design, development, and implementation of sophisticated cloud-based data solutions for our key clients. The ideal candidate will possess deep technical expertise across multiple cloud platforms (AWS, Azure, GCP), data architecture paradigms, and modern data technologies. You will be instrumental in shaping data strategies, driving innovation through areas like GenAI and LLMs, and ensuring the successful delivery of complex data projects across various industries.

Key Responsibilities:
- Solution Design & Architecture: Lead the architecture and design of robust, scalable, and secure enterprise-grade data solutions, including data lakes, data warehouses, data mesh, and real-time data pipelines on AWS, Azure, and GCP.
- Client Engagement & Pre-Sales: Collaborate closely with clients to understand their business challenges, translate requirements into technical solutions, and present compelling data strategies. Support pre-sales activities, including proposal development and solution demonstrations.
- Data Strategy & Modernization: Drive data and analytics modernization initiatives, leveraging cloud-native services, big data technologies, GenAI, and LLMs to deliver transformative business value.
- Industry Expertise: Apply data architecture best practices across various industries (e.g., BFSI, Retail, Supply Chain, Manufacturing).

Required Qualifications & Skills:
- Experience: 10+ years of experience in IT, with a significant focus on data architecture, solution architecture, and data engineering. Proven experience in a principal-level or lead architect role.
- Cloud Expertise: Deep, hands-on experience with major cloud platforms. Azure: Microsoft Fabric, Data Lake, Power BI, Data Factory, Azure Purview, plus a good understanding of Azure Service Foundry, agentic AI, and Copilot. GCP: BigQuery, Vertex AI, Gemini.
- Data Science Leadership: Understanding and experience in integrating AI/ML capabilities, including GenAI and LLMs, into data solutions.
- Leadership & Communication: Exceptional communication, presentation, and interpersonal skills. Proven ability to lead technical teams and manage client relationships.
- Problem-Solving: Strong analytical and problem-solving abilities with a strategic mindset.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.

Preferred Qualifications:
- Relevant certifications in AWS, Azure, GCP, Snowflake, or Databricks.
- Experience with agentic AI and hyper-intelligent automation.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Vadodara, Gujarat

On-site

The purpose of your role is to define and develop Enterprise Data Structure, Data Warehouse, Master Data, Integration, and transaction processing while maintaining and strengthening modeling standards and business information. You will define and develop Data Architecture that supports the organization and clients in new/existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with Data Modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information. Additionally, you will be responsible for building enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, data hub & lake, evaluating implemented systems, collecting and integrating data, creating data models, implementing best security practices, and demonstrating strong experience in database architectures and design patterns. You will also enable Delivery Teams by providing optimal delivery solutions and frameworks. This includes building relationships with delivery and practice leadership teams, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency. In addition, you will ensure optimal Client Engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor. 
Join Wipro to reinvent your world and be part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your own reinvention. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

Embark on your transformative journey as a Solution Design Business Analyst - Vice President. You will be responsible for driving key strategic change initiatives for regulatory deliverables across Risk, Finance, and Treasury. To excel in this role, you should have at least 10 years of experience in business/data analysis, enabling you to present complex data issues in a simple and engaging manner. Your expertise should extend to front to back system designing, complex business problem solutioning, data gathering, data cleansing, and data validation. You will be expected to analyze large volumes of data, identify patterns, address data quality issues, conduct metrics analysis, and translate your analysis into valuable insights. Additionally, you will play a crucial role in capturing business requirements and translating them into technical data requirements. Collaboration with stakeholders to ensure proposed solutions meet their needs and expectations is a key aspect of this role. You will also be involved in creating operational and process designs to ensure the successful delivery of proposed solutions within the agreed scope, as well as supporting change management activities. Experience within the financial services industry, particularly in the banking sector within a Risk/Finance/Treasury role, will be highly valued. Proficiency in data analysis tools such as SQL, Hypercube, Python, and data visualization/reporting tools like Tableau, Qlikview, Power BI, and Advanced Excel will be beneficial. Familiarity with data modeling and data architecture is also desirable. The primary purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and proposing solutions to address business problems and opportunities. 
Key accountabilities include identifying and analyzing business problems and client requirements necessitating change within the organization, developing business requirements to address these challenges, collaborating with stakeholders to ensure proposed solutions align with their needs, creating business cases justifying investment in solutions, conducting feasibility studies to assess the viability of proposed solutions, reporting on project progress to ensure timely and budget-compliant delivery, and supporting change management activities. As a Vice President, you are expected to contribute to strategic planning, resource allocation, policy management, continuous improvement initiatives, and policy enforcement. Your leadership responsibilities may involve demonstrating a set of leadership behaviors focused on creating an environment for colleagues to excel. For individual contributors, being a subject matter expert within your discipline, guiding technical direction, leading collaborative assignments, and coaching team members are essential. You will also provide guidance on functional and cross-functional areas of impact and alignment, risk management, and organizational strategies. Demonstrating a comprehensive understanding of the organization's functions, collaborating with various work areas, creating solutions based on analytical thought, building trusting relationships with stakeholders, and upholding the Barclays Values and Mindset are crucial aspects of this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking a highly motivated and experienced IT Enterprise Architect with a strong focus on end-to-end (E2E) customer service processes. As an IT Enterprise Architect, you will be instrumental in shaping and aligning our IT landscape, encompassing platforms like SAP, ServiceNow, and other customer service-related systems. Your expertise will play a crucial role in driving the digital transformation of our global service processes to ensure scalability, resilience, and exceptional customer experience. Your responsibilities will include enterprise architecture management, deriving IT strategies from business requirements, designing and maintaining end-to-end Enterprise Architecture for customer service processes, leading cross-functional workshops and architecture communities, developing architecture framework and roadmap, guiding platform selection and integration, modeling IT architectures and processes, contributing to solution evaluations, coordinating communication with key decision-makers, and driving documentation and presentations for executive alignment. To be successful in this role, you should possess a degree in computer science or industrial engineering, along with experience as an Enterprise Architect or Solution-/Domain Architect in Customer facing IT landscapes. Familiarity with enterprise architecture methods and frameworks, governance structures, IT Service Management Frameworks, functional or IT implementation experience in customer service processes, and expertise in customer service solutions implementation are essential. Additionally, you should have extensive experience with data architecture, integration concepts, and cloud technologies. In addition to your technical skills, you should have excellent English language proficiency, good command of German, strong communication and presentation skills, organizational talent, and the ability to work effectively in a global environment. 
Being results-oriented, quality-focused, and flexible, and possessing good analytical and conceptual skills, are also key attributes for this role.

Posted 1 month ago

Apply

15.0 - 19.0 years

0 Lacs

Karnataka

On-site

The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales activities and leading the design and implementation of enterprise-grade Data Management solutions. As the SF Data Cloud Architect, you are responsible for architecting scalable solutions across enterprise landscapes using Data Cloud. Your primary focus is to ensure that data is prepared for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This role encompasses the ANZ, ASEAN, and India markets. To excel in this role, you should possess deep expertise in data architecture, project lifecycle management, and comprehensive knowledge of the Salesforce ecosystem. Your strong soft skills, stakeholder engagement abilities, and technical writing proficiency will be crucial. You will collaborate with cross-functional teams to shape the future of the customer's data ecosystem and facilitate data excellence at scale. Your key responsibilities will include being a trusted advisor for Salesforce Data Cloud, providing architectural support to Salesforce Account teams and Customers, leading cross-cloud project delivery, designing enterprise data architecture aligned with business goals, enabling Data Cloud architecture for key domains, collaborating with analytics and AI teams, engaging stakeholders effectively, and creating and maintaining high-quality architecture blueprints and design documents. In terms of technical skills, you should have over 15 years of experience in data architecture or consulting, with expertise in MDM, Data Distribution, Data Modelling, and metadata. You should also have experience in executing data strategies, landscape architecture assessments, and proof-of-concepts. Excellent communication, stakeholder management, presentation, technical writing, and documentation skills are essential. 
A basic understanding of Hadoop and Spark fundamentals, and familiarity with data platforms such as Snowflake, Databricks, AWS, GCP, Microsoft Azure, and Salesforce Data Cloud, are advantageous. Moreover, a working knowledge of enterprise data warehouse, data lake, and data hub concepts, and of Salesforce products across different functional domains, is beneficial. Ideally, you should hold certifications such as Salesforce Certified Data Cloud Consultant, Salesforce Data Architect, and Salesforce Application Architect. Additional certifications in AWS, Spark/DL, Databricks, Google Cloud, Snowflake, or similar platforms are preferred.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an experienced IICS Developer, you will be responsible for supporting a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be on developing and optimizing ETL/ELT workflows, collaborating with architects and DBAs on schema conversion, and ensuring data quality, consistency, and validation throughout the migration process. To excel in this role, you must possess strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and a working knowledge of Snowflake, specifically data staging, architecture, and data loading. Your responsibilities will also include building mappings, tasks, and parameter files in IICS, as well as tuning data pipeline performance to enhance efficiency. In addition, you will be expected to implement error handling, performance monitoring, and scheduling to support the migration process effectively. Your role will extend to providing assistance during the go-live phase and post-migration stabilization to ensure a seamless transition. This position offers the flexibility of engagement as either a contract or full-time role, based on availability and fit. If you are looking to apply your expertise in IICS development to a challenging data migration project, this opportunity aligns with your skill set and availability. The shift timings for this role are from 7:30 PM IST to 1:30 AM EST, allowing you to collaborate effectively with U.S. team members.
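
The data-validation duty in this migration is commonly handled by comparing row counts and checksums between source and target tables. A minimal, order-insensitive fingerprint sketch in Python (in practice the rows would come from Oracle and Snowflake cursors; the literals below are only for demonstration):

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a table: row count plus a hash
    built from every row, so source and target can be compared after
    migration regardless of fetch order."""
    digests = sorted(
        hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        for row in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

# Same data in a different order should fingerprint identically.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 2, "amt": 20}, {"id": 1, "amt": 10}]
match = table_fingerprint(source) == table_fingerprint(target)
```

For large tables this comparison is usually pushed down to the databases (e.g. aggregate hashes in SQL) rather than pulling all rows to the client.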

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Haryana

On-site

As SM - MIS Reporting at Axis Max Life Insurance in the BPMA department, you will play a crucial role in leading the reporting function for all distribution functions. Your responsibilities will include defining the vision and roadmap for the business intelligence team, championing a data culture within Max Life, and driving the transformation towards automation and real-time insights. You will lead a team of 10+ professionals, including partners, and coach and mentor them to continuously enhance their skills and capabilities. Your key responsibilities will involve handling distribution reporting requirements across functions and job families to support strategic priorities and performance management. You will ensure the timely and accurate delivery of reports and dashboards, identify opportunities to automate reporting processes, and collaborate with the data team to design and build data products for the distribution teams. Additionally, you will work towards driving a data democratization culture and developing the data infrastructure necessary for efficient analysis and reporting. To qualify for this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience working with business reporting teams. Experience in the financial services sector, proficiency in Python and Power BI, and familiarity with BI tech-stack tools like SQL Server Reporting Services and SAP BO are preferred. You should also have a strong understanding of data architecture, data warehousing, and data lakes, as well as excellent interpersonal, verbal, and written communication skills. Join us at Axis Max Life Insurance to be part of a dynamic team that is focused on leveraging data-driven insights to enhance business performance and drive strategic decision-making.

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

We are seeking a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate has a robust background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions, and defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or dbt will be a key aspect of the role, as will collaborating with stakeholders to understand data requirements and translating them into robust architectural solutions. You will also implement data security, privacy, and role-based access controls within Snowflake; guide development teams on performance tuning, query optimization, and cost management; ensure high availability, fault tolerance, and compliance across data platforms; and mentor developers and junior architects on Snowflake capabilities.

Skills & Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, including a minimum of 3+ years of hands-on Snowflake experience
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP)
- Hands-on experience with ETL/ELT tools such as ADF, Informatica, Talend, dbt, or Matillion
- A good understanding of data lakes, data mesh, and modern data stack principles
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus
- Solid knowledge of data governance, metadata management, and cataloging

Preferred qualifications include a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects. If you meet these requirements and are ready for a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
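The cost-management responsibility mentioned in this listing often comes down to warehouse-sizing arithmetic. The sketch below follows Snowflake's published billing model (credits per hour roughly double with each warehouse size step; usage is billed per second with a 60-second minimum each time a warehouse resumes); the workload figures are invented for illustration.

```python
# Illustrative estimate of Snowflake warehouse credit consumption.
# Credits per hour double with each warehouse size step.

CREDITS_PER_HOUR = {
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8,
    "XLARGE": 16, "2XLARGE": 32, "3XLARGE": 64, "4XLARGE": 128,
}

def estimate_credits(size, seconds_running):
    """Estimate credits consumed by a single warehouse run."""
    billable = max(seconds_running, 60)  # 60-second minimum per resume
    return CREDITS_PER_HOUR[size.upper()] * billable / 3600

if __name__ == "__main__":
    # A Medium warehouse running a 15-minute batch job:
    print(estimate_credits("medium", 15 * 60))  # 1.0
```

Arithmetic like this is why auto-suspend settings and right-sizing per workload are standard levers in Snowflake cost tuning.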

Posted 1 month ago

Apply

12.0 - 16.0 years

0 Lacs

maharashtra

On-site

NTT DATA is looking for a Data & AI Technical Solution Architect to join their team in Pune, Maharashtra, India. As a Data & AI Architect, you will deliver multi-technology consulting services to clients, providing strategies and solutions for infrastructure and related technology components. Your role will involve collaborating with stakeholders to develop architectural approaches for solutions and working on strategic projects to ensure optimal functioning of clients' technology infrastructure.

Key Responsibilities:
- Engage in conversations with CEOs, business owners, and CTOs/CDOs
- Analyze complex business challenges and propose effective solutions focused on client needs
- Develop high-level, innovative solution approaches for complex business problems
- Utilize best practices and creativity to address challenges
- Conduct market research, formulate perspectives, and communicate insights to clients
- Build strong client relationships and ensure client satisfaction
- Contribute to internal effectiveness by enhancing methodologies, processes, and tools

Minimum Skills Required:
- Academic qualifications: BE/BTech or equivalent in Information Technology and/or Business Management
- Scaled Agile certification is desirable
- Relevant consulting and technical certifications, such as TOGAF
- 12-15 years of experience in a similar role within a large-scale technology services environment
- Proficiency in Data, AI, Gen AI, and Agentic AI
- Experience in Data Architecture and Solutioning, end-to-end Data Architecture, and GenAI solution design
- Ability to work on Data & AI RFP responses as a Solution Architect
- Experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a Technical Architect
- Proficiency in Snowflake, Databricks, Azure, AWS, GCP cloud, and Data Engineering & AI tools
- Experience in large-scale consulting and program execution engagements in AI and data
- Expertise in multi-technology infrastructure design and client engagement

Additional Career Level Description:
- Seasoned professional with complete knowledge and understanding of the specialization area
- Solves diverse problems using judgment and interpretation
- Enhances relationships with senior partners and suggests variations in approach

About NTT DATA: NTT DATA is a global innovator of business and technology services with a commitment to helping clients innovate, optimize, and transform for long-term success. With experts in over 50 countries, NTT DATA offers business and technology consulting, data and artificial intelligence solutions, industry solutions, as well as application, infrastructure, and connectivity management. They are a leading provider of digital and AI infrastructure and are part of the NTT Group, investing significantly in R&D to support organizations in their digital transformation journey. Visit us at us.nttdata.com

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a Technical Architect at Fiserv, you will bring in-depth, hands-on technical development experience in full-stack Java, cloud, web, and mobile technologies. Your familiarity with Domain-Driven Design, event-based architecture, APIs, and microservices architecture will be highly valued, as will your passion for new technologies and proficiency across the Cloud, UI, Application, Security, and Data architecture domains.

In this position, you will be expected to demonstrate expertise across a wide range of application and engineering patterns. A strong track record of collaborating closely with internal and external stakeholders is vital, as you will discuss and articulate detailed designs and code with them. Your responsibilities will include leading and overseeing application development throughout the project lifecycle, from gathering functional designs to creating technical designs and specifications, managing development inventory, and ensuring high-quality standards from testing to deployment.

The ideal candidate will have:
- A Bachelor's Degree in engineering and technology or equivalent work experience.
- A minimum of 10 years of IT experience, focusing on designing and deploying enterprise-level business or technical applications.
- Hands-on experience with various distributed and cloud technologies, including cloud workloads, containerization, Linux, Java/JavaScript, HTML5, CSS3, MVC, AngularJS, React, mobile and application middleware, ESB, DataPower, XML/JSON, SOA and API management, and distributed relational and NoSQL databases such as Postgres/Yugabyte, Oracle, MySQL, and DB2, as well as PhoneGap and iOS/Android SDKs.
- Proficiency in microservices, mobile and web app security concepts, session management, performance tuning, automated testing techniques, high-availability engineering, and database technologies for mobile and web apps.
- Knowledge of cryptography, key management, and security solutions on both mobile and server sides, including security protocols and cryptography such as PKI, SSL, RSA, authentication, encryption, and digital signatures.
- Understanding of emerging technologies such as rules engines, AI, and machine learning, and the ability to relate technology to business needs.
- Strong knowledge of application development technologies, tools, methodologies, and all functional areas within an IT organization, with excellent analytical ability, communication skills, and interpersonal skills to build relationships with team members and customers.
- Experience in agile/scrum and waterfall life-cycle application development, along with mentoring junior staff.

If you are a proactive and experienced Technical Architect with a solid background in a diverse range of technologies and a passion for innovation, we encourage you to apply for this exciting opportunity at Fiserv.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Database Administrator at NTT DATA, you will be a seasoned subject matter expert responsible for ensuring the availability, integrity, and performance of critical data assets. You will work closely with cross-functional teams to support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Collaboration with Change Control, Release Management, Asset and Configuration Management, and Capacity and Availability Management will be essential to meet the needs of users and ensure database security and integrity.

Key responsibilities include performing installation, configuration, and maintenance of database management systems; collaborating with software developers and architects to optimize database-related applications; designing backup and disaster recovery strategies; monitoring database performance; and providing technical support to end users. You will also participate in database software upgrades and data validation activities, and work collaboratively with cross-functional teams to support database-related initiatives.

To excel in this role, you should have seasoned proficiency in database administration tasks and a strong understanding of SQL, database security principles, and backup strategies. Effective communication, problem-solving, and analytical skills are crucial, along with the ability to manage multiple projects concurrently while maintaining attention to detail. Academic qualifications in computer science or related fields, along with relevant certifications such as MCSE DBA or Oracle Certified Professional, are preferred.

NTT DATA is a trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse workforce and a focus on R&D, NTT DATA is dedicated to moving organizations confidently into the digital future. As an Equal Opportunity Employer, NTT DATA offers a dynamic workplace where employees can thrive, grow, and make a difference.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Lead Cloud Engineer at our organization, you will design and build cloud-based distributed systems that address complex business challenges for some of the world's largest companies. Drawing on your expertise in software engineering, cloud engineering, and DevOps, you will craft technology stacks and platform components that empower cross-functional AI engineering teams to develop robust, observable, and scalable solutions. Working within a diverse, globally distributed engineering team, you will engage in the full engineering lifecycle, from designing and developing solutions to optimizing and deploying infrastructure at the scale of leading global enterprises.

Your core responsibilities will include designing cloud solutions and distributed systems architecture for full-stack AI software and data solutions, and implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions encompassing CI/CD, data integrations, APIs, web and mobile apps, and AI solutions. Collaborating with product managers, data scientists, and fellow engineers, you will define and implement analytics and AI features that align with business requirements and user needs.

Furthermore, you will leverage Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability. You will develop and maintain APIs and microservices to expose analytics functionality to internal and external consumers, adhering to best practices for API design and documentation, and you will implement robust security measures to safeguard sensitive data and ensure compliance with data privacy regulations.

In addition, you will continuously monitor and troubleshoot application performance, identifying and resolving issues that impact system reliability, latency, and user experience. You will take part in code reviews and help establish and enforce coding standards and best practices to ensure delivery of high-quality, maintainable code. Keeping abreast of emerging trends in cloud computing, data analytics, and software engineering will enable you to identify opportunities for enhancing the capabilities of the analytics platform. Collaborating closely with business consulting staff and leaders within multidisciplinary teams, you will assess opportunities and develop analytics solutions for our clients across various sectors, and you will influence, educate, and directly support the analytics application engineering capabilities of our clients.

To excel in this role, you should hold a Master's degree in Computer Science, Engineering, or a related technical field, with at least 6+ years of experience, including a minimum of 3+ years at the Staff level or equivalent. Proven experience as a cloud engineer and software engineer in product engineering or professional services organizations is essential; experience designing and delivering cloud-based distributed solutions, along with certifications in GCP, AWS, or Azure, would be advantageous. Deep familiarity with the software development lifecycle, configuration management tools, monitoring and analytics platforms, CI/CD deployment pipelines, backend APIs, Kubernetes, and containerization technologies is highly desirable. The ability to work closely with internal and client teams, along with strong interpersonal and communication skills, is critical for collaboration and effective technical discussions.

Curiosity, proactivity, critical thinking, and a strong foundation in computer science fundamentals are qualities that we value. If you have a passion for innovation, a commitment to excellence, and a drive to make a meaningful impact in the field of cloud engineering, we invite you to join our dynamic team at Bain & Company.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements.

You will lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as managing resource allocation and budgeting while ensuring adherence to timelines and quality standards. Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals, and you will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations.

Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka, while ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239. Risk and compliance management is also central to the role: you will ensure regulatory reporting data flows comply with local and international financial standards and manage controls and audit requirements in collaboration with the Compliance and Risk teams.

Required skills and experience include 7+ years of project management experience within the banking or financial services sector, proven experience leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, ksqlDB, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected.

Preferred qualifications include PMP/PRINCE2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.
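The "uptime and SLAs of data pipelines" KPI this listing names reduces to simple availability arithmetic over a reporting period. A minimal sketch follows; the 99.5% target and the outage durations are assumed examples, not figures from the posting.

```python
# Compute pipeline availability over a reporting period from outage windows.

def availability_pct(period_minutes, outage_minutes):
    """Availability as a percentage of the reporting period."""
    downtime = sum(outage_minutes)
    return 100.0 * (period_minutes - downtime) / period_minutes

if __name__ == "__main__":
    month = 30 * 24 * 60                     # a 30-day month, in minutes
    pct = availability_pct(month, [40, 32])  # two outages: 40 min and 32 min
    print(round(pct, 3), pct >= 99.5)        # 99.833 True
```

In practice a PM would track this per pipeline against the SLA target agreed with business stakeholders, since a single aggregate number can hide a chronically failing feed.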

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

As a Software Solution Architect at Fiserv, you will lead and build new cloud-native applications within the Global Issuer organization's Architecture team. Your role will involve designing card issuance solutions covering credit cards, loans, and unsecured lending. To excel in this position, you must possess a deep understanding of, and hands-on experience with, Domain-Driven Design, event-based architecture, APIs, and microservices architecture.

The ideal candidate will have a passion for exploring new technologies and demonstrated exposure to various domains including Cloud, UI, Application, Security, and Data architecture. You will collaborate in a cross-functional environment to define end-to-end solutions and work with development teams to refine them.

To qualify for this role, you should hold a Bachelor's Degree in engineering and technology or have equivalent work experience, along with a minimum of 10 years of IT experience designing and deploying enterprise-level business or technical applications. Experience creating context designs, UML, and entity-relationship diagrams will be beneficial. You should have exposure to a diverse range of distributed and cloud technologies such as cloud, containerization, microservices, Kafka, Java/JavaScript, Quarkus/Spring, XML/JSON, Apigee/MuleSoft, ESB/DataPower, distributed databases like Postgres/Yugabyte, HTML5, CSS3, MVC, AngularJS, React, mobile and application middleware, PhoneGap/iOS/Android SDKs, CI/CD, etc.

The role demands a strong understanding of emerging technologies and their business implications. You should be well versed in application development technologies, tools, and methodologies, and possess excellent analytical, communication, and interpersonal skills to collaborate effectively with team members and customers. Experience in both agile/scrum and waterfall life-cycle application development is required, along with a willingness to mentor junior staff.

Join Fiserv as a Software Solution Architect to leverage your expertise and contribute to the development of innovative solutions in a dynamic and collaborative environment.

Posted 1 month ago

Apply

10.0 - 15.0 years

0 Lacs

pune, maharashtra

On-site

The Data Lead for the AMEA (Asia, Middle East, and Africa) and India region holds a pivotal leadership position responsible for overseeing data management, governance, analytics, and strategy initiatives across the region. Reporting directly to the CIO of AMEA & India, you will collaborate closely with the Global Business Units (GBUs) and support functions to ensure the effective and ethical utilization of data in driving business growth, operational efficiency, and informed decision-making. This role demands a forward-thinking leader with profound expertise in data science, architecture, and governance, complemented by strong leadership and communication abilities.

Your primary responsibilities will revolve around the following key areas:

**Data Strategy and Governance**
- Develop and execute a comprehensive data strategy aligned with both the Group's data strategy and the growth plans of the AMEA & India region.
- Implement the Group Data Policy throughout the AMEA & India region.
- Establish data governance policies to uphold data quality, privacy, and security across all data assets.
- Collaborate with regional and global stakeholders to standardize data practices and standards across the AMEA organization.
- Oversee the development and maintenance of data architecture and infrastructure to ensure scalability and robustness.
- Monitor regulatory compliance concerning data privacy and security, ensuring adherence to applicable laws and regulations.

**Data Management**
- Lead the design, implementation, and management of data management systems and processes encompassing data warehousing, data lakes, and data integration platforms.
- Ensure the accurate and timely collection, storage, and retrieval of data from diverse sources across the AMEA region.
- Implement best practices for data lifecycle management, including retention, archiving, and disposal.
- Manage the regional data team, comprising data analysts, data scientists, and data engineers, to ensure alignment with the organization's data strategy and objectives.
- Ensure that data within the region is collected, stored, and analyzed in compliance with data privacy laws and regulations.
- Identify and prioritize data-related opportunities and risks within the region, collaborating with executives and business leaders to devise data-driven solutions.
- Promote a data culture within the region by educating and training employees on effective data use and fostering interdepartmental collaboration.
- Ensure the digital and data integration of newly acquired companies and the data disintegration of sold entities.

**Data Analytics and Insights**
- Drive the development and deployment of advanced analytics and business intelligence solutions to facilitate data-driven decision-making.
- Lead a team of data scientists, analysts, and engineers to derive actionable insights from data, enabling informed decision-making by business leaders.
- Promote a culture of data literacy and data-driven innovation across the organization.

**Leadership and Collaboration**
- Provide visionary leadership to the data team by setting clear goals, expectations, and performance metrics.
- Collaborate with senior executives and business leaders within the GBUs and support functions to identify data-driven opportunities and challenges.
- Work with the entities' Data Leads to ensure consistency in data policies, standards, and procedures across the organization.
- Stay abreast of the latest trends and technologies in the data field, identifying opportunities to leverage emerging technologies for improved data-driven decision-making in the region.
- Cultivate and maintain strong relationships with external partners, vendors, and industry experts to remain informed about emerging trends and technologies.

**Qualifications**
- Master's degree in Data Science, Computer Science, Information Technology, or a related field.
- Minimum of 10 years of experience in data management, analytics, or a related field, with at least 5 years in a senior leadership role.
- Proven track record in developing and executing data strategies that drive business value.
- Profound knowledge of data governance, architecture, security, and regulatory compliance.
- Strong expertise in data analytics, machine learning, and AI.
- Excellent leadership, communication, and interpersonal skills.
- Ability to thrive in a diverse and multicultural environment.

**Skills and Competencies**
- Strategic vision and strategic thinking
- Technical expertise
- Leadership
- Communication and collaboration
- Problem-solving and analytical skills
- Change management
- Business acumen

This role reports to the CIO of AMEA & India and is based in Pune, India, under the GBU Renewables division of ENGIE Energy India Private Limited. The ideal candidate should possess a wealth of experience, with a seniority level exceeding 15 years, and hold a Master's degree.

Posted 1 month ago

Apply

6.0 - 10.0 years

35 - 37 Lacs

Bengaluru

Remote

Role & responsibilities:
- Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions.
- Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes.
- Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services.
- Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance.
- Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 - 3 Lacs

Pune

Work from Office

Greetings for the Day!
- At least 8 years' experience in a similar role in data management, analytics, architecture, or engineering
- Experience with solution design and data modelling
- Experience working with metadata and an appreciation for metadata frameworks and ontologies
- A technical understanding of data transport mechanisms
- An understanding of data mesh and data product concepts
- Technical or physical data lineage experience is preferable
- Evidenced experience in documenting requirements and designing solutions that meet objectives in an efficient and robust way
- Experience within a project or risk management change environment
- Recognition as a strong communicator, with excellent written and oral ability
- A track record as an Agile and change management practitioner
- The enthusiasm to identify, learn, and coach others in new data, modelling, and risk processes

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Senior Data Engineer

Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk. You will primarily develop Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day:
- Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes, and design data models
- Serve as the data model subject matter expert and spokesperson, demonstrated by the ability to address questions quickly and accurately
- Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL and dbt
- Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran, and dbt
- Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing
- Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
- Build and promote best engineering practices in version control, CI/CD, code review, and pair programming
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery
- Work with data and analytics experts to strive for greater functionality in our data systems

What you bring to the role:

Basic Qualifications:
- 5+ years of data engineering experience building, working with, and maintaining data pipelines and ETL processes in big data environments
- 5+ years of experience in data modeling and data architecture in a production environment
- 5+ years writing complex SQL queries
- 5+ years of experience with cloud columnar databases (we use Snowflake)
- 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions
- Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions
- Strong documentation skills for pipeline design and data flow diagrams
- Intermediate experience with a programming language such as Python, Go, Java, or Scala (we primarily use Python)
- Integration with third-party SaaS application APIs such as Salesforce, Zuora, etc.
- Ability to ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Preferred Qualifications:
- Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning
- Good knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.)
- Demonstrated experience in one or more business domains (finance, sales, marketing)
- 3+ completed production-grade projects with dbt
- Expert knowledge of Python

What our data stack looks like:
- ELT: Snowflake, Fivetran, dbt, Airflow, Kafka, Hightouch
- BI: Tableau, Looker
- Infrastructure: GCP, AWS, Kubernetes, Terraform, GitHub Actions

Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law.
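The data quality testing this listing emphasizes is usually expressed declaratively in dbt (e.g., `not_null` and `unique` tests in a model's YAML). The logic those tests execute is roughly the sketch below, shown over plain Python rows for illustration; the column name and row data are invented, and real dbt tests run as SQL against warehouse tables.

```python
# Minimal stand-ins for dbt's not_null and unique schema tests, applied to
# rows represented as dicts instead of warehouse tables.

def not_null(rows, column):
    """Rows that fail a not_null test on `column`."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Non-null values that fail a unique test on `column`."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v is None:
            continue  # null handling belongs to the not_null test
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

if __name__ == "__main__":
    rows = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
    print(not_null(rows, "id"))  # [{'id': None}]
    print(unique(rows, "id"))    # [2]
```

Keeping checks declarative, as dbt does, is what lets them run automatically on every pipeline execution rather than as ad hoc audits.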

Posted 1 month ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

We are looking for a Sr. Data Engineer to be part of our FP&A digital transformation, reporting, and analysis team. This role reports to the Director of FP&A Digitization, Reporting, and Analysis. This opportunity is ideal for someone with a strong background in developing data architecture: flow, ETL, and conceptual, logical, and physical data models for FP&A's data mart.

In this role, you can expect to oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices, and to develop best practices for the data structure to ensure consistency within the system. The candidate must be able to work independently and collaboratively.

Required education: Bachelor's Degree.

Posted 1 month ago

Apply