
1814 Data Architecture Jobs - Page 22

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 14.0 years

6 - 10 Lacs

Kochi

Work from Office

As a senior member of the Front-end Experience's Enablement team, you will provide support and enablement for delivering complex, cross-product UI features across a portfolio of products. You'll work closely with engineers and designers across the company, leveraging your experience and expertise in building accessible, delightful, and maintainable UIs.

What you'll do (responsibilities):
- Deliver on feature requests that unblock customers and facilitate deals, enhancing the product's user experience.
- Collaborate closely with Design, Product, and other cross-functional teams to innovate and deliver high-quality, customer-centric solutions.
- Maintain high standards of software quality within the team by establishing good practices and habits.
- Develop and implement well-tested solutions to ensure reliability and performance.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 10-14 years of software engineering experience with a proven track record in technical or engineering lead roles.
- Experience with diverse technology stacks and project types is preferred.
- Proficiency in JavaScript is necessary.
- Experience with the Ember framework is preferred, or a strong interest and the ability to get up to speed with it.
- Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
- Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
- Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
- Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
- Proven ability to lead by example, mentor junior engineers, and contribute to a positive team culture.
- Commitment to developing well-tested solutions to ensure high reliability and performance.

Posted 1 month ago

Apply

5.0 - 9.0 years

4 - 8 Lacs

Kochi

Work from Office

As a senior member of the Front-end Experience's Enablement team, you will provide support and enablement for delivering complex, cross-product UI features across a portfolio of products. You'll work closely with engineers and designers across the company, leveraging your experience and expertise in building accessible, delightful, and maintainable UIs.

What you'll do (responsibilities):
- Deliver on feature requests that unblock customers and facilitate deals, enhancing the product's user experience.
- Collaborate closely with Design, Product, and other cross-functional teams to innovate and deliver high-quality, customer-centric solutions.
- Maintain high standards of software quality within the team by establishing good practices and habits.
- Develop and implement well-tested solutions to ensure reliability and performance.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5-9 years of software engineering experience with a proven track record in technical or engineering lead roles.
- Experience with diverse technology stacks and project types is preferred.
- Proficiency in JavaScript is necessary.
- Experience with the Ember framework is preferred, or a strong interest and the ability to get up to speed with it.
- Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform).
- Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions.
- Demonstrated ability to tackle complex technical challenges and deliver innovative solutions.
- Excellent communication and collaboration skills, with a focus on customer satisfaction and team success.
- Proven ability to lead by example, mentor junior engineers, and contribute to a positive team culture.
- Commitment to developing well-tested solutions to ensure high reliability and performance.

Posted 1 month ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About the Role
This is an internal document. Job Title: Senior Data Engineer

As a Senior Data Engineer, you will play a key role in designing and implementing data solutions at Kotak811. You will be responsible for leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams to deliver high-quality, scalable data infrastructure. Your expertise in data architecture, performance optimization, and data integration will be instrumental in driving the success of our data initiatives.

Responsibilities
1. Data Architecture and Design
a. Design and develop scalable, high-performance data architecture and data models.
b. Collaborate with data scientists, architects, and business stakeholders to understand data requirements and design optimal data solutions.
c. Evaluate and select appropriate technologies, tools, and frameworks for data engineering projects.
d. Define and enforce data engineering best practices, standards, and guidelines.
2. Data Pipeline Development & Maintenance
a. Develop and maintain robust, scalable data pipelines for data ingestion, transformation, and loading, for both real-time and batch use cases.
b. Implement ETL processes to integrate data from various sources into data storage systems.
c. Optimise data pipelines for performance, scalability, and reliability:
   i. Identify and resolve performance bottlenecks in data pipelines and analytical systems.
   ii. Monitor and analyse system performance metrics, identifying areas for improvement and implementing solutions.
   iii. Optimise database performance, including query tuning, indexing, and partitioning strategies.
d. Implement real-time and batch data processing solutions.
3. Data Quality and Governance
a. Implement data quality frameworks and processes to ensure high data integrity and consistency.
b. Design and enforce data management policies and standards.
c. Develop and maintain documentation, data dictionaries, and metadata repositories.
d. Conduct data profiling and analysis to identify data quality issues and implement remediation strategies.
4. ML Model Deployment & Management (a plus)
a. Design, develop, and maintain the infrastructure and processes necessary for deploying and managing machine learning models in production environments.
b. Implement model deployment strategies, including containerization and orchestration using tools like Docker and Kubernetes.
c. Optimise model performance and latency for real-time inference in consumer applications.
d. Collaborate with DevOps teams to implement continuous integration and continuous deployment (CI/CD) processes for model deployment.
e. Monitor and troubleshoot deployed models, proactively identifying and resolving performance or data-related issues.
f. Implement monitoring and logging solutions to track model performance, data drift, and system health.
5. Team Leadership and Mentorship
a. Lead data engineering projects, providing technical guidance and expertise to team members.
   i. Conduct code reviews and ensure adherence to coding standards and best practices.
b. Mentor and coach junior data engineers, fostering their professional growth and development.
c. Collaborate with cross-functional teams, including data scientists, software engineers, and business analysts, to drive successful project outcomes.
d. Stay abreast of emerging technologies, trends, and best practices in data engineering and share knowledge within the team.
   i. Participate in the evaluation and selection of data engineering tools and technologies.

Qualifications
1. 3-5 years' experience with a Bachelor's degree in Computer Science, Engineering, Technology, or a related field.
2. Good understanding of streaming technologies like Kafka and Spark Streaming.
3. Experience with enterprise Business Intelligence/data platform sizing, tuning, optimization, and system landscape integration in large-scale enterprise deployments.
4. Proficiency in at least one programming language, preferably Java, Scala, or Python.
5. Good knowledge of Agile and SDLC/CI-CD practices and tools.
6. Proven experience with Hadoop, MapReduce, Hive, Spark, and Scala programming, including in-depth knowledge of performance tuning, optimizing data processing jobs, and debugging time-consuming jobs.
7. Proven experience developing conceptual, logical, and physical data models for Hadoop, relational, EDW (enterprise data warehouse), and OLAP database solutions.
8. Good understanding of distributed systems.
9. Experience working extensively in multi-petabyte DW environments.
10. Experience engineering large-scale systems in a product environment.
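The pipeline responsibilities above (ingest, transform, load, with a data-quality gate) can be sketched in miniature in plain Python. Everything here is illustrative, not from the posting: the table, the field names, and the quality rule are invented, and sqlite3 stands in for a real data store.

```python
import sqlite3

# Illustrative raw records, as they might arrive from an ingestion source.
RAW_RECORDS = [
    {"txn_id": "T1", "amount": "120.50", "city": "kochi"},
    {"txn_id": "T2", "amount": "75.00", "city": "BENGALURU"},
    {"txn_id": "T3", "amount": None, "city": "Pune"},  # fails the quality check
]

def transform(record):
    """Normalize one record; return None if it fails basic data-quality rules."""
    if record["amount"] is None:
        return None  # quarantine instead of loading bad data
    return {
        "txn_id": record["txn_id"],
        "amount": float(record["amount"]),
        "city": record["city"].title(),
    }

def run_pipeline(conn, raw_records):
    """Extract -> transform -> load; returns (loaded, rejected) counts."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS txns (txn_id TEXT PRIMARY KEY, amount REAL, city TEXT)"
    )
    loaded = rejected = 0
    for raw in raw_records:
        clean = transform(raw)
        if clean is None:
            rejected += 1
            continue
        conn.execute("INSERT INTO txns VALUES (:txn_id, :amount, :city)", clean)
        loaded += 1
    conn.commit()
    return loaded, rejected

conn = sqlite3.connect(":memory:")
print(run_pipeline(conn, RAW_RECORDS))  # (2, 1)
```

A production pipeline of the kind the JD describes would route the rejected rows to a quarantine table for the remediation step mentioned under Data Quality and Governance, rather than silently dropping them.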

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Hyderabad

Work from Office

The Impact you will have in this role: The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. May include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and / or external clients in defining requirements and implementing solutions. The Software Engineering role specializes in planning, documenting technical requirements, crafting, developing, and testing all software systems and applications for the firm. Works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems. 
Your Primary Responsibilities:
- Act as a technical expert on one or more applications used by DTCC.
- Work with the Business System Analyst to ensure designs satisfy functional requirements.
- Partner with Infrastructure to identify and deploy optimal hosting environments.
- Tune application performance to eliminate and reduce issues.
- Research and evaluate technical solutions consistent with DTCC technology standards.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Apply different software development methodologies depending on project needs.
- Contribute expertise to the design of components or individual programs, and participate in construction and functional testing.
- Support development teams with testing, troubleshooting, and production support.
- Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements.
- Work with peers to mature ways of working, continuous integration, and continuous delivery.

Qualifications:
- Minimum of 8 years of related experience.
- Bachelor's degree preferred, or equivalent experience.

Talents Needed for Success:
- Expertise in Snowflake DB and its architecture principles and capabilities.
- Experience with data warehousing, data architecture, ETL data pipelines, and/or data engineering environments at enterprise scale built on Snowflake.
- Ability to create robust SQL procedures in Snowflake and build data pipelines in a cost-optimized, performance-efficient way.
- Proficient understanding of code versioning tools: Git, Mercurial, SVN.
- Knowledge of SDLC, testing, and CI/CD aspects such as Jenkins, BB, JIRA.
- Fosters a culture where integrity and transparency are encouraged.
- Stays on top of changes in their own specialist area and seeks out learning opportunities to keep knowledge up to date.
- Invests effort in individually coaching others.
- Builds collaborative teams across the organization.
- Communicates openly, keeping everyone across the organization advised.

Posted 1 month ago

Apply

15.0 - 20.0 years

18 - 22 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: Microsoft Azure Databricks, Python (Programming Language), Microsoft SQL Server
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage in discussions with stakeholders to gather requirements and provide insights that drive the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services.
- Good-to-have skills: Experience with Microsoft Azure Databricks, Python (Programming Language), Microsoft SQL Server.
- Strong understanding of data modeling techniques and best practices.
- Experience with cloud-based data storage solutions and data processing frameworks.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 10.0 years

13 - 20 Lacs

Bengaluru

Remote

Develop and implement solutions within Salesforce Data Cloud, focusing on data-driven insights and integrations. Design, develop, and maintain custom solutions using Apex and Lightning Web Components. Troubleshoot issues in Salesforce Data Cloud environments.

Required candidate profile: 3+ years of experience as a Salesforce Developer with a strong focus on Salesforce Data Cloud; 2 successful Salesforce Data Cloud projects completed; strong development skills in Apex and LWC.

Posted 1 month ago

Apply

16.0 - 20.0 years

50 - 60 Lacs

Noida

Work from Office

Location: Noida

Position Summary: MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future-ready. The center is integral to Global Technology and Operations, with a focus on protecting and building MetLife IP, promoting reusability, and driving experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team, with an objective to drive business value through trusted data, scaled capabilities, and actionable insights. The operating model consists of business-aligned data officers (US, Japan, and LatAm) and corporate functions, enabled by enterprise COEs: data engineering, data governance, and data science.

Role Value Proposition: MGCC is looking for an experienced practitioner to lead a portfolio of data and analytics work. This role is integral to the US Business Data Officer organization, which is structured by pillars, each accountable for delivering data and analytics solutions for a product and/or business function such as Dental, Disability, Life, Pet, or Employer (billing, call center, sales, underwriting, et al.). The role reports directly to the MGCC D&A leader, with dotted-line reporting to the respective pillar leader(s) in the US Data Officer organization. The individual, along with pillar leaders in the US, is jointly accountable for data and analytics solution delivery and value creation for those pillars, consisting of ~40+ members in MGCC. Specifically, the individual is accountable for delivering quality data and analytics solutions from MGCC. By ensuring alignment with business goals and fostering strong partnerships and collaboration with US D&A, Operations, Privacy, Risk, and Technology, the focus will be on driving execution and delivering robust solutions. The role spans all relevant D&A functions, from data infrastructure design and data engineering/modeling/analysis to data science and daily support for all deliverables.

Job Responsibilities:
- Leadership: as a key member of the MGCC D&A leadership and extended US Data Officer leadership teams, contribute to shaping the strategic imperatives and driving commercial value using data and analytics. Build strategic partnerships with Operations, Risk, and Technology leaders at the Enterprise and MGCC levels to ensure alignment and collaboration across all functions.
- Co-develop and execute the Data and Analytics roadmap for pillars (Dental & Vision, Engagement) aligned with business objectives and the Global D&A strategy. Own the Data & Analytics delivery for one of the sub-pillars (Vision).
- Typical business problems include improving associate (e.g., claims analyst, adjuster, call center associate) productivity, expediting claims servicing, enhancing provider and customer experience, identifying opportunities for connected benefits (e.g., accident & health and disability), and improving communications and engagement. Data solutions to such problems require orchestrating capabilities ranging from modernizing data infrastructure and preparing trusted data with the right data governance and quality standards, through delivering information and insights via reports and dashboards, to augmenting or automating decision-making with machine learning techniques and integrating with technologies and tools to deploy an end-to-end, enterprise-grade solution.
- In this context, in collaboration with Global D&A, enterprise COEs, and technology and architecture review boards, and based on the specific business problem: assess current-state technology and data infrastructure (on-prem, cloud, legacy technologies, technology currency), evolving information needs, and performance considerations, and recommend the future-state solution and data architecture. Architect, design, and direct the development and delivery of efficient, scalable, and trustworthy data layers. Review design options and direct the team to choose the right design for data engineering and data pipelines: Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT). Review current-state data architecture and data stores (warehouses, hubs, lakes, et al.) and recommend opportunities for efficient data storage, processing, and retrieval.
- Align the MGCC team (across locations) to the Global D&A and US Data Officer organizations, and integrate talent from a development and engagement standpoint. Provide subject matter expertise and leadership support to ~150 D&A talent based in Noida, even outside the scope of direct reports, playing the role of local coach/mentor.
- Team Development & Cultural Transformation: enable the cultural shift from a traditional operations- and technology-focused data team to a business-outcome-driven mindset; develop training programs to enhance domain and data knowledge; acquire, engage, and develop contemporary, fit-for-purpose talent.
- Proactive D&A leadership: proactively identify and propose data-driven solutions to business challenges, leading the data and analytics organization beyond a traditional service-provider role to become a strategic innovation partner driving business transformation.

Education, Technical Skills & Other Critical Requirements:
- Education: degree in information technology/computer science or a relevant domain.
- Experience (in years): 16+ years of solutions development experience, including 12+ years in data products and solutions delivery; 12+ years of insurance industry experience, or other consumer financial services experience of similar complexity; 5+ years of people leadership, talent development, and engagement.
- Technical skills: Azure, Hadoop/Databricks, Hive, SQL. Solution architecture, data architecture, data analysis, and data engineering skills. Solutioning skills to build trustworthy, efficient data layers for a variety of consumption needs such as reporting, advanced analytics, data APIs, AI, et al. Expertise in data architecture principles: centralized and decentralized approaches and their applicability. Data governance: data classification, data lineage, data profiling, data quality, data transformation, data validation, DataOps. Expertise in building and maintaining a range of data stores: data warehouses, data marts, data lakes, data mesh, etc. Expertise working with large, complex internal, external, structured, and unstructured datasets. Responsible development of data solutions. Demonstrated success in leading data and analytics teams and delivering business value through creative data solutions, while effectively interacting with multiple stakeholders in a complex organization. Agile methodologies and tools. Understanding of legacy insurance platforms and modernization initiatives. Excellent stakeholder management and executive communication. Strong conceptual and creative problem-solving skills; empathy-led engagement with a focus on stakeholder motivation, needs, and aspirations. Knowledge of HIPAA and other relevant consumer data regulations, and of emerging cloud and data trends.
- Preferred: appreciation for data science, machine learning techniques, and their business applications within the banking, financial services, and insurance industries; familiarity with new-age AI techniques such as generative AI (GenAI) and large language models (LLMs) and their business applications.
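The ETL-versus-ELT design choice called out above is mainly about where the transformation runs: in application code before loading, or inside the warehouse after loading. A toy sketch under that framing, with sqlite3 standing in for the warehouse and all table and field names invented:

```python
import sqlite3

# Illustrative raw rows as they might arrive from a source system.
raw = [("pol-1", " DENTAL "), ("pol-2", "vision")]

def etl_load(conn):
    """ETL: transform in application code first, then load only clean rows."""
    conn.execute("CREATE TABLE etl_policies (policy_id TEXT, product TEXT)")
    clean = [(pid, prod.strip().lower()) for pid, prod in raw]
    conn.executemany("INSERT INTO etl_policies VALUES (?, ?)", clean)

def elt_load(conn):
    """ELT: land the raw rows as-is, then transform inside the warehouse with SQL."""
    conn.execute("CREATE TABLE raw_policies (policy_id TEXT, product TEXT)")
    conn.executemany("INSERT INTO raw_policies VALUES (?, ?)", raw)
    conn.execute("""
        CREATE TABLE elt_policies AS
        SELECT policy_id, LOWER(TRIM(product)) AS product FROM raw_policies
    """)

conn = sqlite3.connect(":memory:")
etl_load(conn)
elt_load(conn)
print(conn.execute("SELECT * FROM elt_policies").fetchall())
# [('pol-1', 'dental'), ('pol-2', 'vision')]
```

Both paths end with identical clean tables; ELT additionally keeps the untouched raw landing table, which is one reason modern warehouse-centric stacks often prefer it.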

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 0 Lacs

Noida, New Delhi, Hyderabad

Work from Office

Required skills: Adobe AEM BE; AEM FE (React); Adobe Target; Adobe Analytics; Adobe RTCDP; Forms Lead; AJO Lead; pure Workfront developers; implementation experts; Data Architect (data modelling; AWS is mandatory); Adobe Campaign Classic implementation.

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Hyderabad, Pune

Work from Office

Overall experience: 6+ years
Location: Pune, Hyderabad

JD details:
- Understand current test data management (TDM) processes, tools, and frameworks.
- Work with the current TDM tools/frameworks to build on the existing TDM landscape.
- Define and standardize SLA and metric analysis of TDM requests.
- Coordinate with the offshore team on test data and environment challenges.

Mandatory skills: data-centric testing. Experience: 5-8 years.
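TDM work of this kind often includes masking production-like records before they reach test environments. A minimal, illustrative sketch of deterministic pseudonymization (not taken from the posting; the field names are invented):

```python
import hashlib

def mask(value: str, keep: int = 2) -> str:
    """Deterministically pseudonymize a value, keeping a short readable prefix.
    The same input always maps to the same token, so joins across masked
    tables still line up in the test environment."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"{value[:keep]}***{digest}"

record = {"name": "Anita Rao", "pan": "ABCDE1234F"}
masked = {k: mask(v) for k, v in record.items()}
print(masked["name"].startswith("An***"))  # True
```

Real TDM tools add format-preserving encryption and referential-integrity rules on top, but the determinism shown here is the property that keeps masked datasets usable for testing.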

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 37 Lacs

Bengaluru

Work from Office

100% Remote Snowflake / SQL Architect
- Architect and manage scalable data solutions using Snowflake and advanced SQL, optimizing performance for analytics and reporting.
- Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data ingestion and transformation.
- Develop best practices for data security, access control, and compliance within cloud-based data environments.
- Collaborate with cross-functional teams to understand business needs and translate them into robust data architectures.
- Evaluate and integrate third-party tools and technologies to enhance the Snowflake ecosystem and overall data strategy.

Posted 1 month ago

Apply

8.0 - 10.0 years

13 - 18 Lacs

Chennai

Work from Office

Core Qualifications
- 12+ years in software/data architecture with hands-on experience.
- Agentic AI & AWS Bedrock (must-have): demonstrated hands-on design, deployment, and operational experience with agentic AI solutions leveraging AWS Bedrock and AWS Bedrock Agents.
- Deep expertise in cloud-native architectures on AWS (compute, storage, networking, security).
- Proven track record defining technology stacks across microservices, event streaming, and modern data platforms (e.g., Snowflake, Databricks).
- Proficiency with CI/CD and IaC (Azure DevOps, Terraform).
- Strong knowledge of data modeling, API design (REST/GraphQL), and integration patterns (ETL/ELT, CDC, messaging).
- Excellent communication and stakeholder-management skills; able to translate complex tech into business value.

Preferred
- Media or broadcasting industry experience.
- Familiarity with Salesforce or other enterprise iPaaS solutions.
- Certifications: AWS/Azure/GCP Architect, Salesforce Integration Architect, TOGAF.

Mandatory Skills: Generative AI.

Posted 1 month ago

Apply

8.0 - 10.0 years

17 - 22 Lacs

Bengaluru

Work from Office

The purpose of the role is to define and develop the Enterprise Data Structure, along with Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do:
1. Define and develop data architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data using tools such as business capability models to create a data-centric view and quickly visualize what data matters most to the organization, based on the defined business strategy
c. Create data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure the implementation reflects every change request
e. Ensure that data storage and database technologies are supported by the data management and infrastructure of the enterprise
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present data repositories, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from the source to the target system/application, addressing the gaps between current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Provide oversight of all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
   i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
   ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
   iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
   iv. Define and understand current issues and problems and identify improvements
   v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
   vi. Understand the root-cause problem in integrating business and product units
   vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
   viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
   ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build an enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness
c. Collect all structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, backup, and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use, by using AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement
   i. Support the pre-sales team in presenting the entire solution design and its principles to the client
   ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
   iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Dataiku. Experience: 8-10 Years.
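The conceptual-to-physical modelling work described above often bottoms out in a star schema for the data mart layer. A minimal sketch of such a physical model, with sqlite3 standing in for the warehouse and every table and column name invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Physical model for a tiny data mart: one fact table surrounded by dimensions.
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
CREATE INDEX ix_sales_date ON fact_sales(date_key);  -- speeds typical date-range scans
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales VALUES (20240101, 1, 99.5);
""")
# Analytical queries join the fact to its dimensions on the surrogate keys.
row = conn.execute("""
SELECT d.iso_date, p.name, f.amount
FROM fact_sales f
JOIN dim_date d USING (date_key)
JOIN dim_product p USING (product_key)
""").fetchone()
print(row)  # ('2024-01-01', 'Widget', 99.5)
```

The conceptual model here is "sales happen for a product on a date"; the logical model names the entities and keys; the DDL above is one possible physical realization, where choices such as the index belong to the physical layer only.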

Posted 1 month ago

Apply

8.0 - 10.0 years

17 - 22 Lacs

Bengaluru

Work from Office

The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

1. Define and develop data architecture that aids the organization and clients in new/existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, protect the organization from disruptions, and embrace innovation
b. Assess the benefits and risks of data using tools such as business capability models to create a data-centric view that quickly visualizes which data matters most to the organization, based on the defined business strategy
c. Create data strategies and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation reflects every change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks for managing data across the organization
h. Provide insights on database storage and platforms to maximize ease of use and minimize manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition data from the source to the target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
   i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
   ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data
   iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
   iv. Define and understand current issues and problems and identify improvements
   v. Evaluate and recommend solutions that integrate with the overall technology ecosystem while keeping it consistent throughout
   vi. Understand the root-cause problems in integrating business and product units
   vii. Validate the solution/prototype from a technology, cost structure, and customer differentiation point of view
   viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
   ix. Track industry and application trends and relate them to planning current and future IT needs

2. Build an enterprise technology environment for data architecture management
a. Develop, maintain and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all implemented systems to determine their cost-effectiveness
c. Collect structured and unstructured data from different sources and integrate it into a single database
d. Work through every stage of data processing: analysis, physical data model design, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement security best practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business process, and overall support metrics (KPIs/SLAs) to drive results
d. Monitor system capabilities and performance through testing and configuration
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation to improve productivity and reduce cycle times
j. Help the support and integration teams improve efficiency and client experience, including through the use of AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
   i. Support the pre-sales team in presenting the overall solution design and its principles to the client
   ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
   iii. Demonstrate thought leadership and strong technical capability in front of the client to win their confidence and act as a trusted advisor

Mandatory Skills: Data Governance. Experience: 8-10 Years.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

Siemens Energy is looking for a talented Senior MLOps Engineer to join the Digital Core team and contribute to shaping the data architecture and strategy within the organization. In this role, you will collaborate closely with stakeholders to drive developments in the Machine Learning environment. Your responsibilities will include partnering with business stakeholders, defining backlog priorities, consulting on AI/ML solutions, supporting test automation, building CI/CD pipelines, and working on PoCs/MVPs using various hyperscaler offerings. The Siemens Energy Data Analytics & AI team is at the forefront of driving the energy transition and addressing environmental challenges. As part of this team, you will have the opportunity to work on innovative projects that reimagine the future and contribute to a sustainable world. Your impact will involve onboarding new AI/ML use cases on AWS/Google Cloud Platform, defining MLOps architecture, deploying models, working with AI/ML services like Amazon SageMaker and GCP AutoML, developing PoCs/MVPs, implementing CI/CD pipelines, writing infrastructure as code using AWS CDK scripts, providing consultancy on AI/ML solutions, and supporting test automation and code deployment. To excel in this role, you should have a Bachelor's degree in Computer Science, Mathematics, Engineering, Physics, or a related field (a Master's degree is a plus), AWS or GCP certifications in ML/AI, around 10 years of hands-on experience in ML/AI development and operations, expertise in the ML lifecycle and MLOps, proficiency in Python coding and Linux administration, experience with CI/CD pipelines, and familiarity with JIRA, Confluence, and the Agile delivery model. Additionally, you should possess excellent interpersonal, communication, and collaborative skills, be result-driven, and be fluent in both spoken and written English.
As part of the Data Platforms and Services organization at Siemens Energy, you will contribute to the company's mission of becoming a data-driven organization and supporting customers in transitioning to a more sustainable world through innovative technologies. Siemens Energy is a global energy technology company committed to providing sustainable, reliable, and affordable energy solutions while protecting the climate. With a diverse team of over 92,000 employees across 90+ countries, Siemens Energy is focused on decarbonization, new technologies, and energy transformation. Siemens Energy values diversity and inclusion, celebrating the unique contributions of individuals from over 130 nationalities. The company is dedicated to providing equal opportunities and encourages applications from individuals with disabilities. Join Siemens Energy in energizing society and driving positive change in the energy sector.
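The MLOps responsibilities above center on reproducible deployment: tracing a deployed model back to the exact trained artifact and gating promotion on recorded metrics. As a rough, hypothetical sketch in plain Python (not any Siemens Energy workflow or SageMaker API; the class and field names are invented), a minimal model registry might look like this:

```python
import hashlib

# Minimal illustration of two MLOps concerns: immutable artifact tracking
# and metric-gated promotion. All names here are hypothetical.

class ModelRegistry:
    def __init__(self):
        self._models = {}  # version id -> metadata record

    def register(self, name: str, artifact: bytes, metrics: dict) -> str:
        # Content-address the artifact: identical bytes always yield the
        # same version id, so re-registering a model is idempotent.
        version = hashlib.sha256(artifact).hexdigest()[:12]
        self._models[version] = {
            "name": name,
            "metrics": metrics,
            "size_bytes": len(artifact),
        }
        return version

    def promote(self, version: str, min_accuracy: float) -> bool:
        # Gate promotion to production on a recorded metric, a common
        # CI/CD check in ML pipelines.
        record = self._models[version]
        return record["metrics"].get("accuracy", 0.0) >= min_accuracy

registry = ModelRegistry()
v = registry.register("demand-forecast", b"\x00fake-model-bytes", {"accuracy": 0.91})
print(v, registry.promote(v, min_accuracy=0.85))
```

In a real pipeline the registry record would live in a database or a managed service, but the content-addressing idea carries over unchanged.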

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The AIML Architect (Dataflow, BigQuery) plays a crucial role within the organization by focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. Your primary responsibility will involve combining advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that improve decision-making processes across various departments. Building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in data workflows will be key. Collaboration with data engineers, data scientists, and application developers is essential to align the technical vision with business goals. Your expertise in cloud-native architectures will be instrumental in driving innovation, efficiency, and insights from vast datasets. The ideal candidate will have a strong background in data processing and AI/ML methodologies and be adept at translating complex technical requirements into scalable solutions that meet the organization's evolving needs.

Responsibilities:
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics.
- Implement machine learning algorithms to extract insights from large datasets.
- Optimize data storage and retrieval processes to enhance performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Align data workflows with business objectives through collaboration with cross-functional teams.
- Conduct technical evaluations of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship to junior data engineers and analysts.
- Stay updated with industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Requirements:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, specifically BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Proficient in programming languages like Python, Java, or Scala.
- Expertise in SQL and query optimization techniques.
- Familiarity with big data workloads and distributed computing.
- Knowledge of modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.
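The "data pipeline frameworks supporting batch analytics" mentioned in this posting follow a read, parse, group, aggregate shape. A minimal sketch in plain Python, where ordinary functions stand in for Dataflow/Beam transforms such as Map and CombinePerKey (the event schema and stage names are invented for illustration, not any Google Cloud API):

```python
from collections import defaultdict

def parse(line: str) -> tuple:
    # Parse one CSV-ish record into a (key, value) pair.
    dept, amount = line.split(",")
    return dept, float(amount)

def run_pipeline(lines):
    # Batch aggregation: the loop body plays the role of
    # beam.Map(parse) followed by beam.CombinePerKey(sum).
    totals = defaultdict(float)
    for dept, amount in map(parse, lines):
        totals[dept] += amount
    return dict(totals)

events = ["sales,120.0", "ops,40.5", "sales,30.0"]
print(run_pipeline(events))
```

In an actual Dataflow job the same stages would run distributed and could consume an unbounded (streaming) source; the per-key aggregation logic is what carries over.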

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

You have over 8 years of experience and are currently based in Balewadi, Pune. You possess a strong understanding of data architecture and are adept at leading data-driven projects. Your expertise lies in data modelling paradigms such as Kimball, Inmon, Data Marts, Data Vault, Medallion, etc. You have hands-on experience with cloud-based data strategies, with a preference for AWS. Designing data pipelines for ETL is second nature to you: you excel in ingestion, transformation, and ensuring data quality. Proficiency in SQL is a must, particularly in PostgreSQL development, query optimization, and index design. You are skilled in working with intermediate to complex SQL and Postgres PL/pgSQL for complex warehouse workflows. Your advanced SQL capabilities include using concepts like RANK and DENSE_RANK and applying statistical concepts through SQL. You have experience with PostgreSQL extensions like PostGIS and are well-versed in writing ETL pipelines that combine Python and SQL. Familiarity with data manipulation libraries in Python such as Pandas, Polars, and DuckDB is desirable. Additionally, you have experience in designing data visualizations using tools like Tableau and Power BI. In terms of responsibilities, you actively participate in designing and developing features within the existing data warehouse. You provide leadership in establishing connections between the engineering, product, and analytics/data science teams. Your role involves designing, implementing, and updating batch ETL pipelines, defining and implementing data architecture, and collaborating with engineers and data analysts to create reliable datasets. You work with various data orchestration tools such as Apache Airflow, Dagster, and Prefect. You thrive in a fast-paced start-up environment and are passionate about your work. While a background in the telecom industry is a plus, it is not a requirement. Your affinity for automation and monitoring sets you apart in your role.
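The posting calls out window functions like RANK and DENSE_RANK. The distinction is easiest to see on tied values: after a tie, RANK skips positions while DENSE_RANK does not. This demo runs the same SQL concepts against Python's built-in SQLite (3.25+ supports window functions) purely for illustration; the table and data are invented, and the production target would be PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 100.0), ("north", 70.0), ("south", 90.0)],
)

# Rank amounts within each region; the two tied 100.0 rows expose the
# RANK (gaps) vs DENSE_RANK (no gaps) difference.
rows = conn.execute("""
    SELECT region, amount,
           RANK()       OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           DENSE_RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS drnk
    FROM sales
    ORDER BY region, amount DESC
""").fetchall()
for row in rows:
    print(row)
```

In the north partition, both 100.0 rows get rank 1 under either function, but the 70.0 row is RANK 3 and DENSE_RANK 2.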

Posted 1 month ago

Apply

12.0 - 16.0 years

0 Lacs

maharashtra

On-site

As an experienced professional with 12-14 years of experience, your primary role will involve developing a detailed project plan encompassing tasks, timelines, milestones, and dependencies. You will be responsible for solution architecture design and implementation, understanding the source, and outlining the ADF structure. Your expertise will be crucial in designing and scheduling packages using ADF. Facilitating collaboration and communication within the team is essential to ensure a smooth workflow. You will also focus on application performance optimization and monitor resource allocation to ensure tasks are adequately staffed. It will be part of your responsibility to create detailed technical specifications, business requirements, and unit test report documents. Your role will require you to ensure that the project complies with best practices, coding standards, and technical requirements. Collaboration with technical leads to address technical issues and mitigate risks will be a key aspect of your job. Your primary skill set should revolve around data architecture, with additional expertise in data modeling, ETL, Azure Log Analytics, analytics architecture, BI & visualization architecture, data engineering, cost management, Databricks, Datadog, Apache Spark, Azure Data Lake, and Azure Data Factory. Your proficiency in these areas will be instrumental in successfully executing your responsibilities.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

chennai, tamil nadu

On-site

We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. Additionally, you will lead a high-performing team, foster a collaborative and innovative culture, and ensure data integrity, consistency, and availability across the organization. Our existing MDM solution is based on Microsoft Data Lake Gen2, Snowflake as the DWH, and Power BI, managing data from most of our core applications. You will be managing the existing solution and driving further development to handle additional data and capabilities, as well as supporting our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architect:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Take part in developing and implementing the MDM and analytics strategy aligned with the overall team and organizational goals.
- Collaborate with the Enterprise Architect to align on the overall strategy and application landscape, ensuring that MDM and data analytics fit into the overall ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives and support.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Actively develop business cases and proposals for IT investments and present them to senior management, executives, and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management and stakeholders.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Management advisory skills, such as strategic thinking, problem-solving, business acumen, stakeholder management, and change management.
- Strong knowledge of master data management concepts, data governance, data technology, data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels.
- Team player; result-oriented and structured, with attention to detail, a drive for accuracy, and a strong work ethic.

**Special Competencies required:**
- Proven leader with excellent structural skills, good at documenting as well as presenting.
- Strong executional skills: not only generating ideas but getting things done that create value for the entire organization.
- Proven experience working with analytics tools as well as data ingestion and platforms such as Power BI, Azure Data Lake, and Snowflake.
- Experience working with an MDM solution, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.

Join us at the ASSA ABLOY Group, where our innovations make spaces, physical and virtual, safer, more secure, and easier to access. As an employer, we value results and empower our people to build their careers around their aspirations and our ambitions. We foster diverse, inclusive teams and welcome different perspectives and experiences.

Posted 1 month ago

Apply

15.0 - 25.0 years

20 - 27 Lacs

Noida, Chandigarh, Hyderabad

Work from Office

Total experience of 15+ years, with at least 5 years in solution architecture. Should have designed and implemented enterprise-level, high-performance, secure, microservices-based systems on .NET/Java/Python. Experienced in defining architecture frameworks (APIs, integrations, cloud services, data pipelines) aligned with business goals. Hands-on experience with cloud platforms such as Azure, AWS, or GCP, including containerization technologies like Docker and Kubernetes. Strong understanding of data architecture, ETL processes, and analytics platforms. Should be able to architect cloud-native solutions using AWS, Azure, or GCP, leveraging containers (Docker/Kubernetes) and infrastructure-as-code (Terraform, ARM). Lead and mentor cross-functional teams, fostering a culture of innovation and continuous improvement. Stay current with emerging tech (AI/ML, RPA, generative AI, vector search) and advise on adoption. Collaborate with clients to understand their business needs and translate them into technical solutions that drive value. Experience in Agile software development methodologies. Soft Skills: Excellent problem-solving abilities, communication skills, and a proactive approach to stakeholder management. Certification: Relevant certifications such as Azure Solutions Architect, AWS Certified Solutions Architect, or TOGAF.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As the Senior Director - Enterprise Head of Architecture at AstraZeneca, you will play a crucial role in shaping the architecture landscape across the Enterprise Capabilities & Solutions (ECS) domain. Your responsibilities will include collaborating with other Heads of Architecture to align ECS architectures with AstraZeneca's business and IT strategies. You will be instrumental in developing and implementing AZ standards, patterns, and roadmaps to ensure architectural conformity within the ECS landscape. Additionally, you will work closely with key business teams, IT Leadership, and other Heads of Architecture to drive AstraZeneca's Digital and Architectural vision in line with business strategies. Your role will involve providing Enterprise Design thinking and support across the Enterprise, application, and infrastructure domains of architecture. You will define the architecture strategy, direction, and standards specific to the ECS Segment, with a focus on Data & Analytics architecture capabilities and strategy. Leading a team of Architects, you will oversee the end-to-end delivery of architecture solutions, develop function-specific reference architectures, and manage relationships with Data & Analytics leadership to influence the adoption of enterprise architecture frameworks. As a strategic advisor and authority within the architecture community, you will refine architecture strategies, standards, and patterns as needed, ensuring continuous alignment with business priorities. You will lead the development of Architecture Roadmaps and Blueprints, contribute to multi-functional decision-making bodies, and act as a sign-off authority for solution architectures and roadmaps. Championing ECS EA initiatives, you will engage with external architecture authorities, drive the adoption of new technology, and ensure regulatory compliance while fostering team engagement. 
To excel in this role, you should possess a Bachelor's degree in Computer Science or a related field, along with extensive experience in ECS solutions and a blend of data architecture, analysis, and engineering skills. Knowledge of cloud-based containerization strategies, data modeling technologies, and industry architectural patterns is essential. Desirable skills include a post-graduate degree in MIS, experience in Agile data definition scrums, and familiarity with metadata cataloguing tools and Cloud Economics. If you are ready to make a difference and contribute to redefining the development of life-changing medicines while embracing digital technology and data solutions, join us at AstraZeneca in our unique and daring world. Apply now to be a part of our journey towards becoming a digital and data-led enterprise.

Posted 1 month ago

Apply

10.0 - 15.0 years

15 - 19 Lacs

Pune

Work from Office

At Solidatus, we're changing how organisations understand their data. We're an award-winning, venture-backed software company often called the Git for Metadata. Our platform helps businesses harvest, model, and visualise complex data lineage flows. Our unique lineage-first approach, combined with active AI development, provides organisations with unparalleled clarity and robust control over their data's journey and meaning. As a growing B2B SaaS business with a great, collaborative culture, we invite you to join us as we expand globally and define the future of data understanding!

Role Overview
We are seeking an experienced Data Architect to lead the design and implementation of data lineage solutions that align with our clients' business objectives. This role involves collaborating with cross-functional teams to ensure the integrity, accuracy, and timeliness of their data lineage solution. You will work directly with our clients, helping them get maximum value from our product and ensuring they achieve their contractual goals.

Key Responsibilities
Design and implement robust data lineage solutions that support business intelligence, analytics, and data governance initiatives. Collaborate with stakeholders to understand data lineage requirements and translate them into technical and business solutions. Develop and maintain lineage data models, semantic metadata systems, and data dictionaries. Ensure data quality, security, and compliance with relevant regulations. Understand Solidatus implementation and data lineage modelling best practice and ensure it is followed at our clients. Stay abreast of emerging technologies and industry trends to continuously improve data lineage architecture practices.

Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience in data architecture, with a focus on large-scale data systems, at more than one company. Proficiency in data modelling, database design, and data warehousing concepts. Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark). Strong understanding of data governance, data quality, and data security principles. Excellent communication and interpersonal skills, with the ability to work effectively in a collaborative environment.

Why Join Solidatus?
Be part of an innovative company shaping the future of data management. Collaborate with a dynamic and talented team in a supportive work environment. Opportunities for professional growth and career advancement. Flexible working arrangements, including hybrid work options. Competitive compensation and benefits package.
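A core question a lineage model answers is impact analysis: if a source asset changes, which downstream assets are affected? As a rough illustration of the underlying idea (not the Solidatus product or its API; every asset name below is invented), lineage can be treated as a directed graph and walked transitively:

```python
# edges: upstream asset -> assets derived from it (all names hypothetical)
LINEAGE = {
    "crm.customers": ["staging.customers"],
    "staging.customers": ["warehouse.dim_customer"],
    "warehouse.dim_customer": ["report.churn", "report.revenue"],
    "erp.orders": ["warehouse.fact_orders"],
    "warehouse.fact_orders": ["report.revenue"],
}

def downstream(asset: str) -> set:
    """Return every asset transitively derived from `asset`."""
    seen, stack = set(), [asset]
    while stack:
        for child in LINEAGE.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

# Changing the CRM source affects the staging table, the dimension,
# and both reports built on it.
print(sorted(downstream("crm.customers")))
```

Running the traversal the other way (child to parents) gives provenance: where a report's numbers actually come from, which is the lineage-first view the posting describes.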

Posted 1 month ago

Apply

3.0 - 6.0 years

16 - 18 Lacs

Pune

Work from Office

In this role, you will be part of the product development team, managing and delivering new product functionalities and modifying or improving existing ones as required. The Developer will work with the Product Manager and Engineering Manager, with minimal technical guidance, within the software development team on the design, development, and testing of software programs for various cloud ecosystems. You will work within a multi-disciplined engineering team consisting of electronics engineers, mechanical engineers, firmware engineers, software engineers, programmers, and scientists focused on applied research and new technology innovations to provide new and improved products and IoT solutions for our customers in the Building Management System domain.

How you will do it
Provide third-level support to branch technicians and engineers. Maintain released products and data pipelines. Liaise with other departments including Product Support, Technical Authors, and SQA. Design software code, technical specifications, and feasibility studies. Participate in analysis, coding, and unit testing. Identify, analyze, and resolve complex cloud IoT inadequacies. Review and provide feedback on product functional specifications. Assist with compliance, approvals, and factory testing as needed. Participate in product development meetings, design reviews, and code reviews. Prepare documentation per ISO QMS guidelines and participate in Quality Management System reviews. Make recommendations for changes to product development guidelines and standards. Comply with established development guidelines and standards. Develop an in-depth understanding of the development realm through interaction with other groups, communication with external experts and suppliers, and independent research. Work on estimation, design, analysis, coding, and unit testing.

What we look for
3-6 years of relevant data pipeline design, development, and testing experience. Product development experience preferred. Working knowledge of building automation and industrial automation systems is an added advantage.

Skills
- Experience: 3+ years in Big Data development.
- Technologies: Proficiency with Snowflake, Postgres, Apache Spark, KSQL, open table formats, and Flink.
- Data Retention: Knowledge of hot and cold storage solutions.
- Building Management Systems: Experience in integrating data-driven insights.
- Collaboration: Ability to work with cross-functional teams.
- Data Governance: Strong understanding of data governance practices.

Responsibilities
- Designing Data Management Frameworks: Develop and implement data strategies, create data models, and manage data warehouses.
- Ensuring Data Security and Compliance: Implement access controls, encryption, and other security measures.
- Implementing Data Management Processes: Oversee data systems health, define KPIs, and recommend system enhancements.
- Building Data Models and Strategies: Construct data models and devise strategies for data management.
- Collaborating Across Teams: Work with stakeholders to ensure data architecture meets organizational needs.
- Research and Development: Stay updated on data management trends and explore new tools.
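The "Data Retention" skill above refers to tiering: keeping recent records in fast, expensive hot storage and moving older ones to cheap cold storage. A minimal sketch of such a policy in plain Python; the 30-day threshold and tier names are illustrative assumptions, not any product's defaults:

```python
from datetime import date, timedelta

HOT_DAYS = 30  # assumed retention window for the hot tier

def storage_tier(record_date: date, today: date) -> str:
    # Records at or under the hot window stay hot; older ones go cold.
    age = today - record_date
    return "hot" if age <= timedelta(days=HOT_DAYS) else "cold"

today = date(2024, 6, 30)
print(storage_tier(date(2024, 6, 20), today))  # recent telemetry -> hot
print(storage_tier(date(2023, 1, 5), today))   # historical data  -> cold
```

A real building-management pipeline would apply a rule like this in a scheduled job that relocates or re-labels partitions, but the decision function is the whole policy.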

Posted 1 month ago

Apply

2.0 - 7.0 years

6 - 11 Lacs

Noida, Mohali, Bengaluru

Work from Office

Database Engineer | Mohali, Noida, Bangalore / India | 2+ years experience | 1 Position

Job Information:
Work Experience: 2+ years
Industry: IT Services
Job Type: Full time
Location: Mohali, Noida, Bangalore / India

Role Overview:
We are seeking a proactive and eager-to-learn NoSQL DBA and Data Warehouse Specialist with a focus on the Amazon Web Services (AWS) ecosystem. In this role, you will support the management of cloud-native databases and assist in building and maintaining the customer's data warehouse infrastructure on AWS. This position offers a hands-on opportunity to grow in the fields of cloud database administration and modern data architecture.

Key Responsibilities:
AWS NoSQL Database:
Assist in maintaining AWS-managed NoSQL databases such as Amazon DynamoDB, Amazon ElastiCache (Redis/Memcached), and Amazon DocumentDB.
Monitor database metrics using Amazon CloudWatch and basic performance tools.
Help manage database backup, recovery, and security configurations using AWS Backup and IAM roles/policies.
Collaborate with DevOps teams to support database deployments via AWS CloudFormation or Terraform.
AWS Data Warehouse:
Support the implementation and maintenance of data warehouse solutions using Redshift and Databricks.
Help build and monitor ETL/ELT pipelines using AWS Glue, Amazon S3, Lambda, and Step Functions.
Assist in data loading, cleaning, validation, and documentation.
Work with BI and analytics teams to provide clean, query-ready datasets for reporting.

Requirements:
Technical Skills:
Bachelor's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
Familiarity with AWS services, especially DynamoDB, S3, Lambda, and Redshift.
Basic understanding of SQL and NoSQL database concepts.
Exposure to scripting languages such as Python or Bash.
Understanding of core AWS concepts: IAM, VPC, CloudWatch, and S3.
Soft Skills:
Strong attention to detail and a passion for learning cloud technologies.
Good problem-solving and troubleshooting skills.
Ability to take direction, collaborate, and communicate effectively.
Comfortable working in a fast-paced, team-oriented environment.

Preferred Skills:
AWS Certified Cloud Practitioner, or working towards AWS Developer/Database certifications.
Experience with AWS Glue or Step Functions.
Internship or academic project using AWS services or cloud databases.
Exposure to data visualization tools like QuickSight, Tableau, or Power BI.

Interview Process:
Internal Assessment
Technical Round 1
Technical Round 2
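The "data loading, cleaning, validation" responsibility above could look roughly like this before rows are handed to an ETL load step. A minimal sketch in plain Python; the field names and validation rules are hypothetical, not from the posting.

```python
def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is
    clean. Field names and rules are illustrative only."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    return errors

rows = [{"id": "a1", "amount": 10.5}, {"id": "", "amount": -3}]
clean = [r for r in rows if not validate_record(r)]
print(len(clean))  # 1
```

In an AWS pipeline, a check like this would typically run inside a Glue job or Lambda function, with rejected rows written to a quarantine location for review.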

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Thiruvananthapuram

Work from Office

Collaborate with business stakeholders to gather and translate data requirements into analytical solutions.
Analyze large and complex datasets to identify trends, patterns, and actionable insights.
Design, develop, and maintain interactive dashboards and reports using Elasticsearch/Kibana or Power BI.
Conduct ad-hoc analyses and deliver data-driven narratives to support business decision-making.
Ensure data accuracy, consistency, and integrity through rigorous validation and quality checks.
Write and optimize SQL queries, views, and data models for reporting and analysis.
Present findings through compelling visualizations, presentations, and written summaries.
Work closely with data engineers and architects to enhance data pipelines and infrastructure.
Contribute to the development and standardization of KPIs, metrics, and data governance practices.

Required Skills (Technical Competency):
Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
5+ years of experience in a data analyst or business intelligence role.
Proficiency in SQL and data visualization tools such as Power BI, Kibana, or similar.
Proficiency in Python, Excel, and data storytelling.
Understanding of data modelling, ETL concepts, and basic data architecture.
Strong analytical thinking and problem-solving skills.
Excellent communication and stakeholder management skills.
Adherence to Information Security Management policies and procedures.

Desired Skills:
Elasticsearch/Kibana, Power BI, AWS, Python, SQL, data modelling, data analysis, data quality checks, data validation, data visualization, stakeholder communication, Excel, data storytelling, team collaboration, problem-solving, analytical thinking, presentation skills, ETL concepts.
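The SQL and KPI duties listed above can be exercised end to end with Python's built-in sqlite3 module. A minimal sketch; the orders table and the revenue-per-region KPI are hypothetical examples, not details from the posting.

```python
import sqlite3

# In-memory database with a hypothetical orders table (illustrative only)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# KPI: total revenue per region, highest first
kpi = conn.execute(
    "SELECT region, SUM(amount) AS revenue FROM orders "
    "GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(kpi)  # [('north', 150.0), ('south', 75.0)]
conn.close()
```

The same aggregation pattern carries over directly to a production warehouse query feeding a Power BI or Kibana dashboard.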

Posted 1 month ago

Apply