
1817 Data Architecture Jobs - Page 27

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

11 - 15 Lacs

Kolkata

Work from Office

Specialism: Data, Analytics & AI. Management Level: Senior Associate.

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Responsibilities: Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Microsoft Azure Data & Analytics Services. Provide technical leadership and thought leadership as a senior member of the Analytics Practice in areas such as data access and ingestion, data processing, data integration, data modeling, database design and implementation, data visualization, and advanced analytics. Engage and collaborate with customers to understand business requirements/use cases and translate them into detailed technical specifications. Develop best practices, including reusable code, libraries, patterns, and consumable frameworks, for cloud-based data warehousing and ETL. Maintain best-practice standards for the development of cloud-based data warehouse solutions, including naming standards. Design and implement highly performant data pipelines from multiple sources using Apache Spark and/or Azure Databricks. Integrate the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained. Work with other members of the project team to support delivery of additional project components (API interfaces). Evaluate the performance and applicability of multiple tools against customer requirements. Work within an Agile delivery/DevOps methodology to deliver proof of concept and production implementations in iterative sprints. Integrate Databricks with other technologies (ingestion tools, visualization tools).

Requirements: Proven experience working as a data engineer. Highly proficient in the Spark framework (Python and/or Scala). Extensive knowledge of data warehousing concepts, strategies, and methodologies. Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably in Databricks). Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics. Experience in designing and hands-on development of cloud-based analytics solutions. Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, and Azure App Service is required. Experience designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Thorough understanding of Azure cloud infrastructure offerings. Strong experience in common data warehouse modeling principles, including Kimball. Working knowledge of Python is desirable. Experience developing security models. Databricks and Azure Big Data Architecture certification would be a plus.

Mandatory skill sets: ADE, ADB, ADF. Preferred skill sets: ADE, ADB, ADF. Years of experience required: 4-8 years. Education qualification: BE, B.Tech, MCA, M.Tech. Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering. Required Skills: ADF Business Components, ADL Assistance, Android Debug Bridge (ADB), Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
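For context on the kind of work this posting describes, here is a minimal PySpark sketch of a Databricks-style batch pipeline: raw files landed in a lake are cleansed and written to a curated zone. The storage paths, column names, and schema are illustrative assumptions, not part of the posting.

```python
# Minimal sketch of a lake-to-curated batch pipeline in PySpark.
# All paths and columns below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curated-ingest").getOrCreate()

# Read raw CSV landed by an upstream copy activity (hypothetical path)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/"))

# Basic cleansing and conformance
curated = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .filter(F.col("amount") > 0))

# Write partitioned Parquet to the curated zone
(curated.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales/"))
```

In an ADF-orchestrated setup of the sort described above, a pipeline activity would typically trigger this job and pass the paths in as parameters.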

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Chennai

Remote

Healthcare experience is mandatory. Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities: Data Architecture & Modeling: - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership: - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise: - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting, and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality: - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems and data sources Required Qualifications: Technical Skills: - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge: - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication: - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with the ability to work with ambiguous requirements - Excellent communication skills with the ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications: - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within a healthcare organization

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

HashiCorp, an IBM Company, solves development, operations, and security challenges in infrastructure so organizations can focus on business-critical tasks. We build products to give organizations a consistent way to manage their move to cloud-based IT infrastructures for running their applications. Our products enable companies large and small to mix and match AWS, Microsoft Azure, Google Cloud, and other clouds as well as on-premises environments, easing their ability to deliver new applications. At HashiCorp, we have used the Tao of HashiCorp as our guiding principles for product development and operate according to a strong set of company principles for how we interact with each other. We value top-notch collaboration and communication skills, both among internal teams and in how we interact with our users. What you'll do (responsibilities): As a Senior Engineer, you will contribute to the development, operation, and enhancement of cloud offerings. With at least 8 years of experience in software engineering, cloud computing, and operational excellence, you will play a vital role in ensuring our managed services are reliable, scalable, and secure, meeting the sophisticated needs of our global customer base. Contribute to the architecture, development, and scaling of data plane services, ensuring best practices in cloud service delivery. Implement robust monitoring and alerting frameworks to guarantee high availability and performance of our managed services. Work closely with product teams, platform teams, and security to align development efforts, enhance product integrations, and ensure a cohesive user experience. Leverage feedback from customers and support teams to drive product enhancements, focusing on reducing operational toil and improving service usability. Partner with the security team to fortify service security and comply with regulatory standards, maintaining HashiCorp's reputation for trust and reliability. Stay at the forefront of cloud technologies and practices, advocating for and implementing solutions that enhance our managed service offerings. Required education: Bachelor's Degree. Required technical and professional expertise: 8+ years of software engineering experience, with a focus on cloud services and infrastructure. Proficiency in Go, with experience in other languages (Python, Ruby, etc.) considered a plus. Extensive knowledge of cloud computing platforms (AWS, Azure, GCP) and experience with infrastructure as code (Terraform). Demonstrated ability in developing and managing production-grade cloud services, with a strong understanding of operational reliability and security practices. Excellent problem-solving skills, with the ability to collaborate effectively across diverse teams. Passionate about improving operational processes and delivering customer-centric solutions. Preferred technical and professional experience: Familiarity with HashiCorp and IBM products and services.

Posted 1 month ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Mumbai

Work from Office

As a Consultant, you are responsible for developing application designs and providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new mobile solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Configure DataStax Cassandra as per the requirements of the project solution. Design the database system specific to Cassandra in consultation with the data modelers, data architects, and ETL specialists, as well as the microservices/functional specialists, to produce an effective Cassandra database system that meets the solution's and client's needs and specifications. Interface with functional and data teams to ensure the integrations with other functional and data systems are working correctly and as designed. Participate in responsible or supporting roles in different tests or UAT that involve the DataStax Cassandra database. The role will also need to ensure that the Cassandra database is performant and error-free; this involves troubleshooting errors and performance issues, resolving them, and planning for further database improvement. Ensure the database documentation and operation manual is up to date and usable. Preferred technical and professional experience: Expertise, experience, and deep knowledge in the configuration, design, and troubleshooting of NoSQL server software and related products on Cloud, specifically DataStax Cassandra. Knowledge/experience of other NoSQL/Cloud databases. Installs, configures, and upgrades RDBMS or NoSQL server software and related products on Cloud.
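As an illustration of the Cassandra design work described above (query-driven data modeling and replication configuration), here is a hedged sketch using the open-source Python driver (cassandra-driver). The contact points, keyspace, and table names are hypothetical.

```python
# Illustrative sketch with the cassandra-driver Python client.
# Hosts, keyspace, and table below are placeholders.
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1", "10.0.0.2"])   # cluster contact points
session = cluster.connect()

# Keyspace with a replication strategy suited to a multi-datacenter cluster
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS orders_ks
    WITH replication = {'class': 'NetworkTopologyStrategy', 'dc1': 3}
""")
session.set_keyspace("orders_ks")

# Cassandra tables are modeled around query patterns: here the partition
# key is customer_id and rows cluster by order_ts, newest first.
session.execute("""
    CREATE TABLE IF NOT EXISTS orders_by_customer (
        customer_id uuid,
        order_ts timestamp,
        order_id uuid,
        total decimal,
        PRIMARY KEY (customer_id, order_ts)
    ) WITH CLUSTERING ORDER BY (order_ts DESC)
""")
cluster.shutdown()
```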

Posted 1 month ago

Apply

10.0 - 15.0 years

7 - 12 Lacs

Bengaluru

Work from Office

As a Senior z/OS System Programmer / Lead Development Engineer, you will be involved in developing automation solutions to provision and manage infrastructure across your organization. As a developer, you will leverage the capabilities of Terraform and cloud offerings to drive infrastructure-as-code capabilities for the IBM z/OS platform. You will work closely with frontend engineers as part of a full-stack team, and collaborate with Product, Design, and other cross-functional partners to deliver high-quality solutions. Maintain high standards of software quality within the team by establishing good practices and habits. Focus on growing capabilities to support and enhance the experience of the offering. Required education: Bachelor's Degree. Required technical and professional expertise: 10+ years of software development experience with z/OS or z/OS subsystems. * 8+ years of professional experience developing with Golang, Python, and Ruby * Hands-on experience with z/OS system programming or administration * Experience with Terraform key features like Infrastructure as Code, change automation, and auto-scaling. * Experience working with cloud providers such as AWS, Azure, or GCP, with a focus on scalability, resilience, and security. * Cloud-native mindset and solid understanding of DevOps principles in a cloud environment * Familiarity with cloud monitoring tools to implement robust observability practices that prioritize metrics, logging, and tracing for high reliability and performance. * Extensive experience with cloud computing platforms (AWS, Azure, GCP) and infrastructure as code (Terraform). * Strong interest in customer-focused work, with experience collaborating with Design and Product Management functions to deliver impactful solutions. * Demonstrated ability to tackle complex technical challenges and deliver innovative solutions. * Excellent communication and collaboration skills, with a focus on customer satisfaction and team success. * Strong analytical, debugging, and problem-solving skills to analyse issues and defects reported by customer-facing and test teams. * Proficient in source control management tools (e.g., GitHub) and Agile Life Cycle Management tools. * Soft Skills: Strong communication, collaboration, self-organization, self-study, and the ability to accept and respond constructively to critical feedback.

Posted 1 month ago

Apply

15.0 - 20.0 years

20 - 35 Lacs

Chennai, Bengaluru, Thiruvananthapuram

Hybrid

Data Architect Exp: 15+ Yrs Location: Bengaluru, Chennai, TVM Role Overview We are seeking a highly experienced Data Architect with deep expertise in designing and managing large-scale, complex data architectures, along with exposure to Machine Learning (ML) and Artificial Intelligence (AI). This role is key to driving our data strategy, integrating advanced AI/ML capabilities into data ecosystems, and supporting high-stakes proposals for global clients. The ideal candidate combines technical depth with strategic vision, bridging traditional data systems with next-generation AI-driven solutions to ensure scalability, security, and innovation. Key Responsibilities • Enterprise Data Architecture: Lead the design and implementation of comprehensive data architectures for large, complex organizations, ensuring systems are robust, efficient, and meet business needs. • End-to-End Data Solution Management: Architect and manage end-to-end data solutions, from data ingestion and processing through storage, analytics, and reporting, ensuring high performance and data integrity. • Strategic Proposal Support: Collaborate with sales and business development teams to support RFPs and client proposals, offering data architecture expertise that drives competitive advantage and innovation. • Complex Data Projects Oversight: Oversee complex data initiatives, making architectural decisions for data modeling, ETL, data warehousing, big data, and cloud data platforms. • Technology Innovation & Best Practices: Identify, evaluate, and implement emerging data technologies to enhance architectural strategy, keeping solutions cutting-edge and cost-effective. • Stakeholder Engagement & Collaboration: Act as a key liaison with business and technical stakeholders, translating data architecture strategies into actionable solutions that align with organizational goals and regulatory standards. Technical Skills • Data Architecture & Modelling: Advanced knowledge of data modelling techniques (conceptual, logical, physical), database design, and normalization theory. Expertise in relational and NoSQL databases, especially with systems like SQL Server, Oracle, PostgreSQL, MongoDB, Cassandra, DynamoDB, or similar. Proficiency in ERD/logical data design tools such as ERwin, PowerDesigner, or DBeaver. • ETL & Data Integration: Extensive experience with ETL/ELT tools such as Informatica, Talend, Apache NiFi, or DBT (Data Build Tool). Strong understanding of data integration platforms and real-time data processing (e.g., Apache Kafka, AWS Glue, Azure Data Factory, Databricks). • Big Data & Cloud Platforms: Experience with big data ecosystems (Hadoop, Spark, Hive) and stream processing systems (Kafka Streams, Flume, etc.). Strong understanding and hands-on experience with cloud platforms (AWS, Google Cloud, Azure) and their database services (e.g., Redshift, BigQuery, Snowflake, or Azure Synapse). • Data Warehousing & BI Tools: Experience building and maintaining data warehouses, data marts, and OLAP cubes. Knowledge of Business Intelligence tools such as Tableau, Power BI, Looker, or QlikView. Familiarity with data lake architectures and cataloging frameworks for hybrid data storage solutions (e.g., S3, ADLS, Delta Lake). • Security & Compliance: Strong understanding of data security, encryption, tokenization, and access control best practices. Familiarity with governance frameworks and data compliance regulations (e.g., GDPR, CCPA, HIPAA, and SOX) to implement secure architecture and standardized procedures. • Programming & Scripting: Strong scripting and programming skills in Python, Java, Scala, or similar languages. Proficiency in SQL for querying databases and performing advanced data analysis. Qualifications • Experience: 1. 15+ years in data architecture with a strong emphasis on large-scale, complex enterprise data solutions. 2. Significant experience in supporting sales and proposal development with a focus on high-value, strategic engagements. • Education: Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, or a related field. Advanced certifications (e.g., AWS Certified Big Data Specialty, Azure Solutions Architect Expert) are highly desirable. • Certifications: 1. Certifications in cloud platforms like AWS Certified Big Data Architect, Microsoft Azure Data Architect, or similar are a plus. 2. TOGAF, CDMP (Certified Data Management Professional), or other relevant data architecture/management certifications are added advantages.
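To make the real-time integration skills listed above concrete, here is a small consumer sketch using the kafka-python client. The topic, broker address, and event fields are assumptions for illustration only.

```python
# Hedged sketch of stream consumption with kafka-python.
# Topic name, brokers, and payload shape are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "policy-events",                       # hypothetical topic
    bootstrap_servers=["broker1:9092"],
    group_id="architecture-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream, events like this would be validated and landed in a
    # lake or warehouse zone per the governance standards described above.
    print(event.get("event_type"), message.offset)
```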

Posted 1 month ago

Apply

8.0 - 10.0 years

5 - 15 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Job Title: Azure Data Architect Experience: 8 to 10 years Location: Pan India Employment Type: Full-Time Notice period: Immediate to 30 days Technology: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling Key Responsibilities: Requirement gathering and analysis. Design of data architecture and data models to ingest data. Experience with different databases like Synapse, SQL DB, Snowflake, etc. Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse. Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases. Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage. Implement data security and governance measures. Monitor and optimize data pipelines for performance and efficiency. Troubleshoot and resolve data engineering issues. Hands-on experience with Azure Functions and other components like real-time streaming. Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations. Provide optimized solutions for any problem related to data engineering. Ability to work with a variety of sources like relational DBs, APIs, file systems, real-time streams, CDC, etc. Strong knowledge of Databricks and Delta tables.
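As a sketch of the Delta-table work this posting calls out, the following PySpark snippet performs an idempotent MERGE upsert from a staging path into a Delta table, a common pattern in ADF/Databricks pipelines. The table name, path, and join key are assumptions.

```python
# Minimal Delta Lake upsert sketch (Databricks/PySpark).
# "silver.customers" and the staging path are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
updates = spark.read.parquet("/mnt/staging/customers/")  # hypothetical staging path

target = DeltaTable.forName(spark, "silver.customers")
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()      # update rows that already exist
 .whenNotMatchedInsertAll()   # insert brand-new rows
 .execute())
```

Because MERGE is keyed on customer_id, re-running the same batch leaves the table unchanged, which is what makes the load idempotent.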

Posted 1 month ago

Apply

7.0 - 12.0 years

22 - 37 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead - Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM to 11 PM IST - Experience: 5 to 12+ years (based on role & grade) Open Grades/Roles: Senior Software Engineer: 5-8 years; Tech Lead: 7-10 years; Senior Tech Lead: 10-12+ years. Job Description - Data Engineering Team Core Responsibilities (Common to All Levels): Design, build, and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar. Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB). Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs. Participate in data modeling (ER/DW/Star schema), data quality checks, and data integration. Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M). Ensure code versioning and documentation standards are followed (Git/Bitbucket). Additional Responsibilities by Grade: Senior Software Engineer (5-8 Yrs): Focus on hands-on development of ETL pipelines, data models, and data inventory. Assist in architecture discussions and POCs. Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure. Tech Lead (7-10 Yrs): Lead mid-sized data projects and small teams. Decide on ETL strategy (Push Down/Push Up) and performance tuning. Strong working knowledge of orchestration tools, resource management, and agile delivery. Senior Tech Lead (10-12+ Yrs): Drive data architecture, infrastructure decisions, and internal framework enhancements. Oversee large-scale data ingestion, profiling, and reconciliation across systems. Mentor junior leads and own stakeholder delivery end-to-end. Advantageous: Experience with AdTech/Marketing data, Hadoop ecosystem (Hive, Spark, Sqoop). Must-Have Skills (All Levels): ETL Tools: Pentaho / Talend / SSIS / Informatica. Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery. Orchestration: Airflow / Autosys / Control-M / JAMS. Modeling: Dimensional Modeling, ER Diagrams. Scripting: Python or Perl (preferred). Agile environment, Git-based version control. Strong communication and documentation.
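Since the posting centers on ETL pipelines orchestrated with scheduler tools like Airflow, here is a minimal illustrative Airflow DAG wiring an extract-transform-load sequence. The task bodies are stubs and the daily schedule is an assumption.

```python
# Illustrative Airflow DAG: extract -> transform -> load.
# Task bodies are placeholders; schedule and dag_id are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source systems")

def transform():
    print("apply business rules")

def load():
    print("load into the warehouse")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3   # >> sets task ordering
```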

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune Experience level: 7+ years About the Role We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation. Key Responsibilities: Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements. Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data. Implement Snowflake-based data warehouses, data lakes, and data integration solutions. Manage data ingestion, transformation, and loading processes to ensure data quality and performance. Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals. Drive continuous improvement by leveraging the latest Snowflake features and industry trends. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field. 8+ years of experience in data architecture, data engineering, or a related field. Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions. Must have working exposure to Airflow. Proven track record of contributing to data projects and working in complex environments. Familiarity with cloud platforms (e.g., AWS, GCP) and their data services. Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
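To illustrate the Snowflake development work described, here is a short sketch using the official Snowflake Python connector to bulk-load a table from a stage and verify the row count. The account, credentials, stage, and table names are placeholders.

```python
# Hedged sketch with the Snowflake Python connector.
# Every identifier and credential below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # COPY INTO is Snowflake's bulk-load path from a stage into a table
    cur.execute("""
        COPY INTO sales
        FROM @landing_stage/sales/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    cur.execute("SELECT COUNT(*) FROM sales")
    print(cur.fetchone()[0])   # quick sanity check on loaded rows
finally:
    cur.close()
    conn.close()
```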

Posted 1 month ago

Apply

15.0 - 20.0 years

20 - 30 Lacs

Noida, Gurugram

Hybrid

Design architectures using Microsoft SQL Server and MongoDB. Develop ETL pipelines and data lakes. Integrate reporting tools like Power BI, Qlik, and Crystal Reports into the data strategy. Implement AWS cloud services (PaaS, SaaS, IaaS), SQL and NoSQL databases, and data integration.

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Noida

Work from Office

As a Sr. Data Engineer, you will lead the design, development, and management of investments in the data platform and data pipelines. You will identify, monitor, and lead initiatives to ensure our data platform remains scalable, reliable, and efficient in light of the evolving data requirements of our products and services. You will work closely with solution experts to design, iterate, and develop key pipelines to unlock new solution functionality, analytical insights, and machine-learning features. You will be adept at partnering with cross-functional partners and data users to translate needs into technical solutions, and at leading the technical scoping, implementation, and general execution of improvements to our solutions and platform. You will be data-curious and excited to have an impact on the team and in the company and to improve the quality of healthcare operations. Key Responsibilities Spearhead the discovery, evaluation, and integration of new datasets (incl. pipeline development and data modeling/documentation), working closely with key data stakeholders to understand their impact and relevance to our core products and the healthcare domain. Translate product/analytical vision into highly functional data pipelines supporting high-quality, highly trusted data products (incl. designing data structures, building and scheduling data transformation pipelines, improving transparency, etc.). Set the standard for data engineering practices within the company, guiding the architectural approaches, data pipeline designs, and the integration of cutting-edge technologies to foster a culture of innovation and continuous improvement. Key Qualifications Excellent cross-functional communication - the ability to break down complex technical components for technical and non-technical partners alike. Innate aptitude for interpreting complex datasets, with demonstrated ability to discern underlying patterns, identify anomalies, and extract meaningful insights, demonstrating advanced data intuition and analytical skills (healthcare experience preferred). Excellence in quality data pipeline design, development, and optimization to create reliable, modular, secure data foundations for the organization's data delivery system, from applications to analytics and ML. Proven ability to independently handle ambiguous project requirements and lead data initiatives from start to finish, while collaborating extensively with cross-functional, non-technical teams to inform and shape product development. Nice to Have Skills 5+ years of experience designing, building, and operating cloud-based, highly available, observable, and scalable data platforms utilizing large, diverse data sets in production to meet ambiguous business needs. Relevant industry certifications in a variety of Data Architecture services (SnowPro Advanced Architect, Azure Solutions Architect Expert, AWS Solutions Architect/Database, Databricks Data Engineer/Spark/Platform, etc.). Experience with MLOps and/or developing and maintaining machine learning models and infrastructure. Experience with data visualization tools and analytics technologies (Looker, Tableau, etc.). Degree in Computer Science, Engineering, or a related field.

Posted 1 month ago

Apply

11.0 - 14.0 years

13 - 16 Lacs

Hyderabad

Work from Office

We are looking for a skilled Senior Azure/Fabric Data Engineer with 11-14 years of experience to join our team at Apps Associates (I) Pvt. Ltd, an IT Services & Consulting company. Roles and Responsibility Design and implement scalable data pipelines using Azure and Fabric. Develop and maintain large-scale data architectures to support business intelligence and analytics. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure data quality and integrity by implementing robust testing and validation procedures. Optimize data storage and retrieval processes for improved performance and efficiency. Provide technical guidance and mentorship to junior team members. Job Requirements Strong understanding of data engineering principles and practices. Experience with Azure and Fabric technologies is highly desirable. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a fast-paced environment. Strong communication and interpersonal skills. Experience with agile development methodologies is preferred.

Posted 1 month ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities: Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources. Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis. Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing. Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting. Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage. Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability. Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats. Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities. Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team. Strong programming skills in Java, Python, or Scala. Strong understanding of data modelling, data warehousing, and ETL processes. Minimum 4 to maximum 6 years of relevant experience. Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
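As one concrete instance of the real-time stack named above (Spark plus Kafka), here is a hedged Spark Structured Streaming sketch that reads a Kafka topic and aggregates events in one-minute windows. Broker, topic, JSON fields, and checkpoint path are illustrative.

```python
# Hedged sketch: Spark Structured Streaming over Kafka.
# Broker, topic, field names, and checkpoint path are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "clickstream")
          .load()
          .selectExpr("CAST(value AS STRING) AS json"))

# Parse one field from the JSON payload and stamp an event time
parsed = events.select(
    F.get_json_object("json", "$.user_id").alias("user_id"),
    F.current_timestamp().alias("ts"),
)

# Count events per one-minute window
counts = parsed.groupBy(F.window("ts", "1 minute")).count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .option("checkpointLocation", "/tmp/chk/clickstream")
         .start())
query.awaitTermination()
```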

Posted 1 month ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Microland Limited is looking for an Associate Technical Architect - Data Center to join our dynamic team and embark on a rewarding career journey. Providing technical leadership and guidance to software development teams. Designing and developing software solutions that meet business requirements and align with enterprise architecture standards. Collaborating with project managers, product owners, and other stakeholders to understand requirements and ensure software solutions meet customer needs. Conducting technology research and evaluation to identify new technologies and solutions that can improve software development processes. Developing and maintaining software architecture and design documentation. Providing technical mentorship to junior software developers. Ensuring that software solutions are developed with high levels of quality, performance, and security. Participating in code reviews to ensure code quality and adherence to best practices. Communicating project status and progress to stakeholders, including project managers and customers. Excellent communication and interpersonal skills. Strong understanding of software architecture and design patterns.

Posted 1 month ago

Apply

10.0 - 15.0 years

22 - 27 Lacs

Pune

Work from Office

Job Title: Advisor, Software Architecture. Job Posting Title: Principal, Software Development Engineering. What does a successful Enterprise Solution Architect do at Fiserv? Fiserv is looking for a Lead Enterprise IT Architect with in-depth and current technical architecture and development experience in full-stack Java, Cloud, Web and mobile technologies, as well as exposure to Domain-Driven Design, Event-Based Architecture, API and microservices architecture. You must be passionate about new technologies and demonstrate working knowledge in various technologies across Cloud, UI, Application, Security and Data architecture domains. You will join Fiserv's Global Finance Service Architecture team to lead and build new cloud-native applications. The ideal candidate will be required to work in a cross-functional environment in defining and building end-to-end solutions. Relevant digital solutions experience in the payment solutions domain will be highly regarded. An Enterprise Architect/Solution Architect within the Global Issuer organization is laser-focused on go-to-market solution strategy, spanning conceptual views through building complete and complex solutions, RFP response activities, and the development of new solutions/integrations that position Fiserv for large-scale processing environments including cloud implementation and System Integration pursuits. You will be operating at a strategic level, identifying technology solutions that meet business requirements, defining and describing those solutions and solution requirements, and providing specifications for product management as well as IT delivery. Put simply, this role is a great fit if you enjoy figuring out the best possible way of bringing together business need and technological solutions. What you will do: Provide thought leadership in the sales and hand-off to delivery of complex solutions encompassing multiple products and services, involving a clear strategy for product integration. Influence product development senior management on enterprise-level innovation roadmap strategy. Assist Product Leaders with business guidance, consultative direction, and knowledge development. Provide solution leadership on complex RFPs requiring collaboration and input from multiple Fiserv divisions. Develop design specifications, infrastructure diagrams and other system-related information. Maintain and/or obtain a detailed level of knowledge on company solutions, products and services. Reduce time to revenue by managing pre-to-post sales handoff to implementations. Implement solutions focusing on reuse and industry standards at a program, enterprise or operational scope. Engage extensively with development teams, related enterprise/software architects, business analysts, etc. Apply extensive analytical skills to address the needs of corporate strategy, understand technology specifics, understand how different parts of the business operation are connected and how business processes achieve goals. What you will need to have: 10+ years of experience in large-scale IT system development, design and implementation, involving demonstrated project management, resource management, business analysis and leadership skills. Familiarity with functions of hardware, software, and network systems. 5+ years of experience in technical support, implementation, and/or product development with strong consultative and strategic sales support skill sets. Strong understanding of modern data, software and cloud practices. Knowledge of mainframe operations is preferred.
Exceptional communication and presentation skills, emotional intelligence, with the ability to listen, advise, empathize and explain to varied audiences, at all levels. Exceptional analytical skills and the ability to see the connections between layers of business operations. Bachelor's degree. What would be great to have: Some working experience with other payment solutions, including mainframe environments, would be an advantage. Thank you for considering employment with Fiserv. Please apply using your legal name and complete the step-by-step profile, attaching your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 month ago

Apply

6.0 - 11.0 years

4 - 5 Lacs

Bengaluru

Work from Office

Req ID: 329629 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a C#.NET GCP Cloud developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Senior .NET/GCP Engineer - Remote How You'll Help Us: A Senior Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients. How We Will Help You: Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work. Once You Are Here, You Will: The Senior Applications Developer provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. Additionally, you will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge/support for applications development, integration, and maintenance, as well as providing input to department and project teams on decisions supporting projects. Basic Qualifications: 6+ years developing in .NET/.NET Core 3+ years of experience with Object-Oriented Programming and SOLID principles 3+ years of REST API development 2+ years of hands-on experience in GCP (e.g., Pub/Sub, Cloud Functions, etc.) 2+ years of experience working with databases and writing stored procedures 2+ years of unit and service testing with frameworks such as xUnit, NUnit, etc. 1+ years of cloud platform experience in AWS, Azure, or GCP Preferred: Experience with CI/CD tooling (e.g., Jenkins, Azure DevOps, etc.) Experience with containerization technologies (e.g., Docker, Kubernetes) Ideal Mindset: Lifelong Learner: You are always seeking to improve your technical and non-technical skills. Team Player: You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need. Communicator: You know how to communicate your design ideas to both technical and non-technical stakeholders, prioritizing critical information and leaving out extraneous details. Note: working hours for this role are 12 noon to 10 PM IST.

Posted 1 month ago

Apply

3.0 - 8.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Are you passionate about data and code? Does the prospect of dealing with mission-critical data excite you? Do you want to build data engineering solutions that process a broad range of business and customer data? Do you want to continuously improve the systems that enable annual worldwide revenue of hundreds of billions of dollars? If so, then the eCommerce Services (eCS) team is for you! In eCommerce Services (eCS), we build systems that span the full range of eCommerce functionality, from Privacy, Identity, Purchase Experience and Ordering to Shipping, Tax and Financial integration. eCommerce Services manages several aspects of the customer life cycle, starting from account creation and sign-in, to placing items in the shopping cart, proceeding through checkout, order processing, managing order history and post-fulfillment actions such as refunds and tax invoices. eCS services determine sales tax and shipping charges, and we ensure the privacy of our customers. Our mission is to provide a commerce foundation that accelerates business innovation and delivers a secure, available, performant, and reliable shopping experience to Amazon's customers. The goal of the eCS Data Engineering and Analytics team is to provide high-quality, on-time reports to Amazon business teams, enabling them to expand globally at scale. Our team has a direct impact on retail CX, a key component of the Amazon flywheel. As a Data Engineer, you will own the architecture of DW solutions for the Enterprise using multiple platforms. You will have the opportunity to lead the design, creation, and management of extremely large datasets, working backwards from business use cases. You will use your strong business and communication skills to work with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics. You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support the rapidly growing and dynamic business demand for data, and for using it to deliver data as a service, which will have an immediate influence on day-to-day decision making. Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (DataNet, Cradle, QuickSight, etc.). Improve existing solutions and come up with a next-generation Data Architecture to improve scale, quality, timeliness, coverage, monitoring and security. Develop new data models and end-to-end data pipelines. Create and implement a Data Governance strategy for mitigating privacy and security risks. 3+ years of data engineering experience. Experience with data modeling, warehousing and building ETL pipelines. Experience with SQL. Bachelor's degree. Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases).
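For flavor on the AWS-side plumbing this role involves, below is a minimal boto3 sketch that starts an AWS Glue ETL job and polls it until it reaches a terminal state. The job name and region are hypothetical; start_job_run and get_job_run are standard Glue client methods.

```python
# Minimal boto3 sketch: kick off a Glue ETL job and wait for completion.
# "orders-nightly-etl" and the region are placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="orders-nightly-etl")   # hypothetical job
run_id = run["JobRunId"]

while True:
    status = glue.get_job_run(JobName="orders-nightly-etl", RunId=run_id)
    state = status["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        print("final state:", state)
        break
    time.sleep(30)   # poll until the run reaches a terminal state
```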

Posted 1 month ago

Apply

12.0 - 14.0 years

25 - 30 Lacs

Chennai

Work from Office

The Solution Architect / Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations. Key Responsibilities: Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms. Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance. Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations. Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization. Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms). SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability. Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations). Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights. Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth. Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs). Documentation: Create and maintain comprehensive technical documentation including architecture diagrams, ETL process flows, and data dictionaries. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions. Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management). Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics. Extensive experience in ETL development, data integration, and data transformation processes. Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting). Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud). Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17). Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance. Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
Preferred Qualifications: Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift). Knowledge of machine learning workflows, leveraging Databricks for model training and deployment. Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations. Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies. Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus. Key Competencies: Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering. Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability. Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders. Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Posted 1 month ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Overview Azure Data Architect Bangalore Aptean is changing. Our ERP solutions are transforming a huge range of global businesses, from food producers to manufacturers. In a world of generic enterprise software, we provide targeted solutions that bring together the very best technology and drive greater results. With over 4500 employees, 90 different products and a global client base, there's no better time to advance your career at Aptean. Are you ready for what's next, now? We are! If being part of a dynamic, high-growth organization excites you and you are a Senior Data Architect eager to learn and grow, then this opportunity is for you! Our fast-paced environment and dynamic global R&D department is eager for a mover and shaker to step into this role and become an integral part of our team. Job Summary: We are looking for a seasoned Data Architect with deep expertise in Spark to lead the design and implementation of modern data processing solutions. The ideal candidate will have extensive experience in distributed data processing, large-scale data pipelines, and cloud-native data platforms. This is a strategic role focused on building scalable, fault-tolerant, and high-performance data systems. Key Responsibilities: Architect, design, and implement large-scale data pipelines using Spark (batch and streaming). Optimize Spark jobs for performance, cost-efficiency, and scalability. Define and implement enterprise data architecture standards and best practices. Guide the transition from traditional ETL platforms to Spark-based solutions. Lead the integration of Spark-based pipelines into cloud platforms (Azure Fabric/Spark pools). Establish and enforce data architecture standards, including governance, lineage, and quality. Mentor data engineering teams on best practices with Spark (e.g., partitioning, caching, join strategies). Implement and manage CI/CD pipelines for Spark workloads using tools like Git or DevOps. Ensure robust monitoring, alerting, and logging for Spark applications. Required Skills & Qualifications: 10+ years of experience in data engineering, with 7+ years of hands-on experience with Apache Spark (PySpark/Scala). Proficiency in Spark optimization techniques, monitoring, caching, advanced SQL, and distributed data design. Experience with Spark on Databricks and Azure Fabric. Solid understanding of Delta Lake, Spark Structured Streaming, and data pipelines. Strong experience in cloud platforms (Azure). Proven ability to handle large-scale datasets (terabytes to petabytes). Familiarity with data lakehouse architectures, schema evolution, and data governance. The candidate should also be experienced in Power BI, with at least 3 years of experience. Preferred Qualifications: Experience implementing real-time analytics using Spark Streaming or Structured Streaming. Certifications in Databricks, Fabric or Spark would be a plus. If you share our mindset, you can share in our success. To find out more about joining Aptean, get in touch today. Learn from our differences. Celebrate our diversity. Grow and succeed together. Aptean pledges to promote a company culture where diversity, equity and inclusion are central. We are committed to applying this principle as we interact with our customers, build our teams, cultivate our leaders and shape a company in which any employee can succeed, regardless of race, color, sex, national origin, sexuality and gender identity, religion, disability, age, status as a protected veteran or any other group status protected by law.
Celebrating our diverse experiences, opinions and beliefs allows us to embrace what makes us unique and to use this as an asset in bringing innovative solutions to our customer base. At Aptean, our global and diverse employee base is our greatest asset. It is through embracing and understanding our differences that we are able to harness our individual power to maximize the success of our customers, our employees and our company. - TVN Reddy
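Returning to the Spark optimization topics this posting names (partitioning, caching, join strategies), here is a small PySpark sketch showing a broadcast-hash join hint and caching of a reused intermediate result. The data paths and column names are assumptions.

```python
# Sketch of two Spark tuning techniques named in the posting above.
# Paths and the join key "dim_id" are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("join-tuning").getOrCreate()

facts = spark.read.parquet("/data/facts/")   # large fact table
dims = spark.read.parquet("/data/dims/")     # small dimension table

# Broadcast hint: ships the small table to every executor, turning a
# shuffle-heavy sort-merge join into a map-side broadcast-hash join.
joined = facts.join(F.broadcast(dims), on="dim_id", how="left")

# Cache when the same intermediate result feeds several downstream jobs.
joined.cache()
print(joined.count())                            # materializes the cache
print(joined.groupBy("dim_id").count().count())  # reuses cached data
```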

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office

Payments Operations and the Payments industry are undergoing significant change and disruption across industry, technology, and organizational dimensions. It is critical to develop and execute a strategy that will enhance the organization's business and operational model and position it for continued success. As a Data Domain Architect within the Strategy, Innovation & Governance Data team, you will be instrumental in developing the data architecture and strategy for Payments Operations. You will use your technical expertise to gather and prepare data from diverse platforms, work alongside data analytics personnel to enhance solutions, and collaborate closely with Technology, Product, and Corporate and Investment Banking (CIB) data partners to execute use cases. This role offers the chance to drive innovation and insights throughout the organization, playing a key role in shaping the future-state data architecture and roadmap.
Job responsibilities
Understand the data landscape across the Payments organization and work to leverage all available data resources.
Use technical skills to source and prepare data from a variety of sources, including traditional databases, NoSQL, Hadoop, and the cloud.
Work closely with data analytics staff within the team to understand requirements, partner to optimize solutions, and develop and foster new ideas.
Work with the Data Domain Architect lead on all facets of data domain architecture, including resource management.
Work with Technology, Product, and CIB data partners to research and implement use cases.
Evaluate the current data architecture and shape the future-state data architecture and roadmap to serve Payments Operations' data needs.
Required qualifications, capabilities and skills
Minimum of 8 years of relevant work experience as a software developer, data/ML engineer, data scientist, or business intelligence engineer.
Bachelor's degree in Computer Science, Financial Engineering, MIS, Mathematics, Statistics, or another quantitative subject.
Analytical thinking and problem-solving skills, coupled with the ability to understand business requirements and communicate complex information effectively to broad audiences.
Ability to collaborate across teams and at varying levels using a consultative approach.
General understanding of Agile methodology.
Cloud platform knowledge; hands-on experience with Databricks or Snowflake.
Traditional database skills (Oracle, SQL Server) and strong SQL overall.
Experience with Python/PySpark.
Understanding of ETL frameworks and tools, including Alteryx.
Fundamental understanding of data architecture; ability to profile, clean, and extract data from a variety of sources (illustrated in the sketch below).
Analytics and insights development experience: telling stories with data using Tableau/Alteryx.
Exposure to data science, AI/ML, and model development.
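The profile-clean-extract work this posting describes can be made concrete with a minimal PySpark sketch. Everything here is illustrative: the JDBC connection string, table name, and column names (txn_id, txn_amount) are assumptions for the example, not details from the listing.

```python
# Minimal sketch: extract from a traditional database, profile, then clean.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("payments-profiling").getOrCreate()

# Extract: read a SQL Server table over JDBC (connection details assumed).
raw = (spark.read.format("jdbc")
       .option("url", "jdbc:sqlserver://dbhost:1433;databaseName=payments")
       .option("dbtable", "dbo.payments_raw")
       .option("user", "svc_reader")
       .option("password", "***")
       .load())

# Profile: count nulls per column as a quick data-quality snapshot.
profile = raw.select([
    F.count(F.when(F.col(c).isNull(), c)).alias(f"{c}_nulls") for c in raw.columns
])
profile.show()

# Clean: drop rows missing key fields and normalize the amount column.
clean = (raw.dropna(subset=["txn_id", "txn_amount"])
            .withColumn("txn_amount", F.col("txn_amount").cast("decimal(18,2)")))
clean.write.mode("overwrite").parquet("/data/payments/clean/")  # target path assumed
```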

Posted 1 month ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Hyderabad

Work from Office

Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).
Key responsibilities:
Understand the program's service catalog and document the list of tasks that must be performed for each.
Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
Implement best practices for data loading, ensuring optimal performance and data quality.
Use your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
Collaborate with data architects to design and implement scalable, efficient data architectures that support business intelligence and analytics requirements.
Work on data modeling and schema design to optimize database structures for ETL processes.
Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
Troubleshoot and resolve issues related to data integration and performance bottlenecks.
Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
Provide guidance and mentorship to junior members of the data engineering team.
Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to the data architecture or ETL workflows.
Use Jira for task tracking and project management.
Implement data quality checks and validation processes to ensure data integrity and reliability (see the sketch below).
Required skills:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
Expertise in IDMC principles, including data governance, data quality, and metadata management.
Solid understanding of data warehousing concepts and practices.
Strong SQL skills and experience working with relational databases.
Excellent problem-solving and analytical skills.
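For a rough sense of the data quality checks this role calls for: IICS/IDMC configures such rules declaratively in its own tooling, so the pandas sketch below only mirrors the underlying logic. The file path and column names (customer_id, email, amount) are invented for the example.

```python
# Hedged sketch of row-level data quality validation on a staged ETL extract.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a pass/fail summary for a handful of common DQ rules."""
    return {
        "no_null_keys": df["customer_id"].notna().all(),
        "unique_keys": df["customer_id"].is_unique,
        "valid_emails": df["email"].str.contains("@", na=False).all(),
        "non_negative_amounts": (df["amount"] >= 0).all(),
    }

df = pd.read_csv("staging_extract.csv")  # staged ETL output (path assumed)
results = run_quality_checks(df)
failed = [rule for rule, ok in results.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```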

Posted 1 month ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mohali

Work from Office

We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.
Roles and Responsibilities:
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop and maintain complex data analysis systems and reports.
Provide expert-level support for data analysis and reporting needs.
Identify trends and patterns in large datasets to inform business decisions (a short sketch of this follows below).
Develop and implement process improvements to increase efficiency and productivity.
Communicate findings and insights to stakeholders through clear and concise reports.
Job Requirements:
Strong understanding of data analysis principles and techniques.
Proficiency in data visualization tools and software.
Excellent communication and interpersonal skills.
Ability to work in a fast-paced environment with multiple priorities.
Strong problem-solving skills and attention to detail.
Experience working with large datasets and developing complex reports.
Title: Analyst, ref: 78642.
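One way the trend-spotting responsibility above often plays out in practice, sketched minimally in pandas; the input file and column names (orders.csv, order_date, order_id) are assumptions, and the 30% threshold is an arbitrary placeholder.

```python
# Resample daily order counts and flag days that deviate sharply
# from a rolling four-week baseline.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
daily = orders.set_index("order_date").resample("D")["order_id"].count()

baseline = daily.rolling(window=28, min_periods=14).mean()
deviation = (daily - baseline) / baseline

# Flag days more than 30% above or below the baseline.
anomalies = deviation[deviation.abs() > 0.30]
print(anomalies.tail(10))
```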

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of this role is to provide significant technical expertise in the architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.
Do:
Provide adequate support in architecture planning, migration, and installation for new projects in your own tower (platform/database/middleware/backup).
Lead the structural/architectural design of a platform, middleware, database, backup, etc. according to various system requirements to ensure a highly scalable and extensible solution.
Conduct technology capacity planning by reviewing current and future requirements.
Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable.
Strategize and implement disaster recovery plans, and create and implement backup and recovery plans (a hedged sketch of one such task follows below).
Manage the day-to-day operations of the tower by troubleshooting issues, conducting root cause analysis (RCA), and developing fixes to avoid similar issues.
Plan for and manage upgrades, migration, maintenance, backup, installation, and configuration functions for your own tower.
Review the technical performance of your own tower and deploy ways to improve efficiency, fine-tune performance, and reduce performance challenges.
Develop a shift roster for the team to ensure no disruption in the tower.
Create and update SOPs, data responsibility matrices, operations manuals, daily test plans, data architecture guidance, etc.
Provide weekly status reports on database activities to the client leadership team and internal stakeholders, covering progress, updates, status, and next steps.
Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness.
Team Management:
Resourcing: Forecast talent requirements as per current and future business needs. Hire adequate and right resources for the team. Train direct reportees to make right recruitment and selection decisions.
Talent Management: Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness. Build an internal talent pool of HiPos (high-potential employees) and ensure their career progression within the organization. Promote diversity in leadership positions.
Performance Management: Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback. Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities such programs present to them and the levels below them.
Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team. Track team satisfaction scores and identify initiatives to build engagement within the team. Proactively challenge the team with larger and enriching projects/initiatives for the organization or team. Exercise employee recognition and appreciation.
Deliver:
No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence; knowledge management; CSAT/customer experience; identification of risk issues and mitigation plans
2 | New projects | Timely delivery; no unauthorised changes; no formal escalations
Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform.
Experience: 5-8 years.
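One concrete slice of the backup-and-recovery responsibility above, sketched against GCP (the listing's mandatory skill): replicating database export objects to a disaster-recovery bucket in another region. The bucket names are assumptions, and a production DR plan would add scheduling, retention policies, and integrity checks on top of this.

```python
# Hedged sketch: server-side replication of export objects to a DR bucket
# using the google-cloud-storage client library.
from google.cloud import storage

client = storage.Client()
src = client.bucket("prod-db-exports")        # assumed source bucket
dst = client.bucket("dr-db-exports-region2")  # assumed DR bucket

for blob in client.list_blobs(src):
    src.copy_blob(blob, dst, blob.name)       # server-side copy, no local download
    print(f"replicated {blob.name}")
```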

Posted 1 month ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Bengaluru

Work from Office

Job Title: Analyst Intern - Storefront Team
Location: Bangalore, India
Duration: 3-6 months
Team: Storefront (Product & Experience)
About Udaan 2.0: Udaan is Myntra's initiative specifically designed to offer a career launchpad to people with disabilities. It is a six-month paid internship that ensures a conducive environment facilitating a smooth transition to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning, and best-in-class benefits, we aim to provide a supportive environment so that you can thrive and build your career with us. As part of our commitment to diversity and inclusion, through this program we strive to create a culture where all can belong and bring their experiences and authentic selves to work every day. During your internship, you will get the opportunity to work with the best talent in the e-commerce industry on projects that match your interests and abilities, and that could lead to full-time employment with Myntra.
About Myntra: Myntra is India's leading fashion and lifestyle e-commerce platform, known for delivering a personalized and engaging shopping experience to millions. Our Storefront team plays a pivotal role in crafting the user journey and ensuring every touchpoint on the app/website drives discovery, engagement, and conversion.
Role Overview: We are looking for a data-driven and curious Analyst Intern to join the Storefront team. You will work closely with product managers, designers, engineers, and marketing teams to analyze platform data, build performance dashboards, derive insights, and contribute to optimization experiments across the customer funnel.
Key Responsibilities:
Analyze customer behavior across key Storefront surfaces such as the homepage, PLP, PDP, and navigation.
Create and maintain dashboards to track KPIs such as click-through rate (CTR), conversion rate, engagement time, and bounce rate.
Partner with product and design teams to measure A/B test performance and interpret results (a small sketch follows below).
Conduct root cause analysis for performance dips or changes in user patterns.
Identify growth opportunities and generate hypotheses for UX, content, or merchandising enhancements.
Prepare weekly reports and business review decks for leadership consumption.
Qualifications:
Pursuing or recently completed a Bachelor's or Master's degree in Engineering, Statistics, Mathematics, Economics, or related fields.
Strong proficiency in SQL and Excel; familiarity with data visualization tools like Tableau/Power BI preferred.
Exposure to Python/R for data analysis is a plus.
Excellent analytical and problem-solving skills with attention to detail.
Ability to work in a fast-paced, collaborative environment.
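A minimal sketch of the A/B measurement work mentioned above: comparing the click-through rates of a control and a test storefront layout with a two-proportion z-test. The counts are made-up placeholders, not Myntra data, and real analyses would also check sample-size and segmentation assumptions.

```python
# Compare CTRs of two variants and test whether the difference is significant.
from statsmodels.stats.proportion import proportions_ztest

clicks = [5200, 5630]           # clicks in control, test (placeholder numbers)
impressions = [100000, 100000]  # impressions in control, test

ctr_control = clicks[0] / impressions[0]
ctr_test = clicks[1] / impressions[1]
stat, p_value = proportions_ztest(count=clicks, nobs=impressions)

print(f"CTR control={ctr_control:.3%}, test={ctr_test:.3%}, p={p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the CTR difference is unlikely to be noise.
```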

Posted 1 month ago

Apply

8.0 - 10.0 years

8 - 10 Lacs

Chennai

Remote

Title: Senior Data Architect
Years of Experience: 10+ years
Location: Onsite (the selected candidate is required to relocate to Kovilpatti/Chennai, Tamil Nadu for the initial three-month project training session).
Job Description: The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured and unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.
Key responsibilities:
Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns.
Design logical and physical data models, semantic layers, and metadata frameworks.
Establish data quality, lineage, governance, and security policies.
Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks (see the sketch below).
Integrate AI and analytics solutions with operational data platforms.
Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake.
Lead architecture reviews, design sessions, and CoE reference architecture development.
Technical Skills:
Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift
Data Modeling: ERwin, dbt, PowerDesigner
Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark
Integration: Azure Data Factory, Kafka, Event Grid, SSIS
Metadata/Lineage: Purview, Collibra, Informatica
BI Platforms: Power BI, Tableau, Looker
Security & Compliance: RBAC, encryption at rest and in transit, NIST/FISMA
Qualification:
Bachelor's or Master's in Computer Science, Information Systems, or Data Engineering.
Microsoft Certified: Azure Data Engineer / Azure Solutions Architect.
Strong experience building cloud-native data architectures.
Demonstrated ability to create data blueprints aligned with business strategy and compliance.
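An illustrative PySpark ELT step of the kind this role would oversee: landing raw files into a Delta Lake table with an audit column. The storage paths are assumptions, and the cluster is assumed to have the Delta Lake package available (as it is on Databricks by default).

```python
# Minimal sketch: raw CSV -> bronze Delta table with an ingestion timestamp.
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("elt-bronze-load").getOrCreate()

raw = (spark.read.option("header", "true")
       .csv("abfss://landing@datalake.dfs.core.windows.net/payments/"))  # assumed path

# Add an audit column so lineage and governance checks can trace each load.
bronze = raw.withColumn("_ingested_at", F.current_timestamp())

(bronze.write.format("delta")
       .mode("append")
       .save("abfss://bronze@datalake.dfs.core.windows.net/payments/"))  # assumed path
```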

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies