
7 Systems Architecture Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

12 - 17 Lacs

Jaipur

Work from Office

Naukri logo

A Technical Solution Architect is responsible for designing and overseeing the implementation of technology solutions that meet the specific needs of a business or project. They collaborate with stakeholders, including clients, development teams, and other architects, to assess requirements, recommend appropriate technologies, and ensure that the architecture is scalable, secure, and cost-effective.

Key responsibilities typically include:
- Analyzing business requirements and translating them into technical specifications.
- Designing system architecture and ensuring it aligns with business goals.
- Evaluating and recommending suitable technologies and tools.
- Overseeing the implementation of solutions and ensuring quality and consistency.
- Addressing any technical issues that arise during development and implementation.
- Providing technical guidance and mentoring to development teams.
- Ensuring the solution is secure, reliable, and scalable.

Skills and Qualifications:
- Technical Expertise: Strong experience in system design, cloud computing, infrastructure, and software architecture.
- Problem Solving: Ability to analyze complex problems and design practical, scalable solutions.
- Communication: Strong verbal and written communication skills to collaborate effectively with business stakeholders and technical teams.
- Leadership: Ability to guide teams and influence decisions without direct authority.
- Programming Languages: Experience working with Python, JavaScript, etc.
- Tools and Technologies: Familiarity with development frameworks such as Node.js, React.js, Angular, Flask, and Django, cloud platforms (AWS, Azure), DevOps, databases, and programming languages (Java, Python, etc.).
- Certifications (Optional but Preferred): AWS Certified Solutions Architect, TOGAF, or similar industry-standard certifications.

Education and Experience:
- 8+ years of experience in software development or systems architecture.
- Proven experience designing and implementing complex technical solutions.
- Experience working with cross-functional teams and stakeholders.

Posted 1 day ago


8.0 - 12.0 years

30 - 35 Lacs

Hyderabad

Work from Office


Overview

As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construct database objects for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data structuring/cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.

Qualifications
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience with integration of multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
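The "source-to-target mappings for ETL and BI developers" deliverable mentioned in this listing is, at its simplest, a table linking each source column to its target column and transformation rule. A minimal Python sketch of the idea follows; all table and column names here are hypothetical illustrations, not part of any listed role:

```python
# Minimal sketch of a source-to-target mapping (STM) handed to ETL developers.
# Every table/column name below is a hypothetical example.
stm = [
    # (source_table, source_column, target_table, target_column, transform)
    ("src_orders", "ord_dt",  "dim_order", "order_date",   "CAST AS DATE"),
    ("src_orders", "cust_id", "dim_order", "customer_key", "lookup dim_customer"),
    ("src_orders", "net_amt", "fct_sales", "net_amount",   "ROUND(net_amt, 2)"),
]

def columns_for(target_table, mapping):
    """Return the target columns an ETL job must populate for one table."""
    return [t_col for _, _, t_tbl, t_col, _ in mapping if t_tbl == target_table]

print(columns_for("dim_order", stm))  # target columns fed into dim_order
```

In practice this mapping usually lives in a modeling tool or spreadsheet rather than code, but the structure is the same: one row per target column, with its source and transform made explicit.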

Posted 2 weeks ago


6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office


Overview

As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build & operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities
- Be a founding member of the data engineering team. Help attract talent to the team by networking with your peers, representing PepsiCo HBS at conferences and other events, and discussing our values and best practices when interviewing candidates.
- Own data pipeline development end-to-end, spanning data modeling, testing, scalability, operability, and ongoing metrics.
- Ensure that we build high-quality software by reviewing peer code check-ins.
- Define best practices for product development, engineering, and coding as part of a world-class engineering team.
- Collaborate in architecture discussions and the architectural decision-making that is part of continually improving and expanding these platforms.
- Lead feature development in collaboration with other engineers: validate requirements/stories, assess current system capabilities, and decompose feature requirements into engineering tasks.
- Focus on delivering high-quality data pipelines and tools through careful analysis of system capabilities and feature requests, peer reviews, test automation, and collaboration with other engineers.
- Develop software in short iterations to quickly add business value.
- Introduce new tools/practices to improve data and code quality; this includes researching/sourcing third-party tools and libraries, as well as developing tools in-house to improve workflow and quality for all data engineers.
- Support data pipelines developed by your team through good exception handling, monitoring, and, when needed, debugging production issues.

Qualifications
- 6-9 years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 4+ years of experience in SQL optimization and performance tuning.
- Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, or Great Expectations.
- Current skills in the following technologies:
  - Python
  - Orchestration platforms: Airflow, Luigi, Databricks, or similar
  - Relational databases: Postgres, MySQL, or equivalents
  - MPP data systems: Snowflake, Redshift, Synapse, or similar
  - Cloud platforms: AWS, Azure, or similar
- Version control (e.g., GitHub) and familiarity with deployment and CI/CD tools.
- Fluency with Agile processes and tools such as Jira or Pivotal Tracker.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
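Several listings on this page cite data profiling and quality tools (Apache Griffin, Deequ, Great Expectations). The common idea behind them is declarative "expectations" evaluated against each data batch. The dependency-free Python sketch below illustrates that pattern only; it is not the API of any of those tools, and the sample rows are invented:

```python
# Dependency-free sketch of the "expectations" style of data-quality check
# popularized by tools like Great Expectations or Deequ. Sample data is made up.
rows = [
    {"order_id": 1, "net_amount": 19.99},
    {"order_id": 2, "net_amount": 5.00},
    {"order_id": 3, "net_amount": 7.25},
]

def expect_not_null(rows, col):
    return all(r.get(col) is not None for r in rows)

def expect_unique(rows, col):
    vals = [r[col] for r in rows]
    return len(vals) == len(set(vals))

def expect_between(rows, col, lo, hi):
    return all(lo <= r[col] <= hi for r in rows)

checks = {
    "order_id not null": expect_not_null(rows, "order_id"),
    "order_id unique":   expect_unique(rows, "order_id"),
    "net_amount >= 0":   expect_between(rows, "net_amount", 0, 10_000),
}
failed = [name for name, ok in checks.items() if not ok]
print(failed)  # an empty list means the batch passes validation
```

Real tools add profiling, persistence of results, and reporting on top of this core pattern, which is why the listings treat them as distinct experience items.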

Posted 2 weeks ago


15.0 - 17.0 years

30 - 35 Lacs

Bengaluru

Work from Office


Required Qualifications:
- Bachelor's or Master's degree in Electrical Engineering, Computer Engineering, or a related field with 15 or more years of relevant experience.
- Experience with use-case analysis and decomposition.
- Experience in Linux, Zephyr, FreeRTOS, or similar operating systems.
- Strong understanding of microprocessor and microcontroller architectures, including CPU cores, DSPs, memory management, and peripheral integration.
- Experience with system-level performance optimization, low-power design, SW/HW co-design, and real-time processing.
- Familiarity with high-speed interconnects, memory architectures, DDR, PCIe, and bus protocols.
- Strong collaboration skills to work across multidisciplinary teams, including silicon, software, hardware, board design, and validation engineers.
- Experience in product development processes.

Preferred Qualifications:
- Experience with ARM Cortex and/or RISC-V architecture.
- Experience with media processing, vision, and imaging applications.
- Experience with system-level simulation tools, hardware/software co-design, and debugging techniques.
- Familiarity with Machine Learning hardware IPs, tools, and architecture.
- Knowledge of functional safety and security standards.
- Familiarity with Wi-Fi integration, networking protocols, and secure wireless communication.

Posted 3 weeks ago


5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office


The Pega Senior Systems Architect (SSA) will be an integral member of our newly formed Pega Delivery Team within the Trips Business Unit (TBU). Reporting to an Engineering Manager, the SSA will play a pivotal, hands-on role in the design, development, and implementation of robust Pega solutions tailored to meet our business objectives. With 3-5 years of experience and the requisite certification, the SSA will take on daily responsibilities to deliver high-quality technical Pega solutions for the TBU. They will work as part of an Engineering Team, adhering to Pega best practices and established ways of working. Candidates with recent experience in Pega 8.x and a thorough understanding of its architecture and design principles will be preferred. Familiarity with the Pega Customer Service Framework will be an added advantage.

The SSA should be adept at navigating complex system requirements and translating them into efficient, scalable, and maintainable Pega applications. Their responsibilities will span the entire software development lifecycle, from initial requirements gathering to deployment and post-implementation support. Collaborating closely with both business stakeholders and the Pega Delivery Team, the SSA will ensure that crafted solutions align with enterprise standards while addressing specific business challenges. A dedication to best practices, continuous improvement, and mentoring junior team members is vital, as the SSA will be regarded as a pillar of expertise and guidance within the team, expected to guide, coach, and mentor its junior members.

In this dynamic role, the Pega SSA will also be tasked with staying up to date on the latest Pega developments, innovations, and industry trends. Their proactive approach will catalyze the team's innovation, ensuring that our Pega solutions not only fulfill current demands but are also geared for future growth and adaptability. Join us in shaping the future of our Pega initiatives and fortifying our leadership in Pega solutions.

Responsibilities
- Maintain advanced knowledge of the Pega architecture and all Pega design and implementation practices.
- Maintain advanced knowledge of the latest features and tools provided by the Pega platform.
- Continuously enhance your skill set and knowledge to leverage the latest trends, techniques, and practices and drive proactive change.
- Provide technical advice and guidance to team members while working with various (senior) stakeholders to solve complex problems and business requirements.
- Advocate for the adoption of a reusable-components strategy for enterprise solutions, seeking to determine and develop repeatable, efficient, and optimal ways of implementing Pega.
- Act as a mentor to the more junior SA team members, influencing them in design and development best practices; mentor and monitor the practices applied and the quality of team deliverables.
- Demonstrate strong troubleshooting skills when faced with complex problems, working systematically with the team to resolve critical incidents in a timely fashion.
- Understand, value, and consistently provide timely, specific, and constructive feedback to team members and stakeholders regarding Pega practices and solutions.
- Enable the growth of Pega Citizen Development by providing internal training, upskilling, and guidance.
- Lead simple designs that leverage out-of-the-box (OOTB) components and low-code options.
- Work with the internal Pega PGC (a COE function) to continually define and refine Pega best practices, and provide leadership to the team in the adoption of prescriptive development processes.
- Consistently design optimal solutions without unnecessary complexity and define a service-oriented architecture.
- Work closely with the Product team and provide feedback regarding product enhancements needed to support the BU's or organization's needs.
- Work closely with the internal IT, Engineering, and Architecture teams to understand hardware requirements, network landscape, server topology, and integration needs.
- Demonstrate leadership by backing decisions with research and influencing stakeholders accordingly.
- Be mindful of when to integrate with other products versus using Pega end-to-end, and present the best architecture to meet desired needs.

Requirements (special knowledge/skills)
- Expert knowledge of Pegasystems, and Pega Customer Service applications in particular.
- A valid Pega Lead System Architect certification.
- Degree in Computer Science, Information Technology, or a related field.
- Expertise in all phases of software development.
- Expertise in application servers, programming languages, relational and non-relational databases, and integration techniques.
- Experience in Agile and Scrum methodologies.
- Extensive experience coaching and mentoring more junior team members.
- A natural way of influencing stakeholders; independent, with a good sense of responsibilities and goals.
- Excellent communication skills; able to influence and develop strong relationships.
- Ability to simplify complex problems.

Posted 1 month ago


11 - 14 years

35 - 40 Lacs

Hyderabad

Work from Office


What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Handle day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.

As a Data Engineering Associate Manager, you will be the key technical expert overseeing PepsiCo's data product build & operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

Responsibilities
- Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential.
- Act as a subject matter expert across different digital projects.
- Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
- Implement best practices around systems integration, security, performance, and data management.
- Empower the business by creating value through the increased adoption of data, data science, and the business intelligence landscape.
- Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners.
- Develop and optimize procedures to productionalize data science models.
- Define and manage SLAs for data products and processes running in production.
- Support large-scale experimentation done by data scientists.
- Prototype new approaches and build solutions at scale.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create and audit reusable packages or libraries.

Qualifications
- B.Tech in Computer Science, Math, Physics, or other technical fields.
- 11+ years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 4+ years of experience with Data Lake infrastructure, Data Warehousing, and Data Analytics tools.
- 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc.
- 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus.
- Experience in Azure Log Analytics.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
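The "automation and monitoring frameworks that capture metrics and operational KPIs" responsibility above can be sketched as a decorator that records the duration, row count, and status of each pipeline step. This is only an illustrative pattern under hypothetical names; a real framework would ship these metrics to a store such as Azure Log Analytics rather than an in-memory list:

```python
import time
from functools import wraps

# Illustrative sketch: capture operational KPIs per pipeline step.
# All names are hypothetical; metrics here go to an in-memory list.
metrics = []

def track(step_name):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
                status = "success"
                rows = len(result) if hasattr(result, "__len__") else None
                return result
            except Exception:
                status, rows = "failed", None
                raise
            finally:
                metrics.append({
                    "step": step_name,
                    "status": status,
                    "rows": rows,
                    "seconds": round(time.perf_counter() - start, 4),
                })
        return wrapper
    return decorator

@track("extract_orders")
def extract_orders():
    # Stand-in for a real source-system read.
    return [{"order_id": i} for i in range(3)]

extract_orders()
print(metrics[0]["step"], metrics[0]["status"], metrics[0]["rows"])
```

Because every step funnels through the same wrapper, quality and performance KPIs stay consistent across pipelines, which is the point of owning such a framework centrally.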

Posted 1 month ago


12 - 18 years

14 - 24 Lacs

Hyderabad

Work from Office


Overview

Deputy Director - Data Engineering

PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation, unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Handle day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.

This is the data engineering lead role for D&Ai data modernization (MDIP). The candidate must be flexible to work an alternative schedule: a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements.

Responsibilities
- Manage a team of data engineers and data analysts by delegating project responsibilities, managing their flow of work, and empowering them to realize their full potential.
- Design, structure, and store data into unified data models and link them together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL.
- Enable and accelerate standards-based development, prioritizing reuse of code and adopting test-driven development, unit testing, and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance, and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications
- 12+ years of overall technology experience, including engineering and data management, with at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or any other popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO or GitHub, and CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or other technical fields.
- Candidates are expected to be in the office at the assigned location at least 3 days a week; the days at work need to be coordinated with the immediate supervisor.

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Foster a team culture of accountability, communication, and self-management.
- Proactively drive impact and engagement while bringing others along.
- Consistently attain/exceed individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
- Domain knowledge in the CPG industry with a supply chain/GTM background is preferred.

Posted 1 month ago

Apply
