
2275 Data Architecture Jobs - Page 2

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Microsoft Power BI, Microsoft Azure Databricks
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, and engaging in discussions to refine and enhance the data architecture. You will analyze data requirements and translate them into effective solutions that align with the organization's overall data strategy. The role requires you to stay current with the latest trends in data engineering and contribute to the continuous improvement of data processes and systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in the design and implementation of data pipelines to support data integration and analytics.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with Microsoft Azure Databricks and Microsoft Power BI.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data warehousing solutions.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
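Listings like the one above center on ETL pipeline work in Databricks. As a rough, hedged illustration of the extract-transform-load shape such a pipeline follows (plain Python standing in for a notebook cell; a real Databricks job would use Spark DataFrames, and the record fields here are invented):

```python
# Minimal extract-transform-load sketch (illustrative only -- not the
# Databricks/Spark API; the CSV fields are hypothetical).
import csv, io

RAW = """order_id,amount,country
1,100.50,in
2,99.90,us
3,250.00,in
"""

def extract(text):
    # Extract: parse raw CSV into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and normalize the country code.
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in rows
    ]

def load(rows):
    # Load: aggregate into a per-country total, standing in for a write
    # to a curated warehouse table.
    totals = {}
    for r in rows:
        totals[r["country"]] = round(totals.get(r["country"], 0) + r["amount"], 2)
    return totals

totals = load(transform(extract(RAW)))
print(totals)  # {'IN': 350.5, 'US': 99.9}
```

The same three stages appear whatever the engine; only the transform step's API (Spark, SQL, dbt) changes.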

Posted 2 days ago


8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Cloud Machine Learning Services
Minimum experience: 15 years
Educational Qualification: 15 years of full-time education

Summary: We are looking for a seasoned Senior Manager CCAI Architect with deep expertise in designing and delivering enterprise-grade Conversational AI solutions using Google Cloud's Contact Center AI (CCAI) suite. This role demands a visionary leader who can bridge the gap between business objectives and AI-driven customer experience innovation. As a senior leader, you will drive architectural decisions, guide cross-functional teams, and define the roadmap for scalable and intelligent virtual agent platforms.

Roles & Responsibilities:
- Own the end-to-end architecture and solution design for large-scale CCAI implementations across industries.
- Define best practices, reusable frameworks, and architectural patterns using Google Dialogflow CX, Agent Assist, Knowledge Bases, and generative AI capabilities.
- Act as a strategic advisor to stakeholders on modernizing and transforming customer experience through Conversational AI.
- Lead technical teams and partner with product, operations, and engineering leaders to deliver high-impact, AI-first customer service platforms.
- Oversee delivery governance, performance optimization, and scalability of deployed CCAI solutions.
- Evaluate and integrate cutting-edge generative AI models (LLMs, PaLM, Gemini) to enhance virtual agent performance and personalization.
- Enable and mentor architects, developers, and consultants on Google Cloud AI/ML tools and CCAI strategies.
- 10+ years of experience in enterprise solution architecture, with 4+ years in Google Cloud AI/ML and Conversational AI platforms.
- Deep expertise in Dialogflow CX, CCAI Insights, Agent Assist, generative AI APIs, and GCP architecture.
- Strong leadership in managing large transformation programs involving AI chatbots, voice bots, and omnichannel virtual agents.
- Proven ability to engage with senior stakeholders, define AI strategies, and align technical delivery with business goals.
- Experience integrating AI solutions with CRMs, contact center platforms (e.g., Genesys, Five9), and backend systems.
- Certifications in Google Cloud (e.g., Professional Cloud Architect, Cloud AI Engineer) are a strong plus.
- Exceptional communication, thought leadership, and stakeholder management skills.

Professional & Technical Skills:
- Must-have: Hands-on CCAI/Dialogflow CX experience and an understanding of generative AI.
- Good-to-have: Cloud data architecture; Cloud ML/PCA/PDE certification.
- Strong understanding of AI/ML algorithms, NLP, and related techniques.
- Experience with chatbots, generative AI models, and prompt engineering.
- Experience with cloud or on-prem application pipelines of production-ready quality.

Additional Information:
- The candidate should have a minimum of 18+ years of experience in Google Cloud Machine Learning Services/Gen AI/Vertex AI/CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- 15 years of full-time education is required.

Posted 2 days ago


3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions, contribute to the overall project strategy, and continuously refine your skills to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and analytics.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data visualization techniques and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 days ago


8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the data warehouse, master data, integration and transaction processing, while maintaining and strengthening the modelling standards and business information.

Do
1. Define and develop a data architecture that aids the organization and clients in new/existing deals
   a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, protect the organization from disruptions, and embrace innovation
   b. Assess the benefits and risks of data using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
   c. Create data strategy and road maps for the Reference Data Architecture as required by clients
   d. Engage all stakeholders to implement data governance models and ensure the implementation is carried out for every change request
   e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
   f. Develop, communicate, support and monitor compliance with data modelling standards
   g. Oversee and monitor all frameworks for managing data across the organization
   h. Provide insights on database storage and platforms for ease of use and minimal manual work
   i. Collaborate with vendors to ensure integrity, objectives and system configuration
   j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
   k. Present the data repository, objects and source systems, along with data scenarios, for front-end and back-end usage
   l. Define high-level data migration plans to transition data from the source to the target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
   m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
   n. Oversee all data standards/references/papers for proper governance
   o. Promote, guard and guide the organization towards common semantics and the proper use of metadata
   p. Collect, aggregate, match, consolidate, quality-assure, persist and distribute such data throughout the organization to ensure common understanding, consistency, accuracy and control
   q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
      i. Develop a direction for managing the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
      ii. Analyse the technology environment, enterprise specifics and client requirements to set a collaboration solution for big/small data
      iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
      iv. Define and understand current issues and problems and identify improvements
      v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
      vi. Understand the root-cause problems in integrating business and product units
      vii. Validate the solution/prototype from technology, cost-structure and customer-differentiation points of view
      viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
      ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build an enterprise technology environment for data architecture management
   a. Develop, maintain and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
   b. Evaluate all implemented systems to determine their cost-effectiveness
   c. Collect structural and non-structural data from different sources and integrate it into one database
   d. Work through every stage of data processing: analysis, physical data model design, solutions and reports
   e. Build the enterprise conceptual and logical data models for analytics, operational and data mart structures in accordance with industry best practices
   f. Implement best security practices across all databases based on accessibility and technology
   g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management and Data Governance (DG)
   h. Demonstrate strong experience in conceptual, logical and physical database architectures, design patterns, best practices and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
   a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
   b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
   c. Develop and establish relevant technical, business-process and overall support metrics (KPI/SLA) to drive results
   d. Monitor system capabilities and performance by performing tests and configurations
   e. Integrate new solutions and troubleshoot previously occurring errors
   f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
   g. Identify technical, process and structural risks and prepare a risk mitigation plan for all projects
   h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to delivery teams
   i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
   j. Help the support and integration teams achieve better efficiency and client experience, including through the use of AI methods
   k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
   l. Ensure architecture principles and standards are consistently applied to all projects
   m. Ensure optimal client engagement
      i. Support the pre-sales team in presenting the solution design and its principles to the client
      ii. Negotiate, manage and coordinate with client teams to ensure all requirements are met
      iii. Demonstrate thought leadership and strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Data Governance.
Experience: 8-10 years.

Posted 2 days ago


13.0 - 15.0 years

14 - 19 Lacs

Hyderabad

Work from Office

As a Senior Principal Engineer - Cloud Data Platform (GCP) at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that the GCP environment is secure and complies with relevant regulations.

Roles & Responsibilities:
- Design, develop and deploy cloud-based data platforms using Google Cloud Platform (GCP)
- Integrate and process large amounts of structured and unstructured data from various sources
- Implement and optimize ETL processes and data pipelines
- Develop and maintain security and access controls
- Collaborate with other teams to ensure the consistency and integrity of data
- Troubleshoot and resolve data platform issues
- Foster a collaborative and supportive work environment, promoting open communication and teamwork
- Demonstrate strong leadership skills, with the ability to inspire and motivate team members to perform at their best
Technical Skills Requirements:
- In-depth knowledge of GCP services and tools such as Google Cloud Storage, Google BigQuery, and Google Cloud Dataflow
- Experience building scalable and reliable data pipelines using GCP services, Apache Beam, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on GCP
- Strong knowledge of programming languages such as Python, Java, and SQL

Qualifications:
- 13-15 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
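The Beam/Dataflow pipelines this listing refers to are built from a map, group-by-key, combine sequence. A minimal stdlib-only sketch of that shape (illustrative, not the `apache_beam` API; the event records are invented):

```python
# Stdlib-only sketch of the map -> group-by-key -> combine pattern that
# Beam/Dataflow pipelines are composed of (illustrative, not the Beam API).
from collections import defaultdict

events = [
    {"user": "a", "bytes": 120},
    {"user": "b", "bytes": 300},
    {"user": "a", "bytes": 80},
]

# "Map" stage: emit (key, value) pairs.
keyed = [(e["user"], e["bytes"]) for e in events]

# "GroupByKey" stage: collect values per key.
groups = defaultdict(list)
for key, value in keyed:
    groups[key].append(value)

# "Combine" stage: reduce each group to a single aggregate.
totals = {key: sum(values) for key, values in groups.items()}
print(totals)  # {'a': 200, 'b': 300}
```

In Beam proper, the three stages would be `beam.Map`, `beam.GroupByKey`, and a combine transform, and the runner (e.g. Cloud Dataflow) distributes each stage across workers.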

Posted 2 days ago


2.0 - 4.0 years

5 - 9 Lacs

Pune

Work from Office

The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Market Data Integration Support - Techno Functional Specialist
Location: Pune/Bengaluru
Experience: 2 to 4 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, Market Data Specialist, Capital Market knowledge

Apex Group Ltd has a requirement for a Market Data Integration Specialist. We are seeking an inquisitive and analytical thinker who will be responsible for ensuring the quality, accuracy, and consistency of pricing and reference data from recommended data providers in the financial domain such as Bloomberg, Refinitiv and Markit. The role is responsible for developing approaches, logic, methodology and business requirements for validating, normalizing, integrating, transforming, and distributing data using data platforms and analytics tools. The candidate will be responsible for maintaining the integrity of organisation-critical data and supporting data-driven decision-making, and will be a data professional with a technical and commercial mindset, as well as an excellent communicator with strong stakeholder management skills.

Work Environment:
- Highly motivated, collaborative, and results-driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Technical/Functional Expertise Required:
- Develop an understanding of reference and master data sets, vendor data (Bloomberg, Refinitiv, Markit) and the underlying data architecture, processes, methodology and systems.
- Strong knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Develop automated frameworks for source-to-target mappings, data load and extraction processes, data pre-processing, transformation, integration from various sources, and data distribution.
- Work with the business to analyse and understand business requirements, and review/produce technical and business specifications with a focus on reference data modelling.
- Integrate business requirements into logical solutions through qualitative and quantitative data analysis and prototyping.
- Strong knowledge of overall pricing and static data concepts such as investment types, pricing types, vendor hierarchy, price methodology, and market value concepts.
- Analyse complex production issues and provide solutions.
- Produce detailed functional and technical specification documents for development and testing.
- Hands-on experience with an ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Understanding of the funds administration industry is necessary.
- Intermediate knowledge of financial instruments, both listed and unlisted or OTC, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Testing and troubleshooting integrations and technical configurations.
- Effectively multi-task, schedule and prioritize deliverables to meet project timelines.
- Ensure operational guidelines are updated and adhere to standards and procedures, and identify plans to mitigate risks wherever there is a control issue.
- Ability to contribute to critical projects for product enhancements and efficiency gains.
- Good understanding of Geneva, Paxus, or any other accounting system.
- Self-starter with quick learning ability, strong verbal and written communication skills, and the ability to present effectively.
- Maintenance and creation of Standard Operating Procedures.
- Proficiency in an accounting system, preferably Advent Geneva or Paxus, would be an added advantage.
- Ability to work under pressure with changing priorities.

Experience and Knowledge:
- 3+ years of related support/technical experience in an accounting platform (Paxus/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong knowledge of Excel and Excel functions for business support.
- Create and maintain business documentation, including user manuals and guides.
- Experience with system upgrades/migrations/integrations.

Other Skills:
- Good team player with the ability to work on a local, regional, and global basis.
- Excellent communication and management skills.
- Good understanding of Financial Services/Capital Markets/Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and, where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
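The "vendor hierarchy" concept in this listing (preferring one provider's price over another's when several quote the same instrument) can be sketched as follows. This is a hedged illustration: the hierarchy order, ISINs, and field names are hypothetical, and real setups configure the hierarchy per asset class:

```python
# Pick one price per instrument following a vendor hierarchy (illustrative;
# the hierarchy order, ISINs, and field names are hypothetical).
VENDOR_HIERARCHY = ["Bloomberg", "Refinitiv", "Markit"]

quotes = [
    {"isin": "XS0000000001", "vendor": "Markit",    "price": 101.2},
    {"isin": "XS0000000001", "vendor": "Refinitiv", "price": 101.4},
    {"isin": "XS0000000002", "vendor": "Markit",    "price": 99.7},
]

def select_prices(quotes):
    # For each instrument, keep the quote from the highest-ranked vendor.
    rank = {v: i for i, v in enumerate(VENDOR_HIERARCHY)}
    best = {}
    for q in quotes:
        cur = best.get(q["isin"])
        if cur is None or rank[q["vendor"]] < rank[cur["vendor"]]:
            best[q["isin"]] = q
    return {isin: q["price"] for isin, q in best.items()}

print(select_prices(quotes))
# {'XS0000000001': 101.4, 'XS0000000002': 99.7}
```

The same rule-based selection is what an ETL validation layer applies before curated prices are distributed downstream.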

Posted 2 days ago


2.0 - 4.0 years

1 - 4 Lacs

Bengaluru

Work from Office

The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Market Data Integration Support - Techno Functional Specialist
Location: Bengaluru
Experience: 2 to 4 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, Market Data Specialist, Capital Market knowledge

Apex Group Ltd has a requirement for a Market Data Integration Specialist. We are seeking an inquisitive and analytical thinker who will be responsible for ensuring the quality, accuracy, and consistency of pricing and reference data from recommended data providers in the financial domain such as Bloomberg, Refinitiv and Markit. The role is responsible for developing approaches, logic, methodology and business requirements for validating, normalizing, integrating, transforming, and distributing data using data platforms and analytics tools. The candidate will be responsible for maintaining the integrity of organisation-critical data and supporting data-driven decision-making, and will be a data professional with a technical and commercial mindset, as well as an excellent communicator with strong stakeholder management skills.

Work Environment:
- Highly motivated, collaborative, and results-driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Technical/Functional Expertise Required:
- Develop an understanding of reference and master data sets, vendor data (Bloomberg, Refinitiv, Markit) and the underlying data architecture, processes, methodology and systems.
- Strong knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Develop automated frameworks for source-to-target mappings, data load and extraction processes, data pre-processing, transformation, integration from various sources, and data distribution.
- Work with the business to analyse and understand business requirements, and review/produce technical and business specifications with a focus on reference data modelling.
- Integrate business requirements into logical solutions through qualitative and quantitative data analysis and prototyping.
- Strong knowledge of overall pricing and static data concepts such as investment types, pricing types, vendor hierarchy, price methodology, and market value concepts.
- Analyse complex production issues and provide solutions.
- Produce detailed functional and technical specification documents for development and testing.
- Hands-on experience with an ETL tool is mandatory.
- Strong command of SQL, VBA, and advanced Excel.
- Understanding of the funds administration industry is necessary.
- Intermediate knowledge of financial instruments, both listed and unlisted or OTC, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Testing and troubleshooting integrations and technical configurations.
- Effectively multi-task, schedule and prioritize deliverables to meet project timelines.
- Ensure operational guidelines are updated and adhere to standards and procedures, and identify plans to mitigate risks wherever there is a control issue.
- Ability to contribute to critical projects for product enhancements and efficiency gains.
- Good understanding of Geneva, Paxus, or any other accounting system.
- Self-starter with quick learning ability, strong verbal and written communication skills, and the ability to present effectively.
- Maintenance and creation of Standard Operating Procedures.
- Proficiency in an accounting system, preferably Advent Geneva or Paxus, would be an added advantage.
- Ability to work under pressure with changing priorities.

Experience and Knowledge:
- 3+ years of related support/technical experience in an accounting platform (Paxus/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong knowledge of Excel and Excel functions for business support.
- Create and maintain business documentation, including user manuals and guides.
- Experience with system upgrades/migrations/integrations.

Other Skills:
- Good team player with the ability to work on a local, regional, and global basis.
- Excellent communication and management skills.
- Good understanding of Financial Services/Capital Markets/Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and, where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.

Posted 2 days ago


7.0 - 10.0 years

10 - 15 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Qualifications: Bachelor's Degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred.

Posted 2 days ago


3.0 - 8.0 years

40 - 45 Lacs

Noida

Work from Office

Skills:
- Data Modelling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques.
- Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
- ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.
- SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
- Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
- Data Integration: Experience with real-time data integration tools such as Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
- Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) for orchestrating and monitoring data pipelines.
- APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.

Responsibilities:
- Design and implement analytical platforms that provide insightful dashboards to customers.
- Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access.
- Oversee data propagation processes from source databases to warehouse-specific databases/tools, ensuring data accuracy, reliability, and timeliness.
- Ensure the architectural design is extensible and scalable to adapt to future needs.

Requirements:
- Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier-1 engineering institute, with relevant work experience at a top technology company.
- 3+ years of backend and infrastructure experience with a strong track record in development, architecture and design.
- Hands-on experience with large-scale databases, high-scale messaging systems and real-time job queues.
- Experience navigating and understanding large-scale systems, complex codebases, and architectural patterns.
- Proven experience in building high-scale data platforms.
- Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
- Experience with data movement, transformation, and integration tools for data propagation across systems.
- Ability to evaluate and implement best practices in data architecture for scalable solutions.

Nice to have:
- Experience with Google Cloud, Django, Postgres, Celery, Redis.
- Some experience with AI infrastructure and operations.
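The star-schema layout this listing asks for pairs a narrow fact table with wide, denormalized dimension tables. A minimal sketch (table and column names invented; SQLite standing in for a real warehouse such as Snowflake or BigQuery):

```python
# Minimal star-schema sketch: one dimension table, one fact table, and the
# join-and-aggregate query a dashboard would run (SQLite as a stand-in for a
# warehouse; table/column names are invented).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT,
    category    TEXT          -- denormalized attribute, star-schema style
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'widget', 'hardware'), (2, 'ebook', 'media');
INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 1, 10.0), (2, 5, 25.0);
""")

# Dashboard-style rollup: revenue per category via the dimension join.
rows = con.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('hardware', 40.0), ('media', 25.0)]
```

Keeping facts narrow and dimensions denormalized is what makes these single-join rollups cheap, which is why the schema suits dashboard workloads.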

Posted 2 days ago


5.0 - 6.0 years

18 - 19 Lacs

Mumbai, Navi Mumbai

Work from Office

Minimum Skill Requirement: This exciting and interesting position will have the below responsibilities: playing a key liaison role during technology implementation, providing direction and requirements clarity, and testing new and enhanced capabilities prior to release into production; requirements gathering, campaign design, customer data mapping, solution design, and development and deployment of campaign solutions; and contributing to developing marketing capabilities at Mindtree. Must have a deep understanding of the different kinds of marketing programs, in a multichannel campaign and marketing world, that clients are leaning towards, and of how clients measure the return on investment. Must demonstrate understanding of the latest trends in the digital marketing space, including integration with the mobile channel, data management in the new space, integration of online and offline channels, and the technologies that enable them. Must have experience with developing landing pages and templates. Must Have: 1. Marketo lead management experience and hands-on data management experience for customer data. 2. Must exhibit a thorough conceptual understanding of the entire marketing promotions process and the capabilities that support it, using the above-mentioned technology stack. 3. Worked on at least 3-4 full campaign management lifecycles with Marketo Campaign and other digital marketing tools. 4. Overall understanding of the marketing cloud data architecture model. 5. Must have worked with HTML, CSS, JavaScript, SQL, Bootstrap, and responsive design. 6. Must have adequate knowledge of integrating Salesforce, websites, MDM, and other third-party tools with Marketo, and must have knowledge of REST and SOAP API integration. 7. Should be able to gather requirements effectively and provide best practices and solutions to the client; can work independently to deliver the end product. 8. Should have working knowledge of tools like Litmus. Good to Have: 1. Develop proposals for technical solutions, including recommendations on selection, architecture, licensing, configuration, sizing, and scalability. 2. Have worked on any other custom or packaged campaign management product. 3. Exposure to advanced digital marketing skills. 4. Administration and installation skills for any one of the marketing technologies.
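For context on the REST integration skills this role asks for, here is a minimal sketch of how a Marketo REST lead lookup is typically addressed: obtain an OAuth2 token from the identity endpoint, then query the lead database. The instance URL and credentials below are placeholders, and no request is actually sent.

```python
# Sketch: building Marketo REST request URLs (no call is made here).
# The instance URL, client id, and secret are placeholder values.
from urllib.parse import urlencode

MUNCHKIN_BASE = "https://123-ABC-456.mktorest.com"  # hypothetical instance

def token_url(client_id: str, client_secret: str) -> str:
    """Identity endpoint used to obtain an OAuth2 access token."""
    params = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return f"{MUNCHKIN_BASE}/identity/oauth/token?{params}"

def leads_by_email_url(email: str) -> str:
    """Lead database endpoint, filtered by email address."""
    params = urlencode({"filterType": "email", "filterValues": email})
    return f"{MUNCHKIN_BASE}/rest/v1/leads.json?{params}"
```

In a real integration the token call would be issued first and the returned bearer token attached to the lead query's Authorization header.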

Posted 2 days ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

kolkata, mumbai, new delhi

Work from Office

Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Qualifications: Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - Preferred

Posted 2 days ago

Apply

5.0 - 7.0 years

5 - 8 Lacs

indore, hyderabad, ahmedabad

Work from Office

Collibra Expert Data Governance (Onsite) Locations: Hyderabad, Indore, Ahmedabad (India) Position Type: Full-time / Onsite Immediate Requirement Role We are seeking a highly skilled Collibra Expert to lead enterprise-level data governance initiatives. The ideal candidate must have strong hands-on Collibra expertise, including configuration, workflow development, integration, and stakeholder engagement, with proven experience in implementing governance frameworks across large organizations. Key Responsibilities Lead end-to-end implementation & administration of the Collibra Data Intelligence Platform. Design & configure Collibra Operating Models (domains, assets, workflows, roles). Develop & maintain custom workflows using BPMN & Collibra Workflow Designer. Integrate Collibra with Snowflake, Informatica, Tableau, Azure, SAP via APIs & connectors. Define & enforce data governance policies with stewards, owners & business teams. Implement & monitor data quality, lineage & metadata management. Act as Collibra SME & evangelist, driving data governance maturity. Provide training & support to technical and business users. Maintain documentation & ensure compliance with governance standards. Required Skills 10+ years in data governance, metadata management, or data quality. 5+ years hands-on Collibra experience (configuration, workflows, integrations). Proficiency with Collibra APIs, BPMN, Groovy, JavaScript. Experience with data cataloging, lineage & business glossary in Collibra. Familiarity with Snowflake, Azure, AWS, Informatica or similar platforms. Strong communication & stakeholder management skills. Preferred Skills Collibra Ranger / Solution Architect certification. Enterprise-level Collibra deployments experience. Knowledge of regulatory compliance (GDPR, HIPAA, CCPA). Background in data architecture / data engineering. Soft Skills Strong leadership & stakeholder collaboration. Excellent problem-solving & analytical mindset. 
Ability to mentor teams & evangelize data governance practices. Immediate Requirement: Candidates must be available for onsite work at Hyderabad, Indore, or Ahmedabad with immediate availability. Resume Submission: Please share resumes with full details, including: Current CTC, Expected CTC, Notice Period / Immediate Availability, Current Location, Preferred Job Location. Send profiles to: navaneetha@suzva.com
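As an illustration of the Collibra API work this posting mentions, the sketch below shapes the minimal JSON body for creating an asset via Collibra's REST Core API (POST /rest/2.0/assets). The asset name and IDs are placeholder values, and nothing is sent to a server here.

```python
# Sketch: minimal payload for Collibra asset creation (placeholder IDs).
import json

def asset_payload(name: str, domain_id: str, type_id: str) -> str:
    """Serialize the minimal body Collibra expects for a new asset."""
    return json.dumps({
        "name": name,
        "domainId": domain_id,
        "typeId": type_id,
    })

body = asset_payload("Customer Email", "placeholder-domain-id", "placeholder-type-id")
```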

Posted 2 days ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

hyderabad, chennai, bengaluru

Work from Office

Roles and Responsibilities Design, develop, and implement data models and architectures to support business intelligence and analytics. Develop and maintain large-scale data systems, ensuring scalability, reliability, and performance. Collaborate with cross-functional teams to identify business requirements and develop solutions. Analyze complex data sets to extract insights and trends, providing actionable recommendations. Ensure data quality, integrity, and security through appropriate validation and testing procedures. Stay up-to-date with industry trends and emerging technologies to continuously improve skills and knowledge. Job Requirements Strong understanding of data analysis principles and techniques, including statistical modeling and machine learning algorithms. Experience working with large datasets, developing and maintaining databases, and implementing data governance policies. Excellent problem-solving skills, with the ability to analyze complex issues and develop creative solutions. Strong communication and collaboration skills, with experience working with diverse stakeholders. Ability to design and implement scalable data architectures, ensuring high performance and reliability. Strong attention to detail, with a focus on delivering high-quality results and meeting deadlines. Location - Bengaluru, Hyderabad, Chennai, Pune
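The data-quality responsibility described above can be illustrated with a small, dependency-free sketch of rule-based validation checks; the field names and records are invented.

```python
# Minimal rule-based data-quality checks: each rule returns the
# offending rows so issues can be triaged or routed to a rejects table.
def check_not_null(rows, field):
    """Rows where the field is missing or empty."""
    return [r for r in rows if r.get(field) in (None, "")]

def check_unique(rows, field):
    """Rows whose field value duplicates an earlier row's value."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

records = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
    {"id": 2, "email": "c@x.com"},
]
```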

Posted 2 days ago

Apply

3.0 - 6.0 years

6 - 11 Lacs

bengaluru

Work from Office

About the Opportunity Job Type: Permanent. Application Deadline: 30 September 2025 Title : Technical Specialist - Data Modelling Department: Enterprise Data & Analytics Location: Bangalore Reports To: Project Manager Grade/Level : 4 Working Shift : UK Hours (12-21 Hrs IST) We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our Enterprise Data & Analytics team and feel like you're part of something bigger. About your team The Enterprise Data & Analytics team will be responsible for enhancing data governance and maturity by capturing critical data elements and metadata, defining data quality rules, and developing data models. This team's capabilities will encompass Data Management, Data Governance, Data Quality, Data Architecture, and Reporting. Additionally, the team will ensure the adoption of appropriate tools, enforce standards, and deliver foundational capabilities. About your role This role engages across business and technology teams for alignment on data modelling standards and practices. The role is expected to communicate/collaborate with senior-level employees as well as technical product & service owners and be able to work across all levels of the FIL enterprise. This function will be responsible for understanding business, data & technology services with a view to creating a strategic approach to data modelling and its adoption. About you Key Responsibilities: Collaborate with stakeholders and cross-functional teams to understand data requirements and design appropriate data models that align with business needs. Create and maintain data dictionaries and metadata repositories to ensure consistency and integrity of data models. Identify and resolve data model performance issues to optimize database performance and enhance overall system functionality.
Evaluate and recommend data modelling tools and technologies to improve efficiency and accuracy of data modelling processes. To develop, maintain and promote the Enterprise Data Modelling Standards & Procedures. To maintain the Enterprise Data Model Repository, in partnership with the business unit lead data modellers. To develop and maintain the Enterprise Data Modelling Operating Model. Open to work in UK Hours shift (12-21 Hrs IST). Essential Skills Strong demonstrable experience in developing and implementing data models, particularly dimensional data models and third normal form models, with 8-10 years' experience in the same. Project Delivery - the candidate should have experience in end-to-end model design, development, and delivery of appropriate target data models on at least 2-3 different projects. Modelling Tool - Working experience on one of the enterprise modelling tools - ER Studio, Erwin, PowerDesigner, EA Sparx, etc. Good basic understanding of the Asset Management business (2-4 years). Clear understanding of, and the differences between, conceptual, logical and physical models. The desire and ability to lead data model reviews and resolve design issues as required. The desire and ability to develop and maintain Enterprise Data Modelling Standards & Procedures. The desire and ability to work with business unit delivery teams to ensure Enterprise Data Modelling Standards & Procedures are followed. Should be confident and must have excellent communication skills. Excellent opportunity to work on emerging data technologies and learn with a highly energised technical and engineering community, creating an environment of collaborative learning, sharing and positive challenge. Work in a highly meritocratic set-up where excellence is suitably enabled, supported, and rewarded. Ability to work on mission-critical, enterprise-grade financial services systems which support millions of customers.
Exposure to work towards our technical strategy which has taken on the challenge of relentless simplification, cloud onboarding, modern technologies and ways of working/problem solving.
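The dimensional-modelling work this role centres on can be illustrated with a toy star schema: one fact table keyed to dimension tables, queried by joining through a dimension attribute. Table and column names here are invented, with SQLite standing in for an enterprise database.

```python
# Toy star schema: one fact table and two dimensions (names are invented).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_fund     (fund_key INTEGER PRIMARY KEY, fund_name TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, cal_date TEXT);
    CREATE TABLE fact_holding (fund_key INTEGER REFERENCES dim_fund,
                               date_key INTEGER REFERENCES dim_date,
                               market_value REAL);
""")
conn.execute("INSERT INTO dim_fund VALUES (1, 'Global Equity')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
conn.execute("INSERT INTO fact_holding VALUES (1, 20240101, 1250000.0)")

# Typical dimensional query: facts aggregated through a dimension attribute.
row = conn.execute("""
    SELECT f.fund_name, SUM(h.market_value)
    FROM fact_holding h JOIN dim_fund f USING (fund_key)
    GROUP BY f.fund_name
""").fetchone()
```

A physical model adds surrogate keys and slowly-changing-dimension columns on top of this shape; the conceptual and logical models the posting distinguishes omit those details.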

Posted 2 days ago

Apply

2.0 - 6.0 years

10 - 14 Lacs

hyderabad

Work from Office

The Role As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. Responsible for designing and implementing scalable, secure, and efficient microservices-based solutions using services in the Microsoft Azure Platform (Primary Platform). Architectural Design: Engage with business stakeholders to understand the requirements and translate them into a scalable, robust solution design which will encompass data modelling, API design, infrastructure design and app integration. Experience in building multi-tenant applications will be an added advantage. Cloud First Approach: Design and implement cloud solutions using Azure native resources, including PaaS, SaaS, and IaaS. Collaboration: Work closely with cross-functional teams, including developers, product managers, and operations staff, to ensure seamless integration and alignment with business objectives. Security and Compliance: Ensure that the solution design is reviewed and approved by the Enterprise Architecture Team and InfoSec team by meeting the compliance requirement standards. Technical Leadership: Be the technical SME during requirements gathering, issue-triaging meetings, and third-party software evaluations. About You To be considered for this role it is envisaged you will possess the following attributes: Strong hands-on experience in implementing microservice-based solutions. Expertise in Azure services like Azure Kubernetes Service (AKS), Azure Functions, Azure Blob Storage, Azure SQL, Azure APIM, Application Gateways. Good understanding of OWASP vulnerabilities and their remediation. Strong understanding of microservices architecture principles, containerization (e.g., Docker, Kubernetes), and API design. Strong understanding of Data Architecture, Data Modelling, Data Management, and Data Integration patterns and challenges.
Experience with scalable data platforms and solutions integrating and standardising data from different enterprise applications. Additional Advantage: Having one or more certifications as a Solution Architect in any of the leading cloud platforms. Experience with EPC (Engineering, Procurement and Construction) customers. Understanding of the TOGAF ADM cycle.

Posted 2 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

bengaluru

Work from Office

About The Role Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : PySpark Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the design and implementation of data architecture and data models. - Monitor and optimize data pipelines for performance and reliability. Professional & Technical Skills: - Must-Have Skills: Proficiency in PySpark. - Strong understanding of data processing frameworks and ETL tools. - Experience with data warehousing concepts and technologies. - Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. - Knowledge of database management systems and SQL. Additional Information: - The candidate should have minimum 3 years of experience in PySpark. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education
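The ETL pipelines this role centres on follow an extract/transform/load shape. The sketch below shows that shape in dependency-free Python; in PySpark the transform step would be DataFrame operations instead, and the source and target here are stand-ins.

```python
# Minimal extract/transform/load sketch (source and target are stand-ins).
def extract():
    # Stand-in for reading from a source system.
    return [{"id": "1", "amount": "10.5"}, {"id": "2", "amount": "bad"}]

def transform(rows):
    # Cast types and drop rows that fail validation (a basic quality gate).
    out = []
    for r in rows:
        try:
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except ValueError:
            pass  # in a real pipeline, route rejected rows to a quarantine table
    return out

def load(rows, target):
    # Stand-in for writing to a warehouse table; returns rows loaded.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```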

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

hyderabad

Work from Office

About The Role Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Data Engineering Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Be involved in the end-to-end data engineering process. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead data architecture design and implementation. - Optimize data pipelines for performance and scalability. - Implement data security and privacy measures. Professional & Technical Skills: - Must-Have Skills: Proficiency in Data Engineering. - Strong understanding of data modeling and database design. - Experience with cloud-based data platforms like AWS or Azure. - Hands-on experience with ETL tools such as Informatica or Talend. - Knowledge of programming languages like Python or Java. Additional Information: - The candidate should have a minimum of 5 years of experience in Data Engineering. - This position is based at our Hyderabad office. - A 15 years full-time education is required. Qualification 15 years full time education

Posted 2 days ago

Apply

12.0 - 14.0 years

25 - 30 Lacs

chennai

Work from Office

The Solution Architect Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations. Key Responsibilities: Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms. Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance. Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations. Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization. Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms). SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability. Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations). Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights. 
Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth. Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs). Documentation: Create and maintain comprehensive technical documentation including architecture diagrams, ETL process flows, and data dictionaries. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions. Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management). Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics. Extensive experience in ETL development, data integration, and data transformation processes. Strong knowledge of Python, SQL (advanced query writing, optimization, and troubleshooting). Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud). Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17). Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance. Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders. Preferred Qualifications: Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift). Knowledge of machine learning workflows, leveraging Databricks for model training and deployment. Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies. Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus. Key Competencies: Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering. Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability. Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders. Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.
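The performance-optimization duty described above (tuning queries to meet SLAs) can be sketched briefly: index the filter column, then confirm with the query plan that the planner uses it. SQLite stands in for DB2 here, and the table is invented.

```python
# Query-tuning loop in miniature: add an index on the filter column and
# confirm via the query plan that it is used (SQLite stands in for DB2).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, policy_no TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [(i, f"P{i % 100}", i * 1.0) for i in range(1000)])

# Without this index, the filter below would scan the whole table.
conn.execute("CREATE INDEX idx_claims_policy ON claims (policy_no)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE policy_no = ?",
    ("P7",),
).fetchall()
# Each plan row's last column is a human-readable detail string.
uses_index = any("idx_claims_policy" in row[3] for row in plan)
```

The same loop applies on DB2 with `EXPLAIN` and its access-plan tables; the principle (index the predicate column, verify the plan, then measure) is identical.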

Posted 2 days ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

hyderabad

Work from Office

About The Role Project Role : Packaged/SaaS Application Architect Project Role Description : Design scalable, secure, and cost-efficient SaaS solutions to align with enterprise architecture. Apply SaaS principles such as multi-tenancy and modularity, define customization limits, and guide platform use to ensure performance and maintainability. Must have skills : Microsoft Fabric Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Packaged/SaaS Application Architect, you will be responsible for designing scalable, secure, and cost-efficient Software as a Service solutions that align with the overarching enterprise architecture. Your typical day will involve applying SaaS principles such as multi-tenancy and modularity, defining customization limits, and guiding platform use to ensure optimal performance and maintainability. You will collaborate with various teams to ensure that the solutions meet both technical and business requirements, while also addressing any challenges that arise during the development process. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor and evaluate the performance of the SaaS solutions to ensure they meet established standards.
Professional & Technical Skills: - Data Architecture and Modelling: Designing and implementing scalable MDM architectures (Profisee), including data layer design, metadata management, data storage and lifecycle, analytics and ML integrations, data security, and semantic models within MS Fabric. - Lakehouse Architecture: Implementing modern data lakehouse architectures, potentially utilizing Delta Lake for data versioning, ACID transactions, and schema enforcement. - Microsoft Fabric: Deep understanding of Fabric's core components like OneLake, Synapse Data Engineering, Synapse Data Warehousing, semantic models, and Power BI. - Data Governance: Experience implementing data quality frameworks with Purview for data governance, including data cataloguing, data loss prevention, and auditing. - Programming Languages: Strong skills in SQL and Python for data manipulation, analysis, and optimization. - Data Integration: Experience with various data integration patterns and tools (ETL) using Fabric data pipelines / Dataflow Gen2, and monitoring tools for performance. - Documentation of all design standards and processes. - Communication and Collaboration: Excellent communication and collaboration skills to work effectively with stakeholders and developers. Qualification 15 years full time education
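The schema enforcement this posting attributes to Delta Lake (writes whose columns or types don't match the table schema are rejected) can be illustrated without the Delta libraries; the schema and table below are invented, and this plain-Python check only mirrors the idea.

```python
# Plain-Python mirror of Delta-style schema enforcement: non-conforming
# writes are rejected instead of silently corrupting the table.
EXPECTED_SCHEMA = {"customer_id": int, "email": str}

def conforms(row: dict) -> bool:
    """A row conforms if it has exactly the expected columns and types."""
    if set(row) != set(EXPECTED_SCHEMA):
        return False
    return all(isinstance(row[c], t) for c, t in EXPECTED_SCHEMA.items())

table = []

def append(row: dict) -> bool:
    """Append the row only if it matches the declared schema."""
    if not conforms(row):
        return False
    table.append(row)
    return True
```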

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 6 Lacs

bengaluru

Work from Office

The Solution Architect Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations. Key Responsibilities: Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms. Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance. Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations. Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization. Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms). SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability. Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations). Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights. 
Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth. Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs). Documentation: Create and maintain comprehensive technical documentation including architecture diagrams, ETL process flows, and data dictionaries. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions. Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management). Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics. Extensive experience in ETL development, data integration, and data transformation processes. Strong knowledge of Python, SQL (advanced query writing, optimization, and troubleshooting). Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud). Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17). Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance. Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders. Preferred Qualifications: Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift). Knowledge of machine learning workflows, leveraging Databricks for model training and deployment. Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies. Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus. Key Competencies: Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering. Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability. Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders. Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Posted 2 days ago

Apply

5.0 - 10.0 years

18 - 22 Lacs

pune

Work from Office

We are looking for an Enterprise Architect. You'll make a difference by: As a member of our global Enterprise Architecture team at SMO IT, you play an important role in the digital transition, establishing the enterprise architecture function in our company as an enabling partner that helps our business achieve its goals and positions IT as an enabler. As part of a global team, you will be responsible for enterprise architecture management (including business-IT alignment and analysis of the application portfolio) and derive IT strategies from business requirements. Lead IT transformation programs, optimizing the application portfolio to support business objectives and improve operational efficiency. Ensure interoperability of different applications within the enterprise according to modern integration patterns and domain-driven design. Provide architectural guidance and governance for new solutions and services, and ensure their compliance with the existing architectural landscape. Define intentional architecture as a purposeful set of statements, models, and decisions that represent some future architectural state, and derive the roadmap to achieve this state. Shape the target operating model on cloud infrastructures together with the cloud competence center. Apply and contribute to governance frameworks that ensure compliance, security, and alignment with enterprise architecture principles. Engage with key collaborators to establish IT as an enabler and partner for the value proposition, and ensure that enterprise architecture initiatives meet business needs.
Shift from experience-based to fact-based decision-making by means of the enterprise repository. You'll win us over by: Bachelor's/Master's degree or equivalent experience in computer science, or comparable education with corresponding additional skills. Proven experience in enterprise architecture, with a focus on IT transformation programs in an enterprise context. Proven skills in the field of Enterprise Architecture Management; relevant certifications (e.g., TOGAF, PMP) are a plus. Experience with LeanIX for EAM is a plus. Sound knowledge of one or more aspects of Business Architecture, Application Architecture, Data Architecture, and Technology Architecture. Experience in architecture design and modeling. Excellent communication and interpersonal skills, with the ability to actively engage and influence senior leadership and key collaborators. Experience in leading IT projects, including budget management and resource allocation, with the ability to align IT initiatives with business objectives. Ability to manage multiple priorities in a fast-paced, dynamic environment, and adapt to changing business needs. Understanding of the business and technical architecture concepts for sophisticated IT architectures. Knowledge of IT governance, risk management, and emerging technologies; familiarity with application portfolio management tools and techniques is a plus. Experience in one of our technology domains (IoT, AI, cloud native), ideally in combination with domain expertise in logistics, manufacturing, or other subject areas in the railway industry.

Posted 2 days ago

Apply

10.0 - 12.0 years

35 - 40 Lacs

pune, india

Work from Office

USEReady helps businesses to be self-reliant on data. Growing over 3000% since inception, USEReady ranked #113 in the 2015 Inc. 500 and was honoured as a top-100 company in North America in 2015 by Red Herring. USEReady is built on a strong entrepreneurial spirit with unprecedented opportunities for career growth. At USEReady, we believe in achieving career growth while improving our individual competencies. If you desire to be part of a team that believes in mutual success and inspiration, you are welcome to apply. USEReady is a data and analytics firm that provides the strategies, tools, capability, and capacity that businesses need to turn their data into a competitive advantage. USEReady partners with cloud and data ecosystem leaders like Tableau, Salesforce, Snowflake and Amazon Web Services, and has been named Tableau partner of the year multiple times. We have been nominated for and won several awards along this journey. Check us out at www.useready.com The ideal candidate will be a specialist in data entitlement using Apache Ranger within Starburst. The consultant will be responsible for designing the end-to-end solution, ensuring high performance through mechanisms like materialised views, and implementing fine-grained access control to guarantee data security. This is a hands-on architectural role that will directly impact how our organization accesses and consumes data. Key Responsibilities Design and implement a scalable data architecture using Starburst as the central query and virtualisation layer. Take full ownership of data security by designing and implementing fine-grained access control policies in Starburst using Apache Ranger. This includes row-level filtering, column masking, and tag-based policies. Create a unified semantic layer by virtualising data from various Oracle schemas and other potential data sources, providing a single point of access for business users.
- Develop and manage materialised views within Starburst to accelerate query performance and ensure a seamless, interactive experience for Tableau users.
- Leverage your experience with Oracle to effectively connect, query, and model data within the Starburst ecosystem.
- Work directly with business stakeholders and Oracle DBAs to understand requirements, translate them into technical solutions, and ensure successful project delivery.

Note: The role is remote, but it is preferable that the candidate be based in Pune so they can visit the client office if needed.
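To make the two fine-grained access controls named in this posting concrete, here is a minimal Python sketch of what row-level filtering and column masking achieve. This mimics the *effect* of Apache Ranger policies in plain Python; it does not use Ranger's actual API, and the table, column, and policy names are invented for illustration.

```python
# Illustrative sketch: row-level filtering + column masking, the two
# fine-grained access controls a Ranger policy enforces in Starburst.
# Plain Python stand-in; not Ranger's API. All data here is made up.
from typing import Callable

def apply_policy(rows: list[dict],
                 row_filter: Callable[[dict], bool],
                 masked_columns: set[str],
                 mask: str = "****") -> list[dict]:
    """Return only the rows the user may see, with sensitive columns masked."""
    visible = []
    for row in rows:
        if not row_filter(row):          # row-level filter: drop disallowed rows
            continue
        redacted = {col: (mask if col in masked_columns else val)
                    for col, val in row.items()}  # column masking
        visible.append(redacted)
    return visible

accounts = [
    {"id": 1, "region": "EU", "ssn": "123-45-6789", "balance": 100},
    {"id": 2, "region": "US", "ssn": "987-65-4321", "balance": 250},
]

# Hypothetical policy: an EU analyst sees only EU rows, never the raw ssn.
eu_analyst_view = apply_policy(
    accounts,
    row_filter=lambda r: r["region"] == "EU",
    masked_columns={"ssn"},
)
print(eu_analyst_view)
# [{'id': 1, 'region': 'EU', 'ssn': '****', 'balance': 100}]
```

In Starburst with Ranger, the same outcome is expressed declaratively as policies attached to catalogs, schemas, and columns rather than as application code; the sketch only shows the behaviour a candidate would be configuring.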

Posted 2 days ago

Apply

10.0 - 19.0 years

30 - 40 Lacs

hyderabad

Work from Office

Requirements
- 7+ years of experience in an IT infrastructure or cloud engineering role, with at least 5 years focused on AWS architecture.
- AWS Certified Solutions Architect (Associate or Professional) or equivalent experience.
- Deep expertise in core AWS services, including EC2, S3, VPC, IAM, RDS (specifically Aurora/PostgreSQL), and CloudFormation.
- Proven experience designing and implementing secure and scalable cloud environments.
- Hands-on experience with Infrastructure as Code (IaC) using CloudFormation or Terraform.
- Strong understanding of cloud security principles and best practices.

Responsibilities
- Architect Cloud Solutions: Design and implement a secure, highly available, and scalable cloud architecture on AWS that meets the technical and business requirements of the CHIP platform.
- Infrastructure as Code (IaC): Develop and maintain the cloud infrastructure using AWS CloudFormation, enabling automated provisioning, configuration, and management.
- Security and Compliance: Implement and enforce security best practices across the AWS environment, including network design, IAM policies, and data encryption. Ensure the architecture supports GxP and other regulatory compliance requirements.
- Database Management: Oversee the setup, configuration, and maintenance of the AWS Aurora (PostgreSQL) database, ensuring optimal performance, security, and availability.
- Authentication and Authorization: Architect and manage the integration with ForgeRock for Single Sign-On (SSO) and define access control patterns for the platform.
- CI/CD and DevOps: Collaborate with the development team to design and implement a robust CI/CD pipeline using GitHub Actions for automated builds, testing, and deployments.
- Cost Optimization and Governance: Monitor cloud resource usage, optimize costs, and establish governance policies to ensure efficient and effective use of AWS services.
- Technical Leadership: Provide guidance and mentorship to the development team on AWS best practices and act as the primary point of contact for all infrastructure-related matters.
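The Infrastructure-as-Code responsibility described above can be illustrated with a minimal CloudFormation fragment. This is a generic sketch, not taken from the CHIP platform itself; the resource name is a placeholder, and a real template would add IAM, networking, and compliance controls on top of it.

```yaml
# Illustrative CloudFormation template: one encrypted, versioned S3 bucket
# with public access blocked. Resource names are placeholders, not drawn
# from the actual platform described in the posting.
AWSTemplateFormatVersion: "2010-09-09"
Description: Example of provisioning a secure bucket as code.
Resources:
  DataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: aws:kms
      VersioningConfiguration:
        Status: Enabled
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true
```

Keeping such templates in version control and deploying them through a CI/CD pipeline (here, GitHub Actions) is what turns provisioning into the repeatable, auditable process the role calls for.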

Posted 2 days ago

Apply

10.0 - 17.0 years

40 - 50 Lacs

navi mumbai, pune, mumbai (all areas)

Hybrid

Job Title: Enterprise Architect (VP2 and above)
Location: Airoli, Mumbai
Department: Technology
Grade: VP2 and above
Reports To: Chief Architect / CIO

Position Purpose
The Enterprise Architect will be responsible for defining, governing, and driving the enterprise architecture strategy for the bank. This role will ensure that architecture standards, frameworks, and roadmaps are established, communicated, and adhered to, enabling scalable, secure, and future-ready IT systems that support business objectives.

Key Responsibilities
- Define and maintain enterprise architecture standards for the bank.
- Work closely with Solution Architects and Technical Architects to ensure adherence to architecture standards.
- Ensure that architecture artifacts, documentation, and repositories are updated and current.
- Partner with business and technology teams to develop target-state enterprise architectures.
- Create frameworks, roadmaps, and strategy documents across key technology and business domains.
- Provide guidance and oversight to Development and Engineering teams.
- Inform stakeholders about potential risks and issues with current technical solutions.
- Assess and communicate the business impact of architectural and technology choices.
- Research emerging technologies and propose adoption where relevant.
- Lead architecture and design workshops with IT and business leadership.

Qualifications & Experience
Essential:
- Graduate / Postgraduate in Computer Science, Engineering, or a related field.
- Extensive experience in Enterprise Architecture, Solution Architecture, and IT strategy.
- Proven expertise in creating architecture blueprints, frameworks, and standards.
- Strong understanding of application, data, infrastructure, and security architectures.
- Familiarity with TOGAF, Zachman, or other enterprise architecture frameworks.
- Hands-on experience in cloud (AWS, Azure, GCP), integration, and digital transformation initiatives.
Preferred:
- Experience in the Banking / Financial Services domain.
- Enterprise Architecture certifications (TOGAF / ArchiMate / SABSA).
- Strong leadership and stakeholder management skills at the CXO level.

Organisation Network & Stakeholders
Internal: CIO, CTO, Business Heads, Tech Business Partners; Solution Architects, Technical Architects; Head of Development & Engineering; Head of Infrastructure & Platform Owners
External: Consultants, Business Solution Providers, OEMs; IT & Services Organizations; Technology Service Providers (Cloud, Infra, SaaS vendors)

Competencies
Technical Skills: Enterprise Architecture Strategy; IT Architecture Frameworks (TOGAF, Zachman, ArchiMate); Cloud, Infrastructure & Application Architecture; Emerging Technologies & Innovation; Business-IT Alignment
Behavioral Skills: Professionalism – good judgment and accountability in decision-making; Respect – sensitivity and responsibility in communication and actions; Excellence – consistent delivery of high-quality outcomes; Entrepreneurial Mindset – ownership and an enterprising approach; Teamwork – a collaborative approach to achieving shared goals

Key Skills / Keywords
Enterprise Architect | Architecture Strategy | TOGAF | Solution Architecture | Application Architecture | Data Architecture | Cloud Architecture | IT Roadmap | Architecture Governance | Banking Technology | Digital Transformation

Posted 2 days ago

Apply

10.0 - 17.0 years

40 - 50 Lacs

navi mumbai, pune, mumbai (all areas)

Hybrid

Role: Enterprise Data Architect
Team: Enterprise Architecture team
Designation: Vice President
Reporting: Chief Architect
Team: 1-2 Solution Architects

Responsibility
Maintaining and managing the bank's existing data architecture and setting the strategy for the bank's data architecture. The role owns the design and implementation of the enterprise-wide data strategy, ensuring the strategy supports current and future business needs, applying knowledge of good architecture practice, architecture documentation, and data technologies. The architect will comprehensively capture the bank's current-state holistic data architecture and oversee target-state design and implementation, providing technical architecture direction and oversight and hand-holding delivery teams through the delivery of the target-state data architecture. The role involves collaborating with Business Heads, Data teams, IT Heads, IT teams, and internal and external tech stakeholders (OEM and implementation vendors, partners) to ensure the enterprise data strategy and its implementation add value to the business, and driving innovation in the areas of Data, Analytics, and AI/ML.

Main duties
- Document the detailed holistic enterprise data architecture for both current and target state.
- Capture the enterprise data model, working with the Data & Analytics team to align to the conceptual and logical models.
- Set the strategy for data architecture to support the Business and IT strategies and maintain the data architecture principles.
- Create data architecture documents and templates for change initiatives and support solution architects in completing them.
- Lead data architecture governance forums and provide subject matter expert input to architecture decisions.
- Manage the holistic roadmap of architecture change initiatives across the bank, coordinating requirements across different initiatives.
- Provide technical expertise for change initiatives, working with solution architects, technical architects, and business product owners.
- Identify innovation opportunities in the Data, Analytics, and AI/ML area and drive them to implementation.
- Build and maintain appropriate Enterprise Architecture artefacts, including Entity Relationship Models, a data dictionary, and a taxonomy, to aid data traceability.
- Provide technical oversight to Solution Architects in creating business-driven solutions adhering to enterprise architecture and data governance standards.
- Develop key performance measures for data integration and quality.
- Work with the wider Enterprise Architecture community to develop the target architecture.

Qualification
Essential:
- Educated to degree level or equivalent experience.
- 5+ years' experience delivering information management solutions to large numbers of end users.
- 10+ years' experience architecting and implementing enterprise data warehouse, Master Data Management, data integration, BI & analytics, content management, and data management platforms.
- Experience creating and implementing data strategies that align with business objectives.
- Strong knowledge of industry best practices around data architecture in both cloud and on-prem solutions.
- A comprehensive understanding of data warehousing and data transformation (extract, transform, and load) processes and supporting technologies such as AWS Glue, EMR, Azure Data Factory, Data Lake, and other analytics products.
- Excellent knowledge of how Analytics and AI/ML solutions work.
- Excellent problem-solving and data modelling skills (logical, physical, semantic, and integration models), including normalization, OLAP/OLTP principles, and entity relationship analysis.
- Experience mapping key enterprise data entities to business capabilities and applications.
- Strong knowledge of horizontal data lineage from source to output.
- Excellent communication and presentation skills, a confident and methodical approach, the ability to work within a team environment, and the confidence to deliver ideas clearly and concisely to stakeholders at all levels of seniority.
Desirable:
- Experience working in Financial Services.
- Engineering Graduate.
- TOGAF Certified or equivalent.
- Certified Data Management Professional (CDMP) or equivalent data certification.
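The normalization skill this posting asks for can be shown with a toy example: splitting a denormalized feed into separate entities so each fact is stored once and referenced by key. This is a minimal Python sketch with invented field names, not an artifact of the bank's actual data model.

```python
# Toy illustration of normalization: a flat "orders" feed repeats the
# customer name on every row; splitting it into customer and order
# entities stores each customer fact once. All names are invented.
flat_orders = [
    {"order_id": 10, "customer_id": 1, "customer_name": "Acme",   "amount": 500},
    {"order_id": 11, "customer_id": 1, "customer_name": "Acme",   "amount": 150},
    {"order_id": 12, "customer_id": 2, "customer_name": "Globex", "amount": 75},
]

customers = {}   # one row per customer, keyed by customer_id
orders = []      # orders carry only a foreign key back to the customer
for row in flat_orders:
    customers[row["customer_id"]] = {"customer_id": row["customer_id"],
                                     "name": row["customer_name"]}
    orders.append({"order_id": row["order_id"],
                   "customer_id": row["customer_id"],
                   "amount": row["amount"]})

print(len(customers))  # 2 distinct customers
print(len(orders))     # 3 orders, each holding only a foreign key
```

The same reasoning, applied at enterprise scale and captured in Entity Relationship Models and a data dictionary, is what the "logical, physical, semantic and integration models" requirement refers to; OLAP designs then deliberately re-denormalize such models for query speed.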

Posted 2 days ago

Apply
