Jobs
Interviews

42 DWH Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

1 - 4 Lacs

Noida

Hybrid

Position Overview
The role is to help architect, build, and maintain a robust, scalable, and sustainable business intelligence platform. Assisted by the Data Team, this role will work with highly scalable systems, complex data models, and a large amount of transactional data.

Company Overview
BOLD is an established and fast-growing product company that transforms work lives. Since 2005, we've helped more than 10,000,000 folks from all over America (and beyond!) reach higher and do better. A career at BOLD promises great challenges, opportunities, culture, and environment. With our headquarters in Puerto Rico and offices in San Francisco and India, we're a global organization on a path to change the career industry.

Key Responsibilities
- Architect, develop, and maintain a highly scalable data warehouse and build/maintain ETL processes.
- Utilize Python and Airflow to integrate data from across the business into the data warehouse.
- Integrate third-party data, such as Google Analytics, Google Ads, and Iterable, into the data warehouse.

Required Skills
- Experience working as an ETL developer in a Data Engineering, Data Warehousing, or Business Intelligence team.
- Understanding of data integration/data engineering architecture, with awareness of ETL standards, methodologies, guidelines, and techniques.
- Hands-on experience with the Python programming language and its packages, such as Pandas and NumPy.
- Strong understanding of SQL queries, aggregate functions, complex joins, and performance tuning.
- Good exposure to at least one of Snowflake, SQL Server, Oracle, or PostgreSQL.
- Broad understanding of data warehousing and dimensional modelling concepts.
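
The "aggregate functions and complex joins" requirement above can be illustrated with a minimal sketch. The tables and values here are hypothetical, chosen only to show the query shape; the same SQL would run against Snowflake, SQL Server, Oracle, or PostgreSQL with trivial changes.

```python
import sqlite3

# Hypothetical schema and data, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Revenue per region: a join feeding aggregate functions with GROUP BY.
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('North', 2, 200.0), ('South', 1, 50.0)]
```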

Posted 21 hours ago

Apply

5.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in Data Engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include:
- Developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions.
- Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards.
- Working closely with cross-functional teams to integrate data solutions and creating clear documentation for data processes and models.

Your qualifications should include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Proficiency in using Power BI for data visualization and business intelligence reporting.
- Familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages like Java or Scala is a plus).

A Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field is required. This position will be based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.
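
The SCD Type-2 technique named above (which dbt typically implements via snapshots) can be sketched minimally: when a tracked attribute changes, the current dimension row is closed out and a new current row is opened. The table and column names below are hypothetical, for illustration only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER, city TEXT,
        valid_from TEXT, valid_to TEXT, is_current INTEGER
    )
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Chennai', '2024-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, change_date):
    """Close the current row and open a new one when an attribute changes."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date))

apply_scd2(conn, 1, 'Bengaluru', '2024-06-01')
history = conn.execute(
    "SELECT city, valid_to, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(history)  # [('Chennai', '2024-06-01', 0), ('Bengaluru', None, 1)]
```

The full history is preserved: point-in-time queries filter on the validity window, while current-state queries filter on `is_current = 1`.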

Posted 1 day ago

Apply

6.0 - 11.0 years

0 - 0 Lacs

Bangalore, Kolkata, Mumbai City

On-site

Technical Skills:
- Experience in data warehousing and business intelligence, with emphasis on business requirements analysis, application design, development, testing, and support.
- Experience in Cognos Analytics 11 (Data Modules, Framework Manager Packages, Report Studio, Visualization Gallery).
- Basic knowledge of Extract, Transform, Load (ETL) processes.
- Good knowledge of building Cognos packages and reports using Framework Manager and Report Studio.
- Ability to design and develop reports using drill-throughs, lists, crosstabs, prompt pages, and page grouping and sections.

Soft Skills:
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Adaptability to changing business requirements.
- Cognos certification is a plus.

Posted 2 days ago

Apply

4.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Big Data Lead with 7-12 years of experience, you will lead the development of data processing systems and applications, specifically in the area of Data Warehousing (DWH). The role calls for strong software development skills in multiple languages, with a focus on distributed data processing systems and BI/DW programs.

You should have a minimum of 4 years of software development experience and a proven track record in developing and testing applications, preferably on the J2EE stack. A sound understanding of best practices and concepts related to data warehouse applications is crucial. You should also possess a strong foundation in distributed systems and computing, with hands-on experience in Spark and Scala, Kafka, Hadoop, HBase, Pig, and Hive. Experience with NoSQL data stores, data modeling, and data management will be beneficial.

Strong interpersonal skills and excellent oral and written communication are essential. Knowledge of Data Lake implementation as an alternative to data warehousing is desirable. Hands-on experience with Spark SQL and SQL proficiency are mandatory, along with a minimum of two end-to-end implementations of Data Warehousing or Data Lake projects. You will collaborate with cross-functional teams and drive data-related initiatives to meet business objectives effectively.
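
The distributed processing model behind the tools named above follows a map/shuffle/reduce pattern: each partition is processed locally, then partial results are merged. A toy pure-Python sketch (the "partitions" here are just lists, standing in for data an engine like Spark would distribute across a cluster):

```python
from collections import Counter
from functools import reduce

# Two hypothetical partitions of a word stream.
partitions = [
    ["etl", "spark", "hive"],
    ["spark", "kafka", "spark"],
]

# Map stage: count words locally within each partition.
partial_counts = [Counter(p) for p in partitions]

# Reduce stage: merge the per-partition results (the role of the
# shuffle/aggregation step in a distributed engine).
total = reduce(lambda a, b: a + b, partial_counts)
print(total["spark"])  # 3
```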

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled professional, you must have hands-on experience in Databricks, with a strong emphasis on DWH development. Proficiency in PySpark and architecture, coupled with robust SQL and PL/SQL knowledge, will be advantageous.

Your responsibilities in this role include developing the Databricks DWH, demonstrating expertise across the modules of the Databricks suite, and contributing to framework and pipeline design. The ability to create complex queries and packages using SQL and PL/SQL is crucial.

To excel in this position, you should be familiar with Agile and DevOps methodologies, possess excellent attention to detail, and thrive in a collaborative team environment. Meeting deliverables within short sprints and demonstrating strong communication and documentation skills are essential for success.

Key Skills: PL/SQL, framework development, PySpark, architecture, DWH, Agile methodologies, SQL, design, Databricks, DevOps, pipeline design, communication skills, documentation skills.
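
A typical "complex query" building block in a DWH context is greatest-n-per-group, e.g. top earner per department. A minimal sketch with hypothetical data (on Databricks the same result would usually be expressed with a window function over Spark SQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE salaries (emp TEXT, dept TEXT, salary INTEGER);
    INSERT INTO salaries VALUES
        ('asha', 'eng', 90), ('ravi', 'eng', 70), ('meena', 'ops', 60);
""")

# Top earner per department via a correlated subquery.
rows = conn.execute("""
    SELECT s.dept, s.emp, s.salary
    FROM salaries s
    WHERE s.salary = (SELECT MAX(s2.salary)
                      FROM salaries s2 WHERE s2.dept = s.dept)
    ORDER BY s.dept
""").fetchall()
print(rows)  # [('eng', 'asha', 90), ('ops', 'meena', 60)]
```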

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for designing, developing, and maintaining interactive dashboards and reports using Qlik Sense. This includes extracting data, managing Qlik Sense servers, and ensuring data integrity and performance optimization. Your main focus will be on developing innovative, visually appealing Qlik Sense dashboards and reports that provide actionable insights to stakeholders.

To be successful in this role, you should have at least 10 years of experience in Data Warehousing, including 5-6 years implementing visually appealing Qlik Sense dashboards. You must be proficient in data transformation, creation of QVD files, and set analysis. You should also have experience in application design, architecture, development, and deployment using Qlik Sense, with expertise in front-end development and visualization best practices. Strong database design and SQL skills are essential, along with experience in data integration through ETL processes. You will translate complex functional, technical, and business requirements into executable architectural designs; collaborate with data architects and business stakeholders to understand data requirements and provide technical solutions; and lead end-to-end system and architecture design for applications and infrastructure.

Experience working with the various chart types in Qlik Sense (KPI, line, straight table, pivot table, pie, bar, combo, radar, and map) is crucial. Proficiency in set analysis (set expressions), including creating YTD, LYTD, QTD, LQTD, MTD, LMTD, WTD, and LWTD measures, is required. Familiarity with Qlik native functions (string, date, aggregate, row, conditional) and working knowledge of Qlik Sense extensions such as Vizlib and Climber are preferred, as is experience creating master items, variables, and segments.

You should have a strong understanding of optimization techniques for front-end dashboards, as well as knowledge of mashups and their development. Thorough testing and debugging to ensure the accuracy, reliability, and performance of Qlik applications will be part of your responsibilities. If you have the technical skills and experience required for this role and are passionate about creating insightful, visually appealing dashboards and reports using Qlik Sense, we encourage you to apply.
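
The period-to-date measures listed above (YTD, MTD, and their variants) are, underneath the Qlik set-expression syntax, date-window filters over an as-of date. A language-neutral sketch of that logic, with hypothetical sample data:

```python
from datetime import date

def in_ytd(d, asof):
    """Year-to-date: same year, on or before the as-of date."""
    return d.year == asof.year and d <= asof

def in_mtd(d, asof):
    """Month-to-date: same year and month, on or before the as-of date."""
    return in_ytd(d, asof) and d.month == asof.month

asof = date(2024, 6, 15)
sales = [(date(2024, 1, 10), 100), (date(2024, 6, 2), 40), (date(2023, 12, 30), 70)]

ytd = sum(v for d, v in sales if in_ytd(d, asof))
mtd = sum(v for d, v in sales if in_mtd(d, asof))
print(ytd, mtd)  # 140 40
```

The LYTD/LQTD/LWTD variants shift the as-of date back by one year, quarter, or week before applying the same window.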

Posted 5 days ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Exp: 6-12 yrs
Location: PAN India
Skill: Azure Data Factory/SSIS

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in Data Factory:
- Rel Exp in Synapse:
- Rel Exp in SSIS:
- Rel Exp in Python/PySpark:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for virtual interview on weekdays between 10 AM-4 PM (please mention a time):
- Current Res Location:
- Preferred Job Location:
- Whether educational % in 10th std, 12th std, UG is all above 50%:
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Bengaluru

Remote

LEAD ANALYST:
As a Lead Analyst, you will play a strategic role in leading data-driven consulting engagements, designing advanced analytics solutions, and delivering actionable insights to clients. You will collaborate with cross-functional teams, manage BI projects, and enable clients to make data-backed business decisions.

Key Responsibilities:
Client Consulting & Strategy
- Partner with clients to understand business challenges, define business objectives, and develop data-driven strategies.
- Translate business problems into analytics solutions by leveraging BI dashboards, predictive modelling, and AI-driven insights.
- Act as a trusted advisor by delivering compelling presentations and actionable recommendations to senior stakeholders.

Business Intelligence & Data Visualization
- Design, develop, and manage scalable BI dashboards and reporting solutions using tools like Power BI and Tableau.
- Drive data accuracy, consistency, and security in reporting solutions across different client engagements.
- Enable self-service BI for clients by setting up robust data visualization and exploration frameworks.

Advanced Analytics & Insights Generation
- Perform deep-dive analysis on business performance metrics, customer behaviour, and operational trends.
- Define, develop, and track key performance indicators (KPIs) to measure business success and identify improvement opportunities.

Project & Stakeholder Management
- Lead multiple analytics and BI projects, ensuring timely delivery and alignment with client expectations.
- Work cross-functionally with data engineers, business consultants, and technology teams to deliver holistic solutions.
- Communicate findings through executive reports, data stories, and interactive presentations.

Team Leadership & Development
- Build and grow a team of BI developers, data analysts, and business consultants.
- Foster a data-driven culture by providing training and upskilling opportunities for internal teams.
- Contribute to thought leadership by publishing insights, whitepapers, and case studies.

Key Qualifications & Skills:
- Education: Bachelor's or Master's degree in Business Analytics, Data Science, Computer Science, or a related field.
- Experience: 6+ years in business intelligence, analytics, or data consulting roles.
- Technical Expertise: Strong proficiency in SQL, Python, Excel, and other data manipulation techniques. Hands-on experience with BI tools like Power BI/Tableau. Knowledge of data engineering and data modelling concepts, ETL processes, and cloud platforms (Azure/AWS/GCP). Familiarity with predictive modelling and statistical analysis.
- Consulting & Business Acumen: Strong problem-solving skills and the ability to translate data insights into business impact. Experience working in a consulting environment, managing client relationships and expectations. Excellent communication and storytelling skills, leveraging PowerPoint to present complex data insights effectively.
- Project & Stakeholder Management: Ability to manage multiple projects and collaborate across teams in a fast-paced environment. Strong leadership and mentorship capabilities, fostering a culture of learning and innovation.

LEAD BUSINESS ANALYST:
We are seeking a highly experienced and strategic Lead Business Analyst with over 10 years of proven expertise in business analysis, data analytics, and project delivery. The ideal candidate will have deep knowledge of risk, data governance, and KPI frameworks, with a successful track record of driving complex data-driven projects, compliance transformations, and performance automation.

Key Responsibilities:
Business Analysis & Strategy
- Collaborate with stakeholders to gather, define, and analyze business requirements across projects.
- Develop Business Requirement Documents (BRDs) and functional specifications aligned with business goals.

Project Delivery & Data Analytics
- Lead cross-functional teams to deliver data-centric projects such as scorecard creation, dashboards, and EDW redesign.
- Manage the end-to-end project lifecycle, ensuring timely delivery of business insights and performance dashboards.

Process Optimization & Automation
- Drive process enhancements by automating KPIs, daily reports, and workflows.
- Conduct gap analysis, root cause analysis, and impact assessments to improve decision-making accuracy.

Stakeholder & Client Engagement
- Serve as a point of contact for internal and external stakeholders, ensuring business objectives are translated into actionable analytics.
- Deliver high-impact demos and training sessions to clients and internal teams.

Key Requirements:
- 10+ years of experience in business analysis, preferably on EDW projects.
- Hands-on expertise with data analytics, data quality assessment, and KPI frameworks.
- Technical proficiency in SQL Server, Power BI/Tableau, and Jira.
- Strong documentation and stakeholder management.
- Experience with AI/ML product features and data governance practices is a plus.

Key Competencies:
- Strategic Thinking and Problem Solving
- Strong Analytical and Communication Skills
- Agile and Cross-functional Team Leadership
- Data Strategy, Quality, and Visualization
- Critical Thinking and Decision-Making
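
Both roles above center on defining and tracking KPIs. In practice a KPI framework reduces to: a measured value, a target, and a direction (higher or lower is better). A minimal sketch with entirely hypothetical KPI names and figures:

```python
# Hypothetical KPI definitions, purely for illustration.
kpis = {
    "conversion_rate": {"value": 180 / 4000, "target": 0.05, "higher_is_better": True},
    "churn_rate": {"value": 25 / 1000, "target": 0.03, "higher_is_better": False},
}

def evaluate(kpis):
    """Flag each KPI as on-target or off-target against its threshold."""
    report = {}
    for name, k in kpis.items():
        met = (k["value"] >= k["target"]) if k["higher_is_better"] else (k["value"] <= k["target"])
        report[name] = {"value": k["value"], "on_target": met}
    return report

report = evaluate(kpis)
print(report)
```

The off-target KPIs are the ones a dashboard would surface for deep-dive analysis.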

Posted 1 week ago

Apply

5.0 - 8.0 years

12 - 18 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!

Job Description:
Exp: 5-8 yrs
Location: Gurugram/Bangalore
Skill: Azure Data Engineer

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in Databricks:
- Rel Exp in PySpark:
- Rel Exp in DWH:
- Rel Exp in Python:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for virtual interview on weekdays between 10 AM-4 PM (please mention a time):
- Current Res Location:
- Preferred Job Location:
- Whether educational % in 10th std, 12th std, UG is all above 50%:
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 1 week ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- 7+ years of experience in ETL testing, Snowflake, and DWH concepts.
- Strong SQL knowledge and debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain is considered a plus.
- Testing data-readiness (data quality) and addressing code or data issues.
- Demonstrated ability to reason through problems and use judgment and innovation to define clear, concise solutions.
- Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution.
- Prior experience with State Street and Charles River Development (CRD) is considered a plus.
- Experience with tools such as PowerPoint, Excel, and SQL.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
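
The data-readiness testing mentioned above typically automates source-vs-target reconciliation after an ETL run. A minimal sketch (the trade IDs and values are hypothetical; real suites would run the same checks against database extracts):

```python
# Hypothetical source and target extracts keyed by trade id.
source = {"T1": 100.0, "T2": 250.5, "T3": 75.0}
target = {"T1": 100.0, "T2": 250.5, "T4": 60.0}

def reconcile(source, target):
    """Basic checks an ETL tester automates: counts, key diffs, value diffs."""
    return {
        "row_count_match": len(source) == len(target),
        "missing_in_target": sorted(set(source) - set(target)),
        "unexpected_in_target": sorted(set(target) - set(source)),
        "value_mismatches": sorted(
            k for k in source.keys() & target.keys() if source[k] != target[k]
        ),
    }

result = reconcile(source, target)
print(result)
```

Note the counts match here even though the keys differ, which is why key-level diffs matter and row counts alone are not sufficient.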

Posted 2 weeks ago

Apply

10.0 - 18.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Greetings from Cognizant! We have an exciting opportunity for the Azure infrastructure skill set with Cognizant. If you match the below criteria, apply with us immediately!

Skill: Azure Data Factory
Experience: 11 to 18 years
Location: Hyderabad
Notice Period: immediate to 30 days
Interview mode: Virtual

Required Qualifications: Azure Data Engineer profiles, strong in Azure ADF, Snowflake, SQL, and DWH concepts.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

We are looking for energetic, self-motivated, and exceptional data engineers to work on extraordinary enterprise products based on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. You will work with a star team of architects, data scientists/AI specialists, data engineers, and integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack.
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL.
- Experience in writing and interpreting SQL queries, and designing data models and data standards.
- Experience with SQL Server databases, Oracle, and/or cloud databases.
- Experience in data warehousing and data marts, and Star and Snowflake models.
- Experience in loading data into databases from databases and files.
- Experience in analyzing data profiling results and drawing design conclusions from them.
- Understanding of business processes and the relationships between systems and applications.
- Must be comfortable conversing with end-users.
- Must have the ability to manage multiple projects/clients simultaneously.
- Excellent analytical, verbal, and communication skills.

Role and Responsibilities:
- Work with business stakeholders and build data solutions to address analytical and reporting requirements.
- Work with application developers and business analysts to implement and optimise Databricks/AWS-based implementations meeting data requirements.
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow.
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation.
- Develop/optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases.
- Conduct root cause analysis and resolve production problems and data issues.
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings.
- Provide support for production problems and daily batch processing.
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta tables, Parquet), and views to ensure data integrity and performance.
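
The Star model mentioned above has one central fact table joined to surrounding dimension tables. A minimal sketch with hypothetical table and column names (the same query shape applies on Databricks or any SQL warehouse):

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (1, '2024-01'), (2, '2024-02');
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 1, 10.0), (1, 2, 20.0), (2, 1, 5.0);
""")

# Revenue by month and category: the canonical star-schema rollup.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

A Snowflake model would further normalize the dimensions (e.g. category into its own table), trading wider joins for less redundancy.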

Posted 3 weeks ago

Apply

7.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Data Analyst (Data Analyst - Corporate Technology Data Engineering & Analytics) - Full-Time, Hyderabad

The Opportunity
Join our dynamic team as a Data Analyst - Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in driving the execution of our data strategy. This role is crucial in driving digital transformation and operational efficiency across Investment Management. You will help extract value from data by facilitating the creation of high-quality data solutions that drive decision-making and operational efficiency, and you'll use your skills to provide subject matter expertise and complete in-depth data analysis that contributes to the strategic efforts of the team.

Responsibilities:
- Analyze data related to Investment Management operations, including security masters, securities trade and recon operations, reference data management, and pricing, to generate actionable insights.
- Develop and maintain comprehensive data mapping documents and work closely with data engineering teams to ensure accurate data integration and transformation.
- Partner with business analysts, architects, and data engineers to validate datasets, optimize queries, and perform reconciliation.
- Support the design and delivery of investment data and reporting solutions, including data pipelines and reporting dashboards.
- Collaborate with data engineers, data architects, and BI developers to ensure the design and development of scalable data solutions aligned with business goals.
- Manage and oversee investment data, ensuring its accuracy, consistency, and completeness.

The Minimum Qualifications
Education: Bachelor's or Master's degree in Finance, Computer Science, Information Systems, or a related field.
Experience:
- 7-9 years of experience as a Data Analyst or in a similar role supporting data analytics projects.
- 5+ years of mastery in SQL.
- 5+ years of experience in financial services, insurance, or a related industry.
- Experience with data manipulation using Python.
- Domain knowledge of Investment Management operations, including security masters, securities trade and recon operations, reference data management, and pricing.
- Investment Operations exposure: Critical Data Elements (CDE), data traps, and other data recons.
- Familiarity with data engineering concepts: ETL/ELT, data lakes, data warehouses.
- Experience with BI tools like Power BI, MicroStrategy, and Tableau.
- Excellent communication, problem-solving, and stakeholder management skills.
- Experience in Agile/Scrum and working with cross-functional delivery teams.
- Proficiency in financial reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
- Familiarity with regulatory requirements and compliance standards in the investment management industry.
- Ability to lead cross-functional teams and manage complex projects.
- Hands-on experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
- Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
- Experience with cloud data platforms like Snowflake and Databricks.
- Background in data governance, metadata management, and data lineage frameworks.
Soft Skills:
- Exceptional communication and interpersonal skills.
- Ability to influence and motivate teams without direct authority.
- Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.

What You'll Get:
- Regular meetings with the Corporate Technology leadership team.
- Focused one-on-one meetings with your manager.
- Access to mentorship opportunities.
- Access to learning content on Degreed and other informational platforms.
- Your ethics and integrity will be valued by a company with a strong and stable ethical business and industry-leading pay and benefits.
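
The Critical Data Elements (CDE) checks mentioned above amount to verifying that the fields a downstream process cannot live without are populated on every record. A minimal sketch (the security-master records and the CDE list are hypothetical, for illustration only):

```python
# Hypothetical security-master records; the CDE list is illustrative.
CDES = ("isin", "currency", "price")

records = [
    {"isin": "IN0001", "currency": "INR", "price": 101.5},
    {"isin": "IN0002", "currency": None, "price": 99.0},
    {"isin": "IN0003", "currency": "USD", "price": None},
]

def cde_gaps(records, cdes=CDES):
    """Return, per record, which critical data elements are missing."""
    return {
        r["isin"]: [c for c in cdes if r.get(c) in (None, "")]
        for r in records
    }

gaps = cde_gaps(records)
print(gaps)
```

Records with non-empty gap lists would be routed to a data-quality exception queue rather than loaded downstream.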

Posted 3 weeks ago

Apply

8.0 - 13.0 years

40 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are looking for a Sr. Snowflake Developer/Specialist/Architect with a minimum of 8 years of experience. Contact: Atchaya (95001 64554).

Required Candidate Profile:
- Snowflake + advanced SQL (5+ yrs) + DWH (data warehousing).
- 8+ years of industry experience, with hands-on management of Data Warehousing projects.
- Minimum 4 years of experience working on Snowflake.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

The Snowflake Data Specialist will manage projects in Data Warehousing, focusing on Snowflake and related technologies. The role requires expertise in data modeling, ETL processes, and cloud-based data solutions.

Posted 3 weeks ago

Apply

6.0 - 9.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid

Experience: 6-9 years
Location: Pune / Chennai / Mumbai / Bangalore

- Strong in-depth knowledge of databases and database concepts to support the design and development of data warehouse/data lake applications.
- Analyze business requirements and work with the business and data modeler to support dataflow from source to target destination.
- Follow release and change processes: distribution of software builds and releases to development, test, and production environments.
- Adhere to the project's SDLC process (Agile); participate in team discussions and scrum calls, and work collaboratively with internal and external team members.
- Develop ETL using Ab Initio; databases: MS SQL Server, big data, text/Excel files.
- Engage in the intake/release/change/incident/problem management processes.
- Prioritize and drive all relevant support priorities (incident, change, problem, knowledge, engagement with projects).
- Develop and document a high-level conceptual data process design for review by the architect and data analysts, which will serve as the basis for writing ETL code and designing test plans.
- Thoroughly unit test ETL code to ensure error-free, efficient delivery.
- Analyze several aspects of code prior to release to ensure it will run efficiently and can be supported in the production environment.
- Provide data modeling solutions.

Posted 1 month ago

Apply

6.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role: AWS Redshift Ops + PL/SQL + Unix
Experience: 6+ years

Detailed job description - skill set:
- Incident management
- Troubleshooting issues
- Contributing to development
- Collaborating with other teams
- Suggesting improvements
- Enhancing system performance
- Training new employees

Mandatory Skills: AWS Redshift, PL/SQL, Apache Airflow, Unix, ETL, DWH

Posted 1 month ago

Apply

13.0 - 20.0 years

40 - 45 Lacs

Bengaluru

Work from Office

Principal Architect - Platform & Application Architect

Title: Principal Architect
Location: Onsite, Bangalore
Experience: 15+ years in software & data platform architecture and technology strategy, including 5+ years in architectural leadership roles
Education: Bachelor's/Master's in CS, Engineering, or a related field

Role Overview
We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities will span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform is adaptable to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products.

Key Responsibilities
1. Architecture & Strategy
- Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components).
- Define and document data blueprints, data domain models, and architectural standards.
- Lead build vs. buy evaluations for platform components and recommend best-fit tools and technologies.
2. Data Ingestion & Processing
- Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte.
- Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster).
- Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data.
3. Storage & Modeling
- Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi).
- Develop efficient data strategies for both OLAP and OLTP workloads.
- Guide schema evolution, data versioning, and performance tuning.
4. Governance, Security, and Compliance
- Establish data governance, cataloging, and lineage-tracking frameworks.
- Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc.
- Promote standardization and best practices across business units.
5. Platform Engineering & DevOps
- Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines.
- Ensure observability, reliability, and cost efficiency of the platform.
- Define SLAs, capacity planning, and disaster recovery plans.
6. Collaboration & Mentorship
- Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals.
- Mentor teams on architecture principles, technology choices, and operational excellence.

Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in software engineering, including 5+ years in architectural leadership roles.
- Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js.
- Strong hands-on experience building scalable data platforms in on-premise, hybrid, or cloud environments.
- Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg.
- Familiarity with data mesh, data fabric, and lakehouse paradigms.
- Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles.
- Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms.
- Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels.
- Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable.
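The batch-ingestion responsibility above typically comes down to watermark-based incremental loads. A minimal sketch in plain Python, using sqlite3 to stand in for a warehouse; the table and column names (src_events, dwh_events, event_ts) are illustrative, not from any specific platform:

```python
import sqlite3

def incremental_load(conn, watermark):
    """Copy source rows newer than the last watermark into the target table,
    then return the advanced watermark for the next batch run."""
    rows = conn.execute(
        "SELECT id, event_ts, payload FROM src_events WHERE event_ts > ?",
        (watermark,),
    ).fetchall()
    conn.executemany(
        "INSERT INTO dwh_events (id, event_ts, payload) VALUES (?, ?, ?)", rows
    )
    conn.commit()
    # Advance the watermark to the newest timestamp we just loaded
    return max((r[1] for r in rows), default=watermark)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_events (id INTEGER, event_ts TEXT, payload TEXT)")
conn.execute("CREATE TABLE dwh_events (id INTEGER, event_ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO src_events VALUES (?, ?, ?)",
    [(1, "2024-01-01", "a"), (2, "2024-01-02", "b"), (3, "2024-01-03", "c")],
)

wm = incremental_load(conn, "2024-01-01")  # loads only rows after the watermark
loaded = conn.execute("SELECT COUNT(*) FROM dwh_events").fetchone()[0]
```

In an orchestrator such as Airflow or Dagster, each call to a function like this would be one scheduled task, with the watermark persisted between runs.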

Posted 1 month ago

Apply

5.0 - 10.0 years

13 - 22 Lacs

Pune

Work from Office

Role & responsibilities
- Solid understanding of testing principles, testing types, and methodologies
- Strong knowledge of SQL and ETL testing; ability to write and interpret complex SQL queries
- Good proficiency in at least one programming language (Python or Java)
- Extensive experience in ETL/DW backend testing and BI report testing
- Hands-on experience identifying bugs in complex ETL data processing
- Experience with test management tools like ALM, Rally, JIRA, etc.
- Experience with scheduling tools, with preference for Cloud Composer (Airflow) or CTM
- Proficiency in GCP, AWS, Azure, or any other cloud platform
- Frontend automation using Java/Selenium or any other tool
- Familiarity with security testing
- Strong comprehension, analytical, and problem-solving skills
- Good working experience with Agile methodologies and tools like Rally and JIRA
- Excellent written and verbal communication skills

Posted 1 month ago

Apply

4.0 - 6.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid

Experience: 4 - 6 years
Location: Chennai / Pune / Mumbai / Bangalore

- Strong in-depth knowledge of databases and database concepts to support the design and development of data warehouse/data lake applications
- Analyze business requirements and work with the business and the data modeler to support data flow from source to target destination
- Follow release and change processes: distribution of software builds and releases to development, test, and production environments
- Adhere to the project's SDLC process (Agile); participate in team discussions and scrum calls, and work collaboratively with internal and external team members
- Develop ETL using Ab Initio; databases: MS SQL Server, big data, text/Excel files
- Engage in the intake/release/change/incident/problem management processes
- Prioritize and drive all relevant support priorities, including incident, change, problem, knowledge, and engagement with projects
- Develop and document a high-level conceptual data process design for review by the architect and data analysts, which will serve as a basis for writing ETL code and designing test plans
- Thoroughly unit test ETL code to ensure error-free, efficient delivery
- Analyze several aspects of code prior to release to ensure it will run efficiently and can be supported in the production environment
- Provide data modeling solutions

Kindly share your updated resume to AISHWARYAG5@hexaware.com
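The unit-testing expectation above applies just as well outside Ab Initio: isolate each transformation as a small pure function and assert on edge cases before release. A hedged sketch; the field names and cleaning rules are illustrative only:

```python
def transform(row):
    """Clean one source record before load: trim the name, normalize the
    country code to upper case, and coerce the amount to float.
    Field names here are hypothetical, not from any real feed."""
    return {
        "name": row["name"].strip(),
        "country": row["country"].strip().upper(),
        "amount": float(row["amount"]),
    }

# A unit test exercises messy input the way production data arrives
raw = {"name": "  Asha ", "country": "in", "amount": "42.50"}
clean = transform(raw)
```

Keeping transforms pure like this makes them trivially testable without standing up the full ETL environment.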

Posted 1 month ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- 7+ years of experience in ETL testing, Snowflake, and DWH concepts
- Strong SQL knowledge and debugging skills are a must
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools like Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have
- Exposure to financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution
- Prior experience with State Street and Charles River Development (CRD) is considered a plus
- Experience with tools such as PowerPoint, Excel, and SQL
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus
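The "data readiness" checks mentioned above are usually a handful of SQL quality rules run before functional testing begins, e.g. no NULL keys and no duplicate keys. A minimal sketch using sqlite3; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE positions (position_id INTEGER, security_id TEXT, qty REAL);
INSERT INTO positions VALUES (1, 'AAPL', 100), (2, 'MSFT', 50), (2, 'MSFT', 50);
""")

# Rule 1: the business key must never be NULL
null_keys = conn.execute(
    "SELECT COUNT(*) FROM positions WHERE position_id IS NULL"
).fetchone()[0]

# Rule 2: the business key must be unique; count keys that repeat
dupe_keys = conn.execute(
    "SELECT COUNT(*) FROM (SELECT position_id FROM positions "
    "GROUP BY position_id HAVING COUNT(*) > 1)"
).fetchone()[0]
```

In Snowflake the same rules apply unchanged; CDC tools like Qlik Replicate make the duplicate-key check especially important after replays.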

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 27 Lacs

Pune

Work from Office

Mandatory Primary Skills: Python, PySpark & SQL
Secondary Skills: Any cloud experience, DWH, BI tools (Qlik, Power BI, etc.)

Posted 1 month ago

Apply


8.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for developing and maintaining the overall IT architecture of the organization. This role involves defining the architecture vision, creating roadmaps, and ensuring that IT strategies align with business goals. You will work closely with collaborators to understand requirements, develop architectural blueprints, and ensure that solutions are scalable, secure, and aligned with enterprise standards. Architects will be involved in defining the enterprise architecture strategy, guiding technology decisions, and ensuring that all IT projects adhere to established architectural principles.

Roles & Responsibilities:
- Develop and maintain the enterprise architecture vision and strategy, ensuring alignment with business objectives
- Create and maintain architectural roadmaps that guide the evolution of IT systems and capabilities
- Establish and enforce architectural standards, policies, and governance frameworks
- Evaluate emerging technologies and assess their potential impact on the enterprise/domain/solution architecture
- Identify and mitigate architectural risks, ensuring that IT systems are scalable, secure, and resilient
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Drive continuous improvement in the architecture by finding opportunities for innovation and efficiency
- Work with collaborators to gather and analyze requirements, ensuring that solutions meet both business and technical needs
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Ensure seamless integration between systems and platforms, both within the organization and with external partners
- Design systems that can scale to meet growing business needs and performance demands
- Develop and maintain logical, physical, and conceptual data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Contribute to a program vision while advising on and articulating program/project strategies for enabling technologies
- Provide guidance on application and integration development best practices, Enterprise Architecture standards, functional and technical solution architecture and design, environment management, testing, and platform education
- Drive the creation of application and technical design standards that leverage best practices and effectively integrate Salesforce into Amgen's infrastructure
- Troubleshoot key product team implementation issues and demonstrate the ability to drive them to successful resolution
- Lead the evaluation of business and technical requirements from a senior level
- Review releases and roadmaps from Salesforce and evaluate the impacts on current applications, orgs, and solutions
- Identify and proactively manage risk areas, with a commitment to seeing an issue through to complete resolution
- Negotiate solutions to complex problems with both the product teams and third-party service providers
- Build relationships and work with product teams; contribute to broader goals and growth beyond the scope of your current project

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 8 - 10 years of experience in Computer Science, IT, or a related field; OR Bachelor's degree with 10 - 14 years of experience; OR Diploma with 14 - 18 years of experience
- Experience with SFDC Service Cloud/Health Cloud in a call center environment
- Strong architectural design and modeling skills
- Extensive knowledge of enterprise architecture frameworks and methodologies
- Experience with system integration and IT infrastructure
- Experience directing solution design and business process redesign, and aligning business requirements to technical solutions in a regulated environment
- Experience working in agile methodology, including Product Teams and Product Development models
- Extensive hands-on technical and solution implementation experience with the Salesforce Lightning Platform, Sales Cloud, and Service Cloud, demonstrating positions of increasing responsibility and management/mentoring of more junior technical resources
- Demonstrable experience and ability to develop custom configured, Visualforce, and Lightning applications on the platform
- Demonstrable knowledge of the capabilities and features of Service Cloud and Sales Cloud
- Demonstrable ability to analyze, design, and optimize business processes via technology and integration, including leadership in guiding customers and colleagues in rationalizing and deploying emerging technology for business use cases
- A thorough understanding of web services, data modeling, and enterprise application integration concepts, including experience with enterprise integration tools (ESBs and/or ETL tools) and common integration design patterns with enterprise systems (e.g. CMS, ERP, HRIS, DWH/DM)
- Excellent, context-specific, and adaptive communication and presentation skills across a variety of audiences and situations; an established habit of proactive thinking and behavior, and the desire and ability to self-start, learn, and apply new technologies

Preferred Qualifications:
- Strong solution design and problem-solving skills
- Solid understanding of technology, function, or platform
- Experience in developing differentiated and deliverable solutions
- Ability to analyze client requirements and translate them into solutions

Professional Certifications:
- Salesforce Admin
- Advanced Admin
- Platform Builder
- Salesforce Application Architect (Mandatory)

Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated ability to function in a team setting
- Demonstrated presentation skills

Posted 1 month ago

Apply