
41 Data Mart Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Primary:
- Strong experience in DW testing and in testing ETL jobs
- Experience in writing test scripts for Java/Python scripts
- Strong experience in writing complex SQL queries
- Strong understanding of data warehouse concepts and data mart testing, ensuring data integrity is not compromised
- Test case preparation and execution
- Transform complex business logic into SQL or PL/SQL queries
- Defect tracking experience (e.g., AzDO, Jira, Rally)
- Exposure to large data sets and understanding of the Data Quality Framework
- Automates applicable test cases for regression testing
- Good understanding of software testing methodologies, processes, and quality metrics
- Performs testing; programming skills in Java
- Identifies defects and works with the scrum team on resolution
- Experience in Selenium, SoapUI, or similar tools, or as stated in program tech stack requirements
- Accountable for overall test quality (functional and regression)
- Promotes an environment of collaboration within the test team
- Coordinates test planning and execution (FFT, SIT, CVT; performance tests, load tests, etc.)
- Provides reports on product quality metrics
- Provides quality-related input on go/no-go for every release (incl. promotion to INT, CVT, and prod environments)
- Attends scrum ceremonies
- Updates status in Rally on a daily basis
- Ensures test cases cover 100% of new functionality
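The DW-testing checks this role describes (row-count reconciliation between source and target, data integrity) can be sketched as a minimal, self-contained test. This is an illustrative sketch on SQLite; the table and column names (`src_orders`, `dw_orders`) are assumptions, not from the posting.

```python
import sqlite3

# Set up a toy source table and its warehouse counterpart.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders  (id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO dw_orders  VALUES (?, ?)", rows)

# Check 1: row counts must match after the ETL load.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
dw_count = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
assert src_count == dw_count, f"row count mismatch: {src_count} vs {dw_count}"

# Check 2: no record was dropped (set difference of keys is empty).
missing = cur.execute("""
    SELECT id FROM src_orders
    EXCEPT
    SELECT id FROM dw_orders
""").fetchall()
assert missing == [], f"ids missing from target: {missing}"
```

In practice the same pattern is pointed at the real source and target connections, and the assertions feed a defect tracker instead of failing a script.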

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 5 Lacs

Chennai

Work from Office

Job Information
- Job Opening ID: ZR_1999_JOB
- Date Opened: 17/06/2023
- Industry: Technology
- Job Type:
- Work Experience: 6-10 years
- Job Title: ETL Tester
- City: Chennai
- Province: Tamil Nadu
- Country: India
- Postal Code: 600001
- Number of Positions: 1

Responsibilities:
- Create test case documents/plans for testing the data pipelines.
- Check the mapping for the fields that support data staging and in data marts, and the data type constraints of the fields present in Snowflake.
- Verify non-null fields are populated.
- Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process.
- Verify stored procedure calculations and data mappings.
- Verify data transformations are correct based on the business rules.
- Verify successful execution of data loading workflows.
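Two of the checks above, verifying that non-null fields are populated and that a business rule holds in the transformation layer, can be sketched as SQL assertions. Shown here on SQLite with illustrative names (`stg_customer`, the `status` codes); the real checks would run against Snowflake.

```python
import sqlite3

# Toy staging table standing in for a Snowflake staging layer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (cust_id INTEGER, email TEXT, status TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)",
                [(1, "a@x.com", "ACTIVE"), (2, "b@x.com", "INACTIVE")])

# Verify non-null fields are populated.
null_count = cur.execute(
    "SELECT COUNT(*) FROM stg_customer WHERE cust_id IS NULL OR email IS NULL"
).fetchone()[0]
assert null_count == 0, "found NULLs in mandatory columns"

# Verify a business rule: status must be one of the allowed codes.
bad = cur.execute(
    "SELECT COUNT(*) FROM stg_customer WHERE status NOT IN ('ACTIVE', 'INACTIVE')"
).fetchone()[0]
assert bad == 0, "unexpected status codes in transformed data"
```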

Posted 3 weeks ago

Apply

8.0 - 12.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Information
- Job Opening ID: ZR_2385_JOB
- Date Opened: 23/10/2024
- Industry: IT Services
- Job Type:
- Work Experience: 8-12 years
- Job Title: Data Modeller
- City: Bangalore South
- Province: Karnataka
- Country: India
- Postal Code: 560066
- Number of Positions: 1

Locations: Pune/Bangalore/Hyderabad/Indore
Contract duration: 6 months

Responsibilities
- Be responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational and dimensional - must; NoSQL - optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Must have a payments background.

Skills
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Murex Front Office Finance
Good-to-have skills: Murex Back Office Workflows
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will play a crucial role in developing and implementing solutions that enhance business operations and drive efficiency.

Roles & Responsibilities:
- Key liaison with the Front Office user base, working with traders/end users to understand their requirements and provide timely solutions
- Hands-on knowledge of rate curves setup
- Hands-on knowledge of MX Market Risk
- Configure from scratch all FO modules: PreTrade, E-tradepad, Events, Simulation, Market Data, etc.
- Perform detailed P&L and cash flow analysis, with an understanding of RFR instruments post-LIBOR transition
- POC for all FO queries from the user side
- Train traders/end users on Mx.3 FO functionality

Professional & Technical Skills:
- 8+ years of experience with the Murex system, Front Office modules of the Mx 3.1 platform
- Deep understanding of treasury products such as FX, MM, FI, and IRS, and of Murex FO and risk modules
- Experience with scalable, resilient transaction processing systems in the financial markets
- Strong analytical and logical approach to problem solving and system development; trade lifecycle across FO, BO, and MO tiers
- Perform requirement analysis in the FO space for various asset classes, and initial analysis of the existing production data/test case suite
- Analyse/understand product requirements and offer solutions/support to facilitate rollouts
- Know the FO business to design and build pricing/booking capabilities in the Murex system
- Participate with internal business partners on cross-functional projects to provide STP solutions for pricing, distribution, and execution capabilities

Additional Information:
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

7.0 - 11.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Murex Front Office Finance
Good-to-have skills: Murex Back Office Workflows, AEM 6
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will play a crucial role in developing and implementing solutions that enhance business operations and drive efficiency.

Roles & Responsibilities:
- Key liaison with the Front Office user base, working with traders/end users to understand their requirements and provide timely solutions
- Hands-on knowledge of rate curves setup
- Hands-on knowledge of MX Market Risk
- Configure from scratch all FO modules: PreTrade, E-tradepad, Events, Simulation, Market Data, etc.
- Perform detailed P&L and cash flow analysis, with an understanding of RFR instruments post-LIBOR transition
- POC for all FO queries from the user side
- Train traders/end users on Mx.3 FO functionality

Professional & Technical Skills:
- 8+ years of experience with the Murex system, Front Office modules of the Mx 3.1 platform
- Deep understanding of treasury products such as FX, MM, FI, and IRS, and of Murex FO and risk modules
- Experience with scalable, resilient transaction processing systems in the financial markets
- Strong analytical and logical approach to problem solving and system development; trade lifecycle across FO, BO, and MO tiers
- Perform requirement analysis in the FO space for various asset classes, and initial analysis of the existing production data/test case suite
- Analyse/understand product requirements and offer solutions/support to facilitate rollouts
- Know the FO business to design and build pricing/booking capabilities in the Murex system
- Participate with internal business partners on cross-functional projects to provide STP solutions for pricing, distribution, and execution capabilities

Additional Information:
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Responsibility:
- Develop and set up the transformation of data from sources to enable analysis and decision making.
- Maintain data flow from source to the designated target without affecting the crucial data flow, and play a critical part in the data supply chain by ensuring stakeholders can access and manipulate data for routine and ad hoc analysis.
- Implement projects focused on collecting, aggregating, storing, reconciling, and making data accessible from disparate sources.
- Provide support during the full lifecycle of data, from ingestion through analytics to action.
- Analyze and organize raw data. Evaluate business needs and objectives. Interpret trends and patterns. Conduct complex data analysis and report on results.
- Coordinate with the source team and end users and develop solutions.
- Implement data governance policies and support data-versioning processes.
- Maintain security and data privacy.

Requirements

Must have:
- Proven hands-on experience in building complex analytical queries in Teradata.
- 4+ years of extensive programming experience in Teradata Tools and Utilities.
- Hands-on experience with Teradata utilities such as FastLoad, MultiLoad, BTEQ, and TPT.
- Experience in data quality management and best practices across data solution implementations.
- Experience in development, testing and deployment, coding standards, and best practices.
- Experience in preparing technical design documentation.
- Strong team collaboration and experience working with remote teams.
- Knowledge of data modelling and database management, such as performance tuning of the Enterprise Data Warehouse, Data Mart, and Business Intelligence Reporting environments, and supporting the integration of those systems with other applications.

Good to have:
- Good Unix shell scripting skills.
- Experience in data transformation using ETL/ELT tools.
- Experience with different relational databases (e.g., Teradata, Oracle, PostgreSQL).
- Experience with CI/CD development and deployment tools (e.g., Maven, Jenkins, Git, Kubernetes).
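One routine piece of the data quality management this role mentions is a duplicate-key check on a mart table. A minimal sketch follows, using SQLite as a stand-in for Teradata; the table name `dm_account` is illustrative.

```python
import sqlite3

# Toy data mart table; in Teradata this would be queried via BTEQ or any client.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dm_account (acct_id INTEGER, balance REAL)")
cur.executemany("INSERT INTO dm_account VALUES (?, ?)",
                [(101, 50.0), (102, 75.0), (103, 20.0)])

# Any key appearing more than once violates uniqueness in the mart.
dupes = cur.execute("""
    SELECT acct_id, COUNT(*) AS n
    FROM dm_account
    GROUP BY acct_id
    HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], f"duplicate keys found: {dupes}"
```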

Posted 3 weeks ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas)
- Ensure data quality, security, and performance
- Work with BI teams to support analytics and reporting needs

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.)
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.)
- Strong understanding of dimensional modeling and OLAP
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow)

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
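The star schema named in this role pairs a central fact table with dimension tables joined on surrogate keys. A minimal sketch on SQLite, with illustrative names (`fact_sales`, `dim_product`) rather than any real model:

```python
import sqlite3

# One dimension and one fact table: the smallest possible star schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER, revenue REAL)""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "Books"), (2, "Games")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 2, 30.0), (1, 1, 15.0), (2, 3, 120.0)])

# Rollup: revenue by category, the kind of query BI teams run against the star.
result = dict(cur.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchall())
```

A snowflake schema differs only in that dimensions are further normalized (e.g., `dim_product` referencing a separate `dim_category` table).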

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

ETL Data Engineer - Tech Lead
Bangalore, India | Information Technology | 16748

Overview
We are seeking a skilled and experienced Data Engineer to play a vital role in supporting data discovery, creating design documents, data ingestion/migration, creating data pipelines, creating data marts, and managing and monitoring the data using the tech stack Azure, SQL, Python, PySpark, Airflow, and Snowflake.

Responsibilities
1. Data Discovery: Collaborate with source teams, gather complete details of data sources, and create a design diagram.
2. Data Ingestion/Migration: Collaborate with cross-functional teams to ingest/migrate data from various sources to the staging area. Develop and implement efficient data migration strategies, ensuring data integrity and security throughout the process.
3. Data Pipeline Development: Design, develop, and maintain robust data pipelines that extract, transform, and load (ETL) data from different sources into GCP. Implement data quality checks and ensure scalability, reliability, and performance of the pipelines.
4. Data Management: Build and maintain data models and schemas, ensuring optimal storage, organization, and accessibility of data. Collaborate with the requirements team to understand their data requirements and provide solutions by creating data marts to meet their needs.
5. Performance Optimization: Identify and resolve performance bottlenecks within the data pipelines and data services. Optimize queries, job configurations, and data processing techniques to improve overall system efficiency.
6. Data Governance and Security: Implement data governance policies, access controls, and data security measures to ensure compliance with regulatory requirements and protect sensitive data. Monitor and troubleshoot data-related issues, ensuring high availability and reliability of data systems.
7. Documentation and Collaboration: Create comprehensive technical documentation, including data flow diagrams, system architecture, and standard operating procedures. Collaborate with cross-functional teams, analysts, and software engineers to understand their requirements and provide technical expertise.

Requirements
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Data Engineer technical lead, with an output-driven focus.
- Strong knowledge and hands-on experience with Azure, SQL, Python, PySpark, Airflow, Snowflake, and related tools.
- Proficiency in data processing and pipeline development. Solid understanding of data modeling, database design, and ETL principles.
- Experience with data migration projects, including data extraction, transformation, and loading.
- Familiarity with data governance, security, and compliance practices.
- Strong problem-solving skills and ability to work in a fast-paced, collaborative environment.
- Excellent communication and interpersonal skills, with the ability to articulate technical concepts to non-technical stakeholders.
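The extract-transform-load steps with an inline data quality check, as described in the pipeline responsibilities above, can be sketched in a few lines. The sources, fields, and the coercion rule are illustrative assumptions, not this team's actual pipeline.

```python
import sqlite3

def extract():
    # Stand-in for reading from an upstream system, file, or API.
    return [{"id": 1, "amt": "10.5"}, {"id": 2, "amt": "bad"}, {"id": 3, "amt": "7"}]

def transform(rows):
    # Coerce types; quarantine rows that fail the quality check.
    clean, rejects = [], []
    for r in rows:
        try:
            clean.append((r["id"], float(r["amt"])))
        except ValueError:
            rejects.append(r)
    return clean, rejects

def load(conn, rows):
    # Land only the clean rows in the mart table.
    conn.execute("CREATE TABLE IF NOT EXISTS mart_amounts (id INTEGER, amt REAL)")
    conn.executemany("INSERT INTO mart_amounts VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
clean, rejects = transform(extract())
load(conn, clean)
loaded = conn.execute("SELECT COUNT(*) FROM mart_amounts").fetchone()[0]
```

In an orchestrated setup, each of the three functions would typically become a separate Airflow task, with the reject count surfaced as a pipeline metric.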

Posted 4 weeks ago

Apply

7.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

As a BI Qliksense Developer with a Visualization focus, you will be instrumental in transforming raw data into compelling visual stories that empower our leadership team to make informed strategic decisions. You will leverage your deep expertise in Qlik Sense, data modeling, and business intelligence principles to deliver high-quality, user-centric solutions. Your ability to understand complex business requirements, design intuitive visualizations, and optimize performance will be crucial to your success. You will also contribute to data governance, security, and the overall health of our BI environment.

Responsibilities
- Design, develop, and maintain complex Qlik Sense reports, dashboards, and applications with a strong focus on delivering insights for Executive and Leadership stakeholders.
- Utilize advanced Qlik Sense features and scripting to create highly interactive and insightful visualizations.
- Implement on-premises Qlik Sense architectures, ensuring scalability and stability.
- Configure and manage Qlik Sense Enterprise, including security rules and access controls.
- Develop and manage Qlik Sense extensions and integrations as needed.
- Perform Qlik Sense administration tasks, including user management, license allocation, and system monitoring.
- Implement and manage N-Printing for report distribution and scheduling.
- Configure and manage Qlik Sense Alerting to proactively notify stakeholders of critical data changes.
- Take a proactive and leading role in gathering detailed requirements from product owners, internal analysts, and business stakeholders, particularly focusing on the needs of Executive and Leadership users.
- Translate complex business requirements and acceptance criteria into effective and efficient Qlik Sense solutions.

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Primary Skills
- Good knowledge of and expertise in Datamart
- Knowledge of Datamart reporting, feeders, and batches of feeders
- Strong analytical and debugging ability; modifying and enhancing existing complex Datamart objects
- Good exposure to dynamic tables, pre- and post-filters, feeders, batches of feeders, extractions, reporting tables, and processing scripts
- Experience with simulation-based reports and risk-matrix-based reports; complex reports would be required
- Able to configure, execute, and troubleshoot batch reports in MXG
- Able to design and optimize the usage of dynamic tables
- Experience with Sybase or Oracle databases
- Experience in Datamart/EOD solution design and effort estimation with limited support required
- Knowledgeable in Unix shell scripting
- Knowledge of and hands-on experience in implementing, developing, and supporting MX.3 End of Day processes and Trade Life Cycle Management using the workflow engine (at least 5 years)
- Experience in leading the integration stream for an MX.3 implementation (Integration or Reporting stream), including leading and coordinating design sessions and sprint show sessions
- Strong knowledge of SQL/RDBMS technology

Secondary Skills
- Good to have: knowledge of GOM definition and MxML development
- Experience with different asset classes, trade workflows, trade attributes, and financial and non-financial static data
- Good understanding of both exchange-traded and OTC derivatives, with specific focus on credit and rates products and clarity on their life cycle
- Understanding of Murex BO functionalities such as confirmations and settlements

Skills (competencies)
- Verbal Communication
- Written Communication

Posted 1 month ago

Apply

2.0 - 5.0 years

2 - 4 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

Role & responsibilities
- 3 to 4+ years of hands-on experience in SQL database design, data architecture, ETL, data warehousing, data mart, data lake, big data, cloud, and data governance domains.
- Take ownership of the technical aspects of implementing data pipeline and migration requirements, ensuring that the platform is used to its fullest potential by designing and building applications around business stakeholder needs.
- Interface directly with stakeholders to gather requirements and own the automated end-to-end data engineering solutions.
- Implement data pipelines to automate the ingestion, transformation, and augmentation of structured, unstructured, and real-time data, and provide best practices for pipeline operations.
- Troubleshoot and remediate data quality issues raised by pipeline alerts or downstream consumers. Implement data governance best practices.
- Create and maintain clear documentation on data models/schemas as well as transformation/validation rules.
- Implement tools that help data consumers extract, analyze, and visualize data faster through data pipelines.
- Implement data security, privacy, and compliance protocols to ensure safe data handling in line with regulatory requirements.
- Optimize data workflows and queries to ensure low latency, high throughput, and cost efficiency.
- Lead the entire software lifecycle, including hands-on development, code reviews, testing, deployment, and documentation, for batch ETLs.
- Work directly with our internal product/technical teams to ensure that our technology infrastructure is seamlessly and effectively integrated.
- Migrate current data applications and pipelines to the cloud, leveraging new technologies in the future.

Preferred candidate profile
- Graduate with an Engineering degree (CS/Electronics/IT) / MCA / MCS or equivalent, with substantial data engineering experience.
- 3+ years of recent hands-on experience with a modern programming language (Scala, Python, Java) is required; Spark/PySpark is preferred.
- Experience with configuration management and version control tools (e.g., Git), and experience working within a CI/CD framework, is a plus.
- 3+ years of recent hands-on SQL programming experience in a big data environment is required.
- Working knowledge of PostgreSQL, RDBMS, NoSQL, and columnar databases.
- Experience developing and maintaining ETL applications and data pipelines using big data technologies is required; Apache Kafka, Spark, and Airflow experience is a must.
- Knowledge of API and microservice integration with applications.
- Experience with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Experience building data solutions for Power BI and web visualization applications.
- Experience with cloud is a plus.
- Experience in managing multiple projects and stakeholders, with excellent communication and interpersonal skills.
- Ability to develop and organize high-quality documentation.
- Superior analytical skills and a strong sense of ownership of your work.
- Collaborate with data scientists on several projects; contribute to the development and support of analytics, including AI/ML.
- Ability to thrive in a fast-paced environment and to manage multiple, competing priorities simultaneously.
- Prior energy & utilities industry experience is a big plus.

Experience (min-max in years): 3+ years of core/relevant experience
Location: Mumbai (on-site)
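The pipeline alerts mentioned above usually come from a quality gate that fails a batch when validation rejects cross a threshold. A minimal sketch; the threshold and error wording are illustrative assumptions.

```python
def quality_gate(total_rows: int, bad_rows: int, max_bad_ratio: float = 0.05) -> bool:
    """Return True when the batch passes; raise to trigger a pipeline alert."""
    if total_rows == 0:
        raise ValueError("empty batch: upstream extract may have failed")
    ratio = bad_rows / total_rows
    if ratio > max_bad_ratio:
        raise ValueError(f"reject rate {ratio:.1%} exceeds {max_bad_ratio:.0%}")
    return True

# 1% bad rows: within tolerance, the batch proceeds.
assert quality_gate(1000, 10)

# 10% bad rows: the gate raises, which an orchestrator turns into an alert.
try:
    quality_gate(1000, 100)
except ValueError as e:
    alert = str(e)
```

Raising an exception (rather than logging and continuing) is what lets an orchestrator mark the task failed and notify downstream consumers.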

Posted 1 month ago

Apply

7.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

About The Role
As a BI Qliksense Developer with a Visualization focus, you will be instrumental in transforming raw data into compelling visual stories that empower our leadership team to make informed strategic decisions. You will leverage your deep expertise in Qlik Sense, data modeling, and business intelligence principles to deliver high-quality, user-centric solutions. Your ability to understand complex business requirements, design intuitive visualizations, and optimize performance will be crucial to your success. You will also contribute to data governance, security, and the overall health of our BI environment.

Responsibilities
- Design, develop, and maintain complex Qlik Sense reports, dashboards, and applications with a strong focus on delivering insights for Executive and Leadership stakeholders.
- Utilize advanced Qlik Sense features and scripting to create highly interactive and insightful visualizations.
- Implement on-premises Qlik Sense architectures, ensuring scalability and stability.
- Configure and manage Qlik Sense Enterprise, including security rules and access controls.
- Develop and manage Qlik Sense extensions and integrations as needed.
- Perform Qlik Sense administration tasks, including user management, license allocation, and system monitoring.
- Implement and manage N-Printing for report distribution and scheduling.
- Configure and manage Qlik Sense Alerting to proactively notify stakeholders of critical data changes.
- Take a proactive and leading role in gathering detailed requirements from product owners, internal analysts, and business stakeholders, particularly focusing on the needs of Executive and Leadership users.
- Translate complex business requirements and acceptance criteria into effective and efficient Qlik Sense solutions.
- Apply expert knowledge of Business Intelligence, Data Modeling (dimensional and relational), and Data Warehousing concepts to design robust Qlik Sense data models.
- Build efficient BI data models within Qlik Sense, ensuring data accuracy, performance, and usability.
- Implement and enforce data security and data governance policies within the Qlik Sense environment.
- Manage data integrations and data preparations within Qlik Sense, leveraging various data sources.
- Possess a strong understanding of traditional data warehouses and data marts and how they integrate with Qlik Sense.
- Demonstrate expert knowledge of authoring and troubleshooting complex SQL queries to extract and prepare data for Qlik Sense.
- Optimize SQL queries for performance and efficiency.

Connectivity, Data Modeling & Source Data Procurement
- Exhibit expert knowledge and application of general and tool-specific connectivity methods to various data sources (databases, flat files, APIs, etc.).
- Configure and manage Qlik Sense security settings to ensure appropriate data access and governance.
- Apply expert-level performance tuning techniques within Qlik Sense applications and the overall environment.
- Demonstrate expert knowledge and application of using the right visuals, optimizing layout for clarity and impact, ensuring mobility and responsiveness, and integrating advanced analytics into Qlik Sense.
- Design intuitive and user-friendly dashboards tailored for Executive and Leadership consumption.
- Apply strong business and data analysis skills to understand the underlying data and business processes.
- Possess a working understanding of trading concepts, particularly Market Risk (added advantage).
- Maintain knowledge of upcoming trends in BI tools and technology, particularly within the Qlik ecosystem.
- Demonstrate familiarity with user-centered design and testing methodologies.
- Address usability and accessibility concerns in Qlik Sense development.
- Possess strong communication skills to effectively engage with technical developers, architects, and business stakeholders, including Executive and Leadership teams.
- Exhibit good project management and organizational skills to manage development tasks and timelines effectively.
- Be self-motivated, proactive in taking initiative, and innovative in finding solutions.
- Demonstrate the ability to drive and mentor junior team members.
- Must have experience working with AGILE and KANBAN methodologies.
- Be capable of running and facilitating sprint activities.

Requirements
- Minimum of 7 years of overall experience in the IT industry.
- Expert-level knowledge and extensive experience in Qlik Sense development and administration.
- Proven experience implementing on-premises Qlik Sense architectures.
- Hands-on experience with N-Printing and Qlik Sense Alerting.
- Demonstrated experience in designing and developing Qlik Sense reports and dashboards specifically for Executive and Leadership use cases.
- Expert knowledge and practical application of data visualization best practices.
- Expert knowledge of Business Intelligence, Data Modeling, and Data Warehousing principles.
- Proven ability to gather and translate business requirements into technical solutions.
- Expert experience in building BI data models, implementing data security, and adhering to data governance policies.
- Experience with data integrations and data preparation within Qlik Sense.
- Expert knowledge of authoring and troubleshooting complex SQL queries.
- Proven ability to perform performance tuning on Qlik Sense applications and environments.
- Excellent verbal and written communication skills.
- Good project management and organizational skills.
- Experience working with AGILE and KANBAN methodologies.
- Self-motivated with a strong ability to take initiative.

Location: Bangalore
Notice Period: Immediate joiner

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Title:
=======
Senior MS BI Developer

Onsite Location:
=============
Dubai (UAE), Doha (Qatar), Riyadh (Saudi Arabia)

Onsite Monthly Salary:
==============
10k AED - 15k AED, full tax-free salary, depending on experience
A Gulf work permit will be sponsored by our client.

Project duration:
=============
2 years, extendable

Desired Experience Level Needed:
===========================
5 - 10 years

Qualification:
==========
B.Tech / M.Tech / MCA / M.Sc or equivalent

Experience Needed:
===============
Overall: 5 or more years of total IT experience
Solid 3+ years of experience as an MS BI Developer with the Microsoft stack / MS DWH Engineer

Job Responsibilities:
================
- Design and develop DWH data flows
- Able to build SCD-1 / SCD-2 / SCD-3 dimensions
- Build cubes
- Maintain SSAS / DWH data
- Design the Microsoft DWH and its ETL packages
- Able to code T-SQL
- Able to create orchestrations
- Able to design batch jobs / orchestration runs
- Familiarity with data models
- Able to develop MDM (Master Data Management)

Experience:
================
- Experience as a DWH developer with Microsoft DWH data flows and cubes
- Exposure and experience with Azure services, including Azure Data Factory
- Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView
- Collecting/gathering data from multiple source systems
- Creating automated data pipelines
- Configuring Azure resources and services

Skills:
================
- Microsoft SSIS / SSAS / SSRS
- Informatica
- Azure Data Factory
- Spark
- SQL

Nice to have:
==========
- Any onsite experience is an added advantage, but not mandatory
- Microsoft certifications are an added advantage

Business Vertical:
==============
- Banking / Investment Banking
- Capital Markets
- Securities / Stock Market Trading
- Bonds / Forex Trading
- Credit Risk
- Payment Cards Industry (VISA / MasterCard / Amex)

Job Code:
======
MSBI_DEVP_0525

No. of positions: 05

Email:
=====
spectrumconsulting1977@gmail.com

If you are interested, please email your CV as an attachment with the job ref. code [MSBI_DEVP_0525] as the subject.
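The SCD-1/SCD-2/SCD-3 items above refer to slowly changing dimension patterns. A minimal SCD Type 2 sketch (expire the current row, insert a new version with fresh effective dates), shown on SQLite with illustrative names rather than the T-SQL/SSIS stack this role actually uses:

```python
import sqlite3

# A dimension with effective-date tracking: each business key can have
# many versions, exactly one of which is current.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Pune', '2023-01-01', '9999-12-31', 1)")

def scd2_update(cur, cust_id, new_city, change_date):
    # Step 1: expire the current version.
    cur.execute("""UPDATE dim_customer
                   SET valid_to = ?, is_current = 0
                   WHERE cust_id = ? AND is_current = 1""", (change_date, cust_id))
    # Step 2: insert the new version as current.
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
                (cust_id, new_city, change_date))

scd2_update(cur, 1, "Mumbai", "2024-06-01")
history = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
```

SCD-1 would simply overwrite `city` in place (no history), and SCD-3 would keep the prior value in an extra `previous_city` column.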

Posted 1 month ago

Apply

9 - 11 years

37 - 40 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake)
- Automate data quality checks and monitoring
- Collaborate with analysts, data scientists, and backend teams
- Optimize data flows for performance and cost

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark)
- Experience with cloud data platforms (AWS, GCP, or Azure)
- Strong understanding of data modeling and warehousing principles
- Bonus: experience with Kafka, Parquet/Avro, or real-time streaming

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies
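One common form of the automated data quality monitoring this role mentions is a freshness check: has the warehouse been loaded within the allowed lag window? A minimal sketch, the kind an Airflow sensor or scheduled task might run; the 24-hour threshold is an illustrative assumption.

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_load: datetime, max_lag_hours: int = 24) -> bool:
    """True if the warehouse was loaded within the allowed lag window."""
    return datetime.now(timezone.utc) - last_load <= timedelta(hours=max_lag_hours)

# A load 2 hours ago passes; a load 48 hours ago should raise an alert.
recent = datetime.now(timezone.utc) - timedelta(hours=2)
stale = datetime.now(timezone.utc) - timedelta(hours=48)
assert is_fresh(recent)
assert not is_fresh(stale)
```

In production, `last_load` would come from a load-audit table (e.g., `MAX(load_timestamp)` on the target) rather than being constructed locally.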

Posted 1 month ago

Apply

10 - 15 years

30 - 35 Lacs

Noida

Remote


SR. DATA MODELER | FULL-TIME ROLE | REMOTE OR ONSITE

Job Summary:
We are seeking an experienced Data Modeler to support the Enterprise Data Platform (EDP) initiative, focusing on building and optimizing curated data assets on Google BigQuery. This role requires expertise in data modeling, strong knowledge of retail data, and the ability to collaborate with data engineers, business analysts, and architects to create scalable, high-performing data structures.

Required Qualifications:
- 5+ years of experience in data modeling and architecture on cloud data platforms (BigQuery preferred).
- Expertise in dimensional modeling (Kimball), data vault, and normalization/denormalization techniques.
- Strong SQL skills, with hands-on experience in BigQuery performance tuning (partitioning, clustering, query optimization).
- Understanding of retail data models (e.g., sales, inventory, pricing, supply chain, customer analytics).
- Experience working with data engineering teams to implement models in ETL/ELT pipelines.
- Familiarity with data governance, metadata management, and data cataloging.
- Excellent communication skills and the ability to translate business needs into structured data models.

Key Responsibilities:
1. Data Modeling & Curated Layer Design
- Design logical, conceptual, and physical data models for the EDP's curated layer in BigQuery.
- Develop fact and dimension tables, ensuring adherence to dimensional modeling best practices (Kimball methodology).
- Optimize data models for performance, scalability, and query efficiency in a cloud-native environment.
- Work closely with data engineers to translate models into efficient BigQuery implementations (partitioning, clustering, materialized views).
2. Data Standardization & Governance
- Define and maintain data definitions, relationships, and business rules for curated assets.
- Ensure data integrity, consistency, and governance across datasets.
- Work with Data Governance teams to align models with enterprise data standards and metadata management policies.
3. Collaboration with Business & Technical Teams
- Engage with business analysts and product teams to understand data needs, ensuring models align with business requirements.
- Partner with data engineers and architects to implement best practices for data ingestion and transformation.
- Support BI & analytics teams by ensuring curated models are optimized for downstream consumption (e.g., Looker, Tableau, Power BI, AI/ML models, APIs).

If you are interested in the opportunity, please share the following details along with your most recent resume to geeta.negi@compunnel.com:
- Total experience
- Relevant experience
- Current CTC
- Expected CTC
- Notice period (last working day if you are serving the notice period)
- Current location
- Skill 1 (mention the skill): rating out of 5
- Skill 2 (mention the skill): rating out of 5
- Skill 3 (mention the skill): rating out of 5
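The fact/dimension design this posting centers on can be sketched as a tiny star schema. This is an illustration only, not part of the listing: it uses Python's sqlite3 instead of BigQuery (where partitioning and clustering would additionally be declared in the table DDL), and all table and column names are hypothetical retail examples.

```python
import sqlite3

# Illustrative star schema: one fact table referencing two dimensions,
# in the Kimball style the posting describes. Names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_sk INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_sk INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales  (
    product_sk INTEGER REFERENCES dim_product(product_sk),
    date_sk    INTEGER REFERENCES dim_date(date_sk),
    qty        INTEGER,
    revenue    REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (10, '2024-01-01');
INSERT INTO fact_sales  VALUES (1, 10, 3, 30.0), (2, 10, 1, 99.0), (1, 10, 2, 20.0);
""")

# Typical curated-layer rollup: revenue by product, resolved via the dimension.
rows = cur.execute("""
    SELECT p.name, SUM(f.qty) AS units, SUM(f.revenue) AS revenue
    FROM fact_sales f JOIN dim_product p USING (product_sk)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 1, 99.0), ('Widget', 5, 50.0)]
```

In BigQuery the same fact table would typically be partitioned by the date column and clustered by high-cardinality join keys to cut scan cost.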

Posted 1 month ago

Apply

6 - 11 years

14 - 24 Lacs

Bengaluru

Remote


Role & Responsibilities

MUST-HAVE knowledge:
- Data warehousing (Redshift, Azure Synapse)
- Databases: Oracle / SQL Server / etc.
- Data modeling (Erwin/DBVm)
- Data lake
- Data mart
- SQL
Good to have: NoSQL (Cosmos / Dynamo / Mongo)

JD for Data Manager
The data modeler designs, implements, and documents data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

The successful candidate will:
- Be responsible for the development of conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices.
The candidate must be able to work independently and collaboratively.

Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.

Skills:
- Bachelor's or master's degree in computer/data science, or related technical experience.
- 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
- Experience in team management, communication, and presentation.
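The relational-versus-NoSQL modeling trade-off this posting spans can be shown with one record modeled both ways. This is an illustration only, not part of the listing, and all names are hypothetical.

```python
# The same order modeled relationally (normalized) and as a document
# (denormalized, NoSQL style). Names are hypothetical.

# Relational/normalized: separate rows linked by keys, no duplication.
customers = {101: {"name": "Acme"}}
orders = [{"order_id": 1, "customer_id": 101, "total": 250.0}]

# Document/denormalized: the customer is embedded in the order, duplicating
# data but letting a single read serve the whole document.
order_doc = {
    "order_id": 1,
    "customer": {"id": 101, "name": customers[101]["name"]},
    "total": 250.0,
}

# The relational form needs a join-like lookup; the document form does not.
joined_name = customers[orders[0]["customer_id"]]["name"]
print(joined_name)  # Acme
```

A data modeler's job is largely choosing between these shapes per workload: normalized for write-heavy consistency, denormalized for read-heavy access paths.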

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies