
185 MDX Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

Source: Naukri

Job Description: Develop and maintain business intelligence data visualizations using Microsoft Power BI. The core responsibilities are to develop high-performance data visualizations, reports, and dashboards, and to provide ad-hoc reporting. Demonstrate a proven ability to work in a fast-paced, agile environment. Be comfortable with initial ambiguity and offer solutions to lead a client down the right path. Support rapid prototyping and coordinate business requests and specifications with a Business Analytics Designer and customers. Validate that visualizations perform as expected in an operational setting. Ensure developed visualizations are high quality and comply with standards around font, color, whitespace, and other branding elements. Develop a semantic-layer organization that will be visible to end users. A successful candidate will have the following characteristics: proficiency with all areas of Power BI, Tableau, SSRS, and MSSQL; a deep understanding of relational models, tabular models, and multidimensional models; the ability to enhance models as needed for custom visualization requests; and strong SQL work experience and understanding (must have).
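For context, a minimal sketch of the kind of ad-hoc SQL a Power BI report in a role like this might sit on; the server, database, and star-schema names (FactSales, DimDate) are hypothetical assumptions, not the employer's actual schema.

```python
# Minimal sketch: pull a monthly sales aggregate that a Power BI visual could consume.
# The connection string, schema, and column names are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)

query = """
SELECT d.CalendarYear, d.MonthNumber, SUM(f.SalesAmount) AS TotalSales
FROM dbo.FactSales AS f
JOIN dbo.DimDate   AS d ON f.DateKey = d.DateKey
WHERE d.CalendarYear = ?
GROUP BY d.CalendarYear, d.MonthNumber
ORDER BY d.MonthNumber;
"""

for row in conn.cursor().execute(query, 2024):
    print(row.CalendarYear, row.MonthNumber, row.TotalSales)
```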

Posted 1 day ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Kochi

Work from Office

Source: Naukri

Your Role and Responsibilities: You, the ideal candidate, are expected to have strong technical, critical-thinking, and communication skills. You are creative and are not afraid of bringing forward ideas and running with them. If you are already product focused, are excited about new technological developments that will help users do better in solving their problems, and enjoy and appreciate teamwork with people across the globe, then you will be at home with our team. As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you’ll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: 7+ years of developing high-performance, highly scalable C/C++ applications; multi-threaded programming; high-performance data structures and algorithms; experience developing and debugging software across multiple platforms, including Microsoft Windows and Linux; experience with Agile software development.

Preferred technical and professional experience: degree in Computer Science, Engineering, or equivalent professional experience. In addition to the required skills, knowledge of MDX, OLAP technologies, and multidimensional modeling is a plus.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

Job requisition ID: 85111. Date: Jun 26, 2025. Location: Delhi. Designation: Senior Consultant. Entity: Deloitte Touche Tohmatsu India LLP.

Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The team: Enterprise technology has to do much more than keep the wheels turning; it is the engine that drives functional excellence and the enabler of innovation and long-term growth. Learn more about ET&P.

Your work profile - About the job (Functional Consultant): The role requires deep analytical, technical, and complex problem-solving skills, with knowledge of optimization methods, financial computations, statistical analysis, and advanced mathematical modeling techniques. It also requires exposure to data science and associated software and programming technologies. Additionally, this role requires exposure to advanced planning systems and tools. This position is responsible for solution delivery, analysis, and interpretation of data to assist management and leadership in rapid decision making and in optimizing their supply chain to better serve their customers and shareholders. The solution delivery happens through a variety of activities, including process design, data analytics, solution configuration and deployment, setting up performance metrics and new policies, testing, and knowledge management. The role is responsible for participating in requirement and design sessions with the customer; analyzing areas of improvement opportunities; collecting and analyzing data to provide decision-support information; preparing the business impact case study; working with onshore and offshore teams to configure the solution; and finally creating the test cases to ensure the solution works as per the design.

What you’ll do for us: The responsibilities include end-to-end solution design, configuration, implementation, data analytics, testing of the solution, and communication with internal and external stakeholders.

Design: Participate in process and business requirements sessions with the client and document the to-be business process, leveraging industry best practices. Work with the client to identify and collect data, such as historical sales, shipment, inventory, logistics, and other operations / supply chain data, from sources like databases, Excel sheets, emails, and others. Ability to convert business logic to technical platform design, including knowledge of platform infrastructure.

Configuration: Work closely with architects and directors to develop clear functional and technical designs, document data requirements, and build complex datasets. Produce a technical specification document and configure the o9 platform, as per the design, to solve deep operations / supply chain problems and institute rigorous performance monitoring systems.

Data Analytics: Use mathematical models, predictive methods, statistical techniques, optimization algorithms, and simulations to analyze, manipulate, and interpret large enterprise data and provide business insights and data visualization to the client management.
Be proficient in statistical and optimization tools and programming languages to conduct data integration through extraction, transformation, and loading (ETL) and create models to generate time-series forecasts and operational plans.

Testing: Work with internal Research and Development teams to resolve solution gaps and deploy fixes in the customer environment. Create and execute workflow and data-analytics test cases, document issues, and track progress at resolving issues. Ability to design and implement a testing protocol, with support from junior analysts, with the end goal of automating testing.

Communication: Work with the client, cross-functional teams, and IT and business stakeholders to ensure successful planning and execution of the project. Plan, develop, and deliver Super User and End User training for a global user base. Mentor junior analysts to familiarize them with the technical and business aspects of a project.

What you’ll have... Education: Bachelor's or Master’s degree in Operations Research, Industrial Engineering, Engineering Management, Business Analytics, Computer Science, or related fields with a concentration in operations or analytics. Experience: 3 years of experience presenting on complex topics in a clear, concise, and easily understood manner. Prior experience in planning systems and exposure to ERP tools is preferred. Firsthand experience leading a team through the full lifecycle of a supply chain planning solution implementation, including business requirement gathering, solution design & development, UAT/SIT, go-live/cutover, and value realization, is preferred. Experience using agile methodology to deliver large-scale enterprise implementations is preferred.

Skills and Abilities: Statistical, optimization, and simulation skills using software tools and packages like R, SAS, or similar are required. Knowledge of spreadsheet software (Microsoft Excel, Google Sheets), document processing (Microsoft Word, Google Docs), and presentation tools (Microsoft PowerPoint) is required. Knowledge of and training in databases (SQL Server, MySQL) and skills in one or more languages like SQL, MDX, T-SQL, or similar are preferred. Strong analytical techniques, data-mining knowledge, and proficiency in handling and processing large amounts of data. Ability to identify key insights and apply critical thinking to prioritize and focus on the highest-value opportunities or the biggest risks. Coursework and a strong background in mathematics and statistics. Strong verbal, written, presentation, and demonstration / training skills are required. Ability to communicate mathematical, technical, or software-usage concepts to audiences with limited prior mathematics, technical, or software background. Ability to work in teams distributed across locations and time zones, and at executive and junior levels in a corporate hierarchy. English: business communication level.

How you’ll grow - Connect for impact: Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams, and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude, and potential each and every one of us brings to the table to make an impact that matters.

Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one-size-fits-all career path, and global, cross-business mobility and up-/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us: Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help you with your interview, we suggest that you do your research and know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.
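As a rough, self-contained illustration of the time-series forecasting step this role mentions (a real o9 implementation would use the platform's own forecasting engines), here is a toy moving-average forecast over a made-up monthly demand history:

```python
# Toy example: 3-month moving-average forecast over a hypothetical monthly demand series.
import pandas as pd

history = pd.Series(
    [120, 135, 128, 150, 162, 158, 171, 180],
    index=pd.period_range("2024-01", periods=8, freq="M"),
    name="units",
)

window = 3
smoothed = history.rolling(window).mean()        # in-sample moving average
forecast_next = history.tail(window).mean()      # naive one-step-ahead forecast

print(smoothed)
print(f"Forecast for {history.index[-1] + 1}: {forecast_next:.1f} units")
```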

Posted 1 day ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description The Data Engineer will own the data infrastructure for the Reverse Logistics Team which includes collaboration with software development teams to build the data infrastructure and maintain a highly scalable, reliable and efficient data system to support the fast growing business. You will work with analytic tools, can write excellent SQL scripts, optimize performance of SQL queries and can partner with internal customers to answer key business questions. We look for candidates who are self-motivated, flexible, hardworking and who like to have fun. About The Team Reverse Logistics team at Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of Reverse Logistics platform. As a member of this team, your mission will be to design, develop, document and support massively scalable, distributed data warehousing, querying and reporting system. Basic Qualifications 2+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) Knowledge of AWS Infrastructure Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets Strong analytical and problem solving skills. Curious, self-motivated & a self-starter with a ‘can do attitude’. Comfortable working in fast paced dynamic environment Preferred Qualifications Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics Proven track record of strong interpersonal and communication (verbal and written) skills. Experience developing insights across various areas of customer-related data: financial, product, and marketing Proven problem solving skills, attention to detail, and exceptional organizational skills Ability to deal with ambiguity and competing objectives in a fast paced environment Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing and operations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A3001619
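To make the ETL-pipeline requirement concrete, here is a minimal, hedged sketch of an extract-transform-load step; the CSV file, column names, and the SQLite target are stand-ins for whatever sources and warehouse the real pipeline uses.

```python
# Minimal ETL sketch: extract from a CSV drop, apply a light transform, load into a table.
# File name, columns, and the sqlite target are illustrative assumptions.
import csv
import sqlite3

conn = sqlite3.connect("reverse_logistics.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS returns_fact (order_id TEXT, return_reason TEXT, qty INTEGER)"
)

with open("daily_returns.csv", newline="") as f:
    rows = [
        (r["order_id"], r["return_reason"].strip().lower(), int(r["qty"]))
        for r in csv.DictReader(f)
        if r["qty"].isdigit()  # simple data-quality gate: skip malformed quantities
    ]

conn.executemany("INSERT INTO returns_fact VALUES (?, ?, ?)", rows)
conn.commit()
print(conn.execute(
    "SELECT return_reason, SUM(qty) FROM returns_fact GROUP BY return_reason"
).fetchall())
```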

Posted 1 day ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Description Amazon strives to be the world's most customer-centric company, where customers can research and purchase anything they might want online. We set big goals and are looking for people who can help us reach and exceed them. The CPT Data Engineering & Analytics (DEA) team builds and maintains critical data infrastructure that enhances seller experience and protects the privacy of Amazon business partners throughout their lifecycle. We are looking for a strong Data Engineer to join our team. The Data Engineer I will work with well-defined requirements to develop and maintain data pipelines that help internal teams gather required insights for business decisions timely and accurately. You will collaborate with a team of Data Scientists, Business Analysts and other Engineers to build solutions that reduce investigation defects and assess the health of our Operations business while ensuring data quality and regulatory compliance. The ideal candidate must be passionate about building reliable data infrastructure, detail-oriented, and driven to help protect Amazon's customers and business partners. They will be an individual contributor who works effectively with guidance from senior team members to successfully implement data solutions. The candidate must be proficient in SQL and at least one scripting language (e.g. Python, Perl, Scala), with strong understanding of data management fundamentals and distributed systems concepts Key job responsibilities Build and optimize physical data models and data pipelines for simple datasets Write secure, stable, testable, maintainable code with minimal defects Troubleshoot existing datasets and maintain data quality Participate in team design, scoping, and prioritization discussions Document solutions to ensure ease of use and maintainability Handle data in accordance with Amazon policies and security requirements Basic Qualifications 1+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) Preferred Qualifications Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like, Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - BLR 14 SEZ Job ID: A3018752
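Since the role stresses maintaining data quality and troubleshooting existing datasets, here is a small self-contained sketch of the kind of lightweight checks a pipeline might run before publishing a table; the table and column names (seller_daily, seller_id, snapshot_date) are hypothetical.

```python
# Sketch of lightweight data-quality checks run against an in-memory example table.
# Table and column names are assumptions, not the team's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE seller_daily (seller_id TEXT, snapshot_date TEXT, gms REAL);
INSERT INTO seller_daily VALUES ('S1', '2025-06-01', 120.0), ('S2', '2025-06-01', 80.5);
""")

checks = {
    "row_count_nonzero": "SELECT COUNT(*) > 0 FROM seller_daily",
    "no_null_seller_id": "SELECT COUNT(*) = 0 FROM seller_daily WHERE seller_id IS NULL",
    "no_duplicate_keys": (
        "SELECT COUNT(*) = 0 FROM (SELECT seller_id, snapshot_date FROM seller_daily "
        "GROUP BY seller_id, snapshot_date HAVING COUNT(*) > 1)"
    ),
}

failed = [name for name, sql in checks.items() if conn.execute(sql).fetchone()[0] != 1]
if failed:
    print("FAILED:", failed)
else:
    print("All checks passed.")
```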

Posted 2 days ago

Apply

7.0 - 12.0 years

13 - 17 Lacs

Gurugram

Work from Office

Source: Naukri

Project description: We are looking for experienced BI Developers with strong expertise in SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) to join our team. You will work on developing, implementing, and optimizing Business Intelligence solutions for our clients, contributing to data integration, analysis, and reporting requirements. The role demands a deep understanding of database management, data warehousing concepts, and hands-on experience with the Microsoft BI stack.

Responsibilities: Provide technical expertise to the team. Deliver results based on requirements set by the team lead. Provide oversight for the quality of the application and artifacts related to application analysis and development. Collaborate with the production support team to implement sound change planning and change management. Be proactive in uncovering risk; manage and drive risks and issues with an understanding of business impact. Strive for best practice in design and development. Contribute to the technology strategic agenda by driving application architecture simplicity and by minimizing production incidents through high-quality development and processes. Be an active participant in regular task elaboration, prioritization, and estimation workshops. Ensure documentation is up to date and relevant. Build effective relationships with key business stakeholders.

Overall: Excellent people and communication skills. An excellent team player, driven to succeed. Establish trust and credibility with stakeholders. Embrace and be a role model for Enterprise Behaviours @ NAB. Comfortable assuming ownership for their work and working independently with minimal supervision. Experience with requirements gathering as part of a team using agile development methods, e.g. user stories. Confident working through various aspects of technology solutions (e.g. user interfaces, databases, system integration) and dealing with technology specialists. Able to quickly build an understanding of the business environment, operating model, and terminology. Solid understanding of and experience in business process analysis, as-is and to-be process design, and process decomposition. Able to apply varying techniques to model and communicate. Strong problem-solving and troubleshooting skills. Focus on attention to detail. Ensure relevant standards are applied to requirements deliverables. Display professional and ethical behavior in your actions by ensuring compliance with external legislation, bank standards, and internal operating policies and procedures relevant to the position.

Skills - Must have: 7+ years of hands-on experience in BI development with a focus on SSIS and SSAS. Familiarity with all aspects of the SDLC. Detailed experience with SQL Server, Analysis Services, Integration Services, Reporting Services (SSRS and Power BI), and MDX queries for cubes. Experience with SSAS multi-cube solutions. Excellent system design skills in a SQL Server Business Intelligence environment. Experienced with source control (Git, Jenkins).

Domain knowledge: Knowledge of banking and markets / treasury products is highly desirable. Ability to handle the complexity and dynamic nature of the financial services environment, which requires applications to adapt, and to be flexible and learn quickly in a complex environment.

Nice to have: Experience with other BI tools such as Power BI or Tableau. Knowledge of data warehousing concepts and technologies (e.g., Azure Data Factory, Snowflake, or Google BigQuery).
Familiarity with Agile methodologies and DevOps practices for CI/CD in BI development. Knowledge of MDX (Multidimensional Expressions) and DAX (Data Analysis Expressions). Experience in automating and scheduling jobs using SQL Server Agent or third-party tools. Exposure to cloud-based BI solutions like Azure Synapse Analytics or AWS Redshift. Understanding of financial data and reporting requirements.

Other languages: English, B2 Upper Intermediate. Seniority: Senior.
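For illustration, the kind of MDX query for cubes that this role lists is sketched below as a plain string; the cube, dimensions, and members are hypothetical, and in practice the query would be run from SSMS, an XMLA endpoint, or an ADOMD client rather than from Python.

```python
# Illustrative MDX against a hypothetical SSAS Sales cube: top 10 products by 2024 sales.
# Kept as a string here; execution would go through an OLAP client, not this script.
MDX_TOP_PRODUCTS = """
SELECT
    { [Measures].[Sales Amount], [Measures].[Order Count] } ON COLUMNS,
    TOPCOUNT(
        [Product].[Product Name].[Product Name].MEMBERS,
        10,
        [Measures].[Sales Amount]
    ) ON ROWS
FROM [Sales]
WHERE ( [Date].[Calendar Year].&[2024] )
"""

print(MDX_TOP_PRODUCTS)
```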

Posted 2 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Pune

Work from Office

Source: Naukri

5+ years of TM1 development experience. Expertise in developing end-to-end solutions from the ground up in TM1. Capability to develop TM1 objects (cubes, dimensions, business rules) without using wizards. Experience with TM1 performance optimisation (feeders and skip checks). Expert in IBM Planning Analytics 2.0. Strong knowledge of MDX, SQL, Excel, Visual Basic, and relational databases. Proficiency with TM1 security functionality and able to design a security layer for the data model. Must have good verbal communication skills. Ability to work across geographies. Can work independently without supervision. A self-motivated, confident team player who leads by example and provides guidance to others. DevOps and Agile engineering practitioner; test-driven development. Excellent communication skills and a team player.
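A hedged sketch of how a TM1 developer might run MDX against a Planning Analytics cube from Python using the open-source TM1py client; the server details, cube, and dimension names are assumptions, and the exact TM1py calls should be verified against the client version in use.

```python
# Hedged sketch: execute MDX against a hypothetical TM1 'Finance' cube via TM1py.
# Credentials, cube, and dimension names are placeholders; verify the API for your TM1py version.
from TM1py import TM1Service

mdx = """
SELECT
    NON EMPTY { [Version].[Actual], [Version].[Budget] } ON COLUMNS,
    NON EMPTY { TM1SubsetAll([Account]) } ON ROWS
FROM [Finance]
WHERE ( [Year].[2025], [Measure].[Amount] )
"""

tm1 = TM1Service(address="tm1.example.com", port=12354,
                 user="admin", password="secret", ssl=True)
try:
    cellset = tm1.cubes.cells.execute_mdx(mdx)   # maps element tuples to cell dicts
    for coordinates, cell in cellset.items():
        print(coordinates, cell["Value"])
finally:
    tm1.logout()
```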

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Mumbai, Hyderabad, Gurugram

Work from Office

Source: Naukri

Responsibilities: Contribute in design and development of Oracle EPM applications, including modules such as Planning and Budgeting, Financial Consolidation and Close, Profitability and Cost Management, and Strategic Modelling. Collaborate with cross-functional teams, business stakeholders, and solution architects to gather requirements, define technical solutions, and provide guidance on EPM best practices. Architect scalable and high-performance Essbase BSO (Block Storage) and ASO (Aggregate Storage) cubes, optimizing outline design, calculation scripts, and report scripts for efficient data aggregation and analysis. Leverage advanced calculation scripts, business rules, and integration techniques to enhance system functionalities and meet complex business requirements. Develop custom scripts and extensions using languages like MaxL, MDX, Java, or Groovy to automate tasks, enhance data integration, and extend EPM application capabilities. Design and implement complex financial models, frameworks, and planning forms using Oracle EPM tools to enable accurate and efficient data entry, consolidation, and reporting. Lead code reviews, provide technical guidance, and mentor junior developers to ensure adherence to coding standards, enhance code quality, and foster professional growth. Troubleshoot and resolve complex technical issues, utilizing advanced debugging techniques and leveraging in-depth knowledge of the EPM platform. Keep abreast of the latest trends and advancements in the Oracle EPM ecosystem, actively incorporating industry best practices into development approaches. Collaborate with business users to provide support, guidance, and training on Oracle EPM applications, ensuring effective utilization and user satisfaction. Drive continuous improvement in EPM methodologies, processes, and technical frameworks to optimize system performance, data integrity, and user experience. Collaborate with stakeholders across departments, including business analysts, IT teams, and senior management, to understand their needs, provide guidance, and ensure project success. Mentor and train junior developers, promoting knowledge sharing, providing guidance, and fostering team growth. Effectively communicate with project stakeholders, providing technical recommendations, progress updates, and presenting complex concepts in a clear and concise manner. Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field. 5-10 years of hands-on experience in Oracle EPM development and implementation. Strong expertise in Oracle EPM applications, including Planning, Essbase, Financial Consolidation and Close, Profitability and Cost Management, and Strategic Modelling. Participated in large-scale and complex implementations of various Oracle EPM modules, including Planning and Budgeting Extensive experience in designing, and optimizing Essbase cubes, calculation scripts, report scripts, and outline management. Expert in scripting languages such as MaxL, MDX, Java, or Groovy to develop custom functionalities, automate tasks, and extend EPM capabilities. In-depth knowledge of financial modelling, planning, budgeting, and consolidation processes. Strong problem-solving, analytical, and troubleshooting skills to resolve complex technical issues. Excellent leadership, communication, and collaboration skills to effectively interact with stakeholders at all levels. Proven experience leading development projects, conducting code reviews, and mentoring junior team members. 
Ability to work independently, manage multiple priorities, and deliver high-quality solutions within project timelines. Strong commitment to staying updated on the latest trends, best practices, and emerging technologies in the Oracle EPM ecosystem. Location: Remote; Chennai, TN; Gurugram, HR; Hyderabad, TS; Mumbai, MH; New Delhi, DL; Pune, MH.

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description The Data Engineer will own the data infrastructure for the Reverse Logistics Team which includes collaboration with software development teams to build the data infrastructure and maintain a highly scalable, reliable and efficient data system to support the fast growing business. You will work with analytic tools, can write excellent SQL scripts, optimize performance of SQL queries and can partner with internal customers to answer key business questions. We look for candidates who are self-motivated, flexible, hardworking and who like to have fun. About The Team Reverse Logistics team at Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of Reverse Logistics platform. As a member of this team, your mission will be to design, develop, document and support massively scalable, distributed data warehousing, querying and reporting system. Basic Qualifications 2+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) Knowledge of AWS Infrastructure Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets Strong analytical and problem solving skills. Curious, self-motivated & a self-starter with a ‘can do attitude’. Comfortable working in fast paced dynamic environment Preferred Qualifications Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics Proven track record of strong interpersonal and communication (verbal and written) skills. Experience developing insights across various areas of customer-related data: financial, product, and marketing Proven problem solving skills, attention to detail, and exceptional organizational skills Ability to deal with ambiguity and competing objectives in a fast paced environment Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing and operations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A2998295

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description The Data Engineer will own the data infrastructure for the Reverse Logistics Team which includes collaboration with software development teams to build the data infrastructure and maintain a highly scalable, reliable and efficient data system to support the fast growing business. You will work with analytic tools, can write excellent SQL scripts, optimize performance of SQL queries and can partner with internal customers to answer key business questions. We look for candidates who are self-motivated, flexible, hardworking and who like to have fun. About The Team Reverse Logistics team at Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of Reverse Logistics platform. As a member of this team, your mission will be to design, develop, document and support massively scalable, distributed data warehousing, querying and reporting system. Basic Qualifications 2+ years of data engineering experience Experience with data modeling, warehousing and building ETL pipelines Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) Experience with one or more scripting language (e.g., Python, KornShell) Knowledge of AWS Infrastructure Knowledge of writing and optimizing SQL queries in a business environment with large-scale, complex datasets Strong analytical and problem solving skills. Curious, self-motivated & a self-starter with a ‘can do attitude’. Comfortable working in fast paced dynamic environment Preferred Qualifications Bachelor's degree in a quantitative/technical field such as computer science, engineering, statistics Proven track record of strong interpersonal and communication (verbal and written) skills. Experience developing insights across various areas of customer-related data: financial, product, and marketing Proven problem solving skills, attention to detail, and exceptional organizational skills Ability to deal with ambiguity and competing objectives in a fast paced environment Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing and operations Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A2998296

Posted 3 days ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Hyderabad, Pune

Work from Office

Source: Naukri

Sr Power BI Developer. Overall experience: 5+ years of experience in the MSBI product suite (Power BI and DAX) and 5+ years in data preparation. Has worked on BI projects to understand business requirements in a BI context and understand the data model to transform raw data into meaningful insights in Power BI, and has worked in SSIS. Experience in requirement analysis, design, and prototyping. Experience in building enterprise models using Power BI Desktop. Strong understanding of the Power BI application. Architect and develop data models, OLAP cubes, and reports utilizing Power BI, applying best practices to the development lifecycle. Documentation of source-to-target mappings, data dictionaries, and database design. Identify areas of improvement to optimize data flows. Good exposure to DAX queries in Power BI Desktop. Creation of Power BI dashboards, reports, and KPI scorecards; transforming manual reports; supporting Power BI dashboard deployment. Strong exposure to visualization, transformation, data analysis, and formatting skills. Connecting to data sources, importing data, and transforming data for business intelligence. Experience in publishing and scheduling Power BI reports. Installation and administration of Microsoft SQL Server. Support business development efforts (proposals and client presentations). Knowledge of EBS modules like Finance, HCM, and Procurement will be an added advantage. Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success. Excellent leadership and interpersonal skills. Eager to contribute in a team-oriented environment. Strong prioritization and multi-tasking skills with a track record of meeting deadlines. Ability to be creative and analytical in a problem-solving environment. Effective verbal and written communication skills. Adaptable to new environments, people, technologies, and processes. Ability to manage ambiguity and solve undefined problems.
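To ground the DAX requirement, here are a few generic measure definitions of the kind this role would write, held as plain strings; the Sales and Date tables are hypothetical, and the definitions would normally be authored directly in Power BI Desktop.

```python
# Generic DAX measures as strings; the 'Sales' and 'Date' tables are hypothetical examples.
DAX_MEASURES = {
    "Total Sales": "Total Sales = SUM ( Sales[SalesAmount] )",
    "Sales YTD": "Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )",
    "Sales YoY %": (
        "Sales YoY % =\n"
        "VAR PriorYear = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )\n"
        "RETURN DIVIDE ( [Total Sales] - PriorYear, PriorYear )"
    ),
}

for name, definition in DAX_MEASURES.items():
    print(f"-- {name}\n{definition}\n")
```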

Posted 4 days ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Introduction: A career in IBM Software means you’ll be part of a team that transforms our customers’ challenges into solutions. Seeking new possibilities and always staying curious, we are a team dedicated to creating the world’s leading AI-powered, cloud-native software solutions for our customers. Our renowned legacy creates endless global opportunities for our IBMers, so the door is always open for those who want to grow their career. We are seeking a skilled back-end developer to join our IBM Software team. As part of our team, you will be responsible for developing and maintaining high-quality software products, working with a variety of technologies and programming languages. IBM’s product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrive. IBM Planning Analytics® is an enterprise financial planning software platform used by a significant number of Global 500 companies. IBM Planning Analytics® provides a real-time approach to consolidating, viewing, and editing enormous volumes of multidimensional data. At the heart of the IBM Planning Analytics solution is TM1® Server, a patented, 64-bit, in-memory functional database server that can perform real-time complex calculations and aggregations over massive data spaces while allowing concurrent data editing. The IBM TM1 Server development team is a dynamic and forward-thinking team, and we are looking for a Senior Software Developer with significant experience in designing and developing enterprise-scale software products to join us.

Your Role and Responsibilities: You, the ideal candidate, are expected to have strong technical, critical-thinking, and communication skills. You are creative and are not afraid of bringing forward ideas and running with them. If you are already product focused, are excited about new technological developments that will help users do better in solving their problems, and enjoy and appreciate teamwork with people across the globe, then you will be at home with our team. As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you’ll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations.

Preferred education: Master's degree.

Required technical and professional expertise: 7+ years of developing high-performance, highly scalable C/C++ applications; multi-threaded programming; high-performance data structures and algorithms; experience developing and debugging software across multiple platforms, including Microsoft Windows and Linux; experience with Agile software development.

Preferred technical and professional experience: degree in Computer Science, Engineering, or equivalent professional experience. In addition to the required skills, knowledge of MDX, OLAP technologies, and multidimensional modeling is a plus.
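As a purely conceptual toy (not how the TM1 engine is implemented), the sketch below shows what multidimensional aggregation over an in-memory cube looks like: leaf cells keyed by dimension coordinates, rolled up along any subset of dimensions.

```python
# Toy multidimensional aggregation: cells keyed by (year, region, account), rolled up on demand.
# Conceptual illustration only; it does not reflect TM1 Server internals.
from collections import defaultdict

cells = {
    ("2024", "EMEA", "Revenue"): 500.0,
    ("2024", "EMEA", "Costs"):   320.0,
    ("2024", "APAC", "Revenue"): 410.0,
    ("2025", "EMEA", "Revenue"): 540.0,
}

def rollup(cells, keep):
    """Aggregate cell values, keeping only the dimension positions listed in `keep`
    (0 = year, 1 = region, 2 = account)."""
    out = defaultdict(float)
    for coords, value in cells.items():
        out[tuple(coords[i] for i in keep)] += value
    return dict(out)

print(rollup(cells, keep=(0, 2)))  # totals by year and account, across all regions
print(rollup(cells, keep=(2,)))    # grand totals by account
```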

Posted 4 days ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities:
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs.
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary (required) skills:
- SSAS Tabular & Multidimensional.
- SQL Server (advanced SQL, views, joins, indexes).
- DAX & MDX.
- Data modeling & OLAP concepts.

Secondary skills:
- ETL tools (SSIS or equivalent).
- Power BI or similar BI/reporting tools.
- Performance tuning & troubleshooting in SSAS and SQL.
- Version control (TFS/Git), deployment best practices.
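As a small, self-contained illustration of the star-schema modeling this role describes, the sketch below builds a toy fact table with two dimensions in SQLite and runs the kind of aggregate an SSAS model would serve; all table names, keys, and figures are made up.

```python
# Toy star schema (one fact, two dimensions) and a KPI-style aggregate query.
# Table names, keys, and values are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE DimProduct (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE DimDate    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE FactSales  (product_key INTEGER, date_key INTEGER, sales_amount REAL);

INSERT INTO DimProduct VALUES (1, 'Bikes'), (2, 'Accessories');
INSERT INTO DimDate    VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO FactSales  VALUES (1, 20240101, 1200.0), (2, 20240101, 85.0), (1, 20240201, 990.0);
""")

rows = conn.execute("""
    SELECT d.year, d.month, p.category, SUM(f.sales_amount) AS total_sales
    FROM FactSales f
    JOIN DimProduct p ON p.product_key = f.product_key
    JOIN DimDate    d ON d.date_key    = f.date_key
    GROUP BY d.year, d.month, p.category
    ORDER BY d.year, d.month
""").fetchall()

for row in rows:
    print(row)
```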

Posted 5 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Welcome to the AP Moller Maersk! AP Moller Maersk is a $81.5 billion global shipping & logistics leader. Maersk is a Danish business conglomerate founded in 1904, with activities in the transport and logistics and energy sectors. Maersk has been the largest container ship and supply vessel operator in the world since 1996. The company is based in Copenhagen, Denmark with subsidiaries and offices across 130 countries and around 110,000 employees. Maersk's Vision: Improving lives for all, by integrating the world. To know more about everything that Maersk does, visit us at www.maersk.com. Job Title –Financial Analyst Job Location - Pune Key Responsibilities Include The job of this role is designing, building, and deploying reports and dashboards using Microsoft Power BI/Report Server/Report Builder/Excel. Develop and maintain Financial and management reports. Stakeholder management. Build automated reports and dashboards with the help of Power BI and other reporting tools. Understand business requirements to set functional specifications for reporting applications. Be experienced in tools and systems on MS SQL Server, including SSRS and TSQL, Power Query, MDX, Power BI, and DAX Be able to quickly shape data into reporting and analytics solutions. Have knowledge of database fundamentals such as multidimensional database design, relational database design, and more. Ability to communicate with business as well as technical teams Ability to learn and quickly respond to rapidly changing business environment Be up to date about the best practices and advancements in development and design Documenting, designing, and modelling solutions and explaining, representing, and discussing the same with the team Applying experience and knowledge to future solution considerations Have an analytical and problem-solving mindset and approach Continuous improvement, self-motivated and eager to learn Team player and initiator Required Experience & Skills Minimum 3+ years of progressive experience and demonstrated growth in Financial Analyst/Analytics profile. Experience in Financial and Management reporting. Be experienced in tools and systems on MS SQL Server, including SSRS and TSQL, Power Query, MDX, PowerBI, and DAX An independent, self-motivated individual with a positive, service-oriented attitude Self-starter with a great work ethic and an analytical thinker with superior problem solving and decision-making skills, possessing the initiative to create presentations and analysis from scratch to answer questions posed and investigate the details and interpret the impact of key business drivers Extremely detail-oriented, organized, and keen to details Significant expertise in setting up automated processes to increase the effectiveness of the team and minimize room for error Strong communicator, both written & verbal; reliable and responsive to email and tele-communications Highest level of integrity and good judgment, with the ability to effectively deal with highly sensitive, confidential information Open to learn new skills/ technologies as may be required by business Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. 
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
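For illustration, a tiny sketch of the sort of management-reporting calculation described above (actual vs. budget variance by cost centre); the figures and cost-centre names are invented.

```python
# Toy actual-vs-budget variance report; all figures and cost centres are made up.
import pandas as pd

df = pd.DataFrame({
    "cost_centre": ["Ops", "Ops", "IT", "IT"],
    "month":       ["2025-01", "2025-02", "2025-01", "2025-02"],
    "actual":      [105.0, 98.0, 61.0, 72.0],
    "budget":      [100.0, 100.0, 65.0, 65.0],
})

report = df.groupby("cost_centre")[["actual", "budget"]].sum()
report["variance"] = report["actual"] - report["budget"]
report["variance_pct"] = (report["variance"] / report["budget"]).round(3)
print(report)
```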

Posted 5 days ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Experience: 6+ years. Location: Gurgaon (Hybrid). Budget: 15-18 LPA.

Roles and Responsibilities: Formulate automated reports and dashboards using Power BI and other reporting tools. Understand business requirements to set functional specifications for reporting applications. You should be familiar with the MS SQL Server BI stack; SSRS, T-SQL, Power Query, MDX, Power BI, and DAX are just a few of its tools and systems. Exhibit a foundational understanding of database concepts such as relational database architecture, multidimensional database design, and more. Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI. Develop technical specifications from business needs, and choose a deadline for work completion. Make charts and data documentation that includes descriptions of the techniques, parameters, models, and relationships. Develop in Power BI Desktop to create dashboards, KPI scorecards, and visual reports. Establish row-level security on data and comprehend Power BI's application security layer models. Examine, comprehend, and study business needs as they relate to business intelligence. Design and map data models to transform raw data into insightful information. Create dynamic and eye-catching dashboards and reports using Power BI. Make necessary tactical and technological adjustments to enhance current business intelligence systems. Integrate data, alter data, and connect to data sources for business intelligence.

Requirements and Skills: Extremely good communication skills are necessary to effectively explain the requirements between both internal teams and client teams. Exceptional analytical thinking skills for converting data into illuminating reports. BS in computer science or information systems, along with work experience in a related field. Knowledge of data warehousing, data gateway, and data preparation projects. Working knowledge of Power BI, SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence Stack. Articulating, representing, and analyzing solutions with the team while documenting, creating, and modeling them. Familiarity with the tools and technologies used by the Microsoft SQL Server BI Stack, including SSRS and T-SQL, Power Query, MDX, Power BI, and DAX. Knowledge of executing DAX queries on Power BI Desktop. Comprehensive understanding of data modeling, administration, and visualization. Capacity to perform in an atmosphere where agility and continual development are prioritized. Detailed knowledge and understanding of database management systems, OLAP, and the ETL (Extract, Transform, Load) framework. Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI). Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).

NOTE: Staffing & Recruitment Companies are advised not to contact us.

Posted 5 days ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Mumbai

Work from Office

Source: Naukri

We are seeking a skilled Business Intelligence Analyst to construct and uphold analytics and reporting solutions that convert data into actionable insights. The BI Analyst role is pivotal, involving the conversion of provided data into meaningful insights through user-friendly dashboards and reports. An ideal BI Analyst possesses proficiency in Business Intelligence tools and technology, overseeing the creation and administration of BI tools with comprehensive knowledge of the BI system. This role demands a grasp of business concepts, strong problem-solving abilities, and prior experience in data and business analysis. Analytical prowess and effective communication skills are highly valued attributes for this position.. The Day-to-day Responsibilities Include But Not Limited To. Recognize business requirements in the context of BI and create data models to transform raw data into relevant insights. Using Power BI, create dashboards and interactive visual reports. Define key performance indicators (KPIs) with specific objectives and track them regularly. Analyze data and display it in reports to aid decision-making. Convert business needs into technical specifications and establish a timeframe for job completion. Create, test, and deploy Power BI scripts, as well as execute efficient deep analysis. Use Power BI to run DAX queries and functions. Create charts and data documentation with explanations of algorithms, parameters, models, and relationships. Construct a data warehouse. Use SQL queries to get the best results. Make technological adjustments to current BI systems to improve their performance. For a better understanding of the data, use filters and visualizations. Transform existing Non-Power BI Reports into dashboards. Experience with custom/ third party visuals. Essential Traits. Minimum level of education required is BA/BS degree in computer science or other relevant educational or work experience; advanced degree is a plus. Background with BI tools and systems especially Power BI. Excellent Knowledge & hands on experience VBA (Visual Basic for Applications), SQL & Advance excel are required. Graduate with 2-5 years’ experience in Power BI, Advance Excel, VBA & SQL. Prior experience in data-related tasks. Understanding of the Microsoft BI Stack. Be familiar with MS SQL Server BI Stack tools and technologies, such as SSRS and TSQL, Power Query, MDX, Power BI, and DAX. Exposure in implementing row-level security and bookmarks.. Analytical thinking for converting data into relevant reports and graphics. Knowledge of Power BI application security layer models. Ability to run DAX queries on Power BI desktop. Proficient in doing advanced-level computations on the data set. Ensure data and insights generated are maintained at high quality standards to meet stakeholder expectations. Active learning and complex problem solving. Excellent communication skills are required to communicate needs with client and internal teams. Proven abilities to take initiative and be innovative. Analytical mind with a problem-solving aptitude. Translate business needs to technical specifications. Open for feedback and learning opportunities. Can work in metric driven system & work independently with onshore as per requirement. Preferred. Microsoft/ Any other BI Certified Data Analyst. About Kroll. In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. 
Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity—not just answers—in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One team, One Kroll, you’ll contribute to a supportive and collaborative work environment that empowers you to excel. Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight and independence to help our clients make sound decisions. As an organization, we think globally—and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. In order to be considered for a position, you must formally apply via careers.kroll.com.

Posted 1 week ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Date: 1 Jun 2025. Location: Bangalore, KA, IN. Company: Alstom. At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Could you be the full-time JIVS Expert in our IS&T/Processes Solutions Architecture team we're looking for?

Your future role: Take on a new challenge and apply your extensive expertise in Azure Blob Storage and JIVS technology in a new cutting-edge field. You'll work alongside innovative, collaborative, and solution-focused teammates. You'll lead the optimization of data management and migration strategies, ensuring seamless transitions of data and maintaining database integrity. Day-to-day, you'll work closely with teams across the business (such as Business Stakeholders, IT Infrastructure, and Business Solutions), collaborate with partners relevant to Archiving Projects Delivery, and much more. You'll specifically take care of developing and implementing JIVS solutions, monitoring and maintaining the performance of JIVS applications, and utilizing Azure Blob Storage for efficient data management. We'll look to you for: Designing and managing JIVS solutions that align with organizational goals. Collaborating with cross-functional teams to analyze system requirements. Ensuring the reliability and scalability of JIVS applications. Administering and maintaining database systems for high availability and security. Executing data migration projects with precision. Managing the decommissioning of applications, including data extraction and transfer. Creating and maintaining comprehensive documentation.

All about you: We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience with or understanding of JIVS implementations and management. Knowledge of Azure Blob Storage services and best practices. Familiarity with scripting languages and tools in JIVS and Azure environments. A certification in database technologies or cloud database solutions is a plus. Excellent problem-solving skills and collaborative teamwork abilities. Strong communication skills, both verbal and written.

Things you'll enjoy: Join us on a life-long transformative journey; the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also: Enjoy stability, challenges, and a long-term career free from boring daily routines. Work with new security standards for rail signalling. Collaborate with transverse teams and helpful colleagues. Contribute to innovative projects that shape the future of mobility. Utilise our flexible and dynamic working environment. Steer your career in whatever direction you choose across functions and countries. Benefit from our investment in your development, through award-winning learning opportunities. Progress towards leadership and specialized technical roles. Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension). You don't need to be a train enthusiast to thrive with us.
We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! Important to note: As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
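To make the Azure Blob Storage part of the role concrete, here is a hedged sketch of uploading a decommissioning archive with the azure-storage-blob SDK; the connection string, container, and file names are placeholders for whatever the real JIVS archiving flow produces.

```python
# Hedged sketch: stage an extracted archive in Azure Blob Storage with azure-storage-blob.
# Connection string, container, and file names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("decommissioned-app-archive")

with open("legacy_erp_export.zip", "rb") as data:
    container.upload_blob(name="2025/legacy_erp_export.zip", data=data, overwrite=True)

print([blob.name for blob in container.list_blobs(name_starts_with="2025/")])
```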

Posted 1 week ago

Apply

2.0 - 3.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

Total experience expected: 2-3 years. Looking for a OneStream consultant with hands-on implementation experience. Skills: Should be proficient in building workflows, dashboards, and cube views. Experience in financial consolidation, planning models, and system integrations is a strong plus. Excellent analytical, problem-solving, and communication skills. Ability to work independently and collaboratively in a fast-paced environment. Communication Skills, Workflow, Analytical & Problem Solving, Cube, Dashboards, Financial Consolidation

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Hyderābād

On-site

Source: Glassdoor

- 1+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting language (e.g., Python, KornShell) As a Data Engineer, You will be working on building and maintaining complex data pipelines, assemble large and complex datasets to generate business insights and to enable data driven decision making and support the rapidly growing and dynamic business demand for data. You will have an opportunity to collaborate and work with various teams of Business analysts, Managers, Software Dev Engineers, and Data Engineers to determine how best to design, implement and support solutions. You will be challenged and provided with tremendous growth opportunity in a customer facing, fast paced, agile environment. Key job responsibilities * Design, implement and support an analytical data platform solutions for data driven decisions and insights * Design data schema and operate internal data warehouses & SQL/NOSQL database systems * Work on different data model designs, architecture, implementation, discussions and optimizations * Interface with other teams to extract, transform, and load data from a wide variety of data sources using AWS big data technologies like EMR, RedShift, Elastic Search etc. * Work on different AWS technologies such as S3, RedShift, Lambda, Glue, etc.. and Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency * Work on data lake platform and different components in the data lake such as Hadoop, Amazon S3 etc. * Work on SQL technologies on Hadoop such as Spark, Hive, Impala etc.. * Help continually improve ongoing analysis processes, optimizing or simplifying self-service support for customers * Must possess strong verbal and written communication skills, be self-driven, and deliver high quality results in a fast-paced environment. * Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation. * Enjoy working closely with your peers in a group of talented engineers and gain knowledge. * Be enthusiastic about building deep domain knowledge on various Amazon’s business domains. * Own the development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Experience with big data technologies such as: Hadoop, Hive, Spark, EMR Experience with any ETL tool like, Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
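As a hedged sketch of the AWS loading pattern referenced above (stage a file in S3, then COPY it into Redshift), with bucket, table, and IAM role names as placeholders:

```python
# Hedged sketch: stage a file in S3 with boto3, then load it into Redshift via COPY.
# Bucket, key, table, and IAM role are placeholders; the COPY statement would run through
# a SQL client or the Redshift Data API rather than from this script.
import boto3

s3 = boto3.client("s3")
s3.upload_file("daily_orders.csv", "my-analytics-stage", "orders/2025-06-26/daily_orders.csv")

copy_sql = """
COPY analytics.orders
FROM 's3://my-analytics-stage/orders/2025-06-26/daily_orders.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV
IGNOREHEADER 1;
"""
print(copy_sql)
```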

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

SQL Server Analytics & Power BI Developer_Full-Time_Noida, Pune, Mumbai, Delhi, Chennai, Bangalore

Job Title: SQL Server Analytics & Power BI Developer
Location: Noida, Pune, Mumbai, Delhi, Chennai, Bangalore
Job Type: Full-Time
Experience: 5-10 Years
Skills: SQL Server Analysis Services (SSAS), Basic Power BI, Strong SQL Knowledge

Job Description: We are seeking an experienced Senior SQL Server Analytics & Power BI Developer to design and optimize high-performance data models and reporting solutions leveraging SQL Server Analysis Services (SSAS), Power BI, and advanced SQL techniques. The successful candidate will have expertise in multidimensional modeling, complex query optimization, and data visualization, along with a deep understanding of how to architect solutions that integrate with larger data ecosystems.

Key Responsibilities:
- Design and Implement SSAS Cubes: Architect, develop, and optimize OLAP cubes for reporting and analytics using SQL Server Analysis Services (SSAS), focusing on performance tuning, security, and usability.
- Advanced Data Modeling: Develop complex data models within SSAS, ensuring they are scalable and optimized for reporting efficiency. Employ best practices in dimensional modeling (star and snowflake schemas) to deliver high-performing datasets.
- Power BI Integration: Design Power BI reports with complex visualizations, integrating SSAS cubes and custom SQL queries for enhanced data insights. Implement advanced DAX calculations and Power Query transformations.
- ETL and Data Processing: Design, implement, and optimize ETL processes to populate SSAS models from heterogeneous data sources, ensuring data integrity, validation, and consistency.
- Performance Optimization: Apply advanced query tuning, indexing, partitioning, and caching techniques to optimize both SSAS cubes and SQL queries for fast, responsive reporting.
- Collaborate with Stakeholders: Work closely with business analysts, architects, and stakeholders to translate complex business requirements into scalable data solutions.

Qualifications:
- Extensive experience with SQL Server (T-SQL, SSIS, SSRS, SSAS).
- Proficiency in building and optimizing SSAS multidimensional models, including cube design, MDX, and query optimization.
- Expertise in Power BI with advanced DAX, Power Query, and report optimization techniques.
- Strong understanding of data warehousing concepts and performance tuning techniques.
- Hands-on experience with large-scale, enterprise-level data architectures.
- Solid knowledge of advanced SQL query writing and optimization strategies.
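Because the role centers on MDX against SSAS multidimensional models, here is a small illustration of the kind of query involved. The cube, dimension, and measure names are invented placeholders, and the snippet only composes and prints the MDX text; actually executing it would require an ADOMD/XMLA client, which is environment-specific.

```python
# Illustrative MDX query against a hypothetical SSAS cube; all object names are placeholders.
mdx_query = """
SELECT
    { [Measures].[Internet Sales Amount], [Measures].[Order Count] } ON COLUMNS,
    NON EMPTY [Date].[Calendar Year].[Calendar Year].MEMBERS ON ROWS
FROM [Sales]
WHERE ( [Product].[Category].&[Bikes] )
"""

if __name__ == "__main__":
    # Execution would go through an ADOMD/XMLA connector chosen for your environment;
    # here we only show the query text a report or cube view might send to the cube.
    print(mdx_query.strip())
```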

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Description

As a Data Engineer, you will build and maintain complex data pipelines and assemble large, complex datasets to generate business insights, enable data-driven decision making, and support the rapidly growing and dynamic business demand for data. You will have an opportunity to collaborate with teams of Business Analysts, Managers, Software Development Engineers, and Data Engineers to determine how best to design, implement, and support solutions. You will be challenged and given tremendous growth opportunity in a customer-facing, fast-paced, agile environment.

Key job responsibilities
- Design, implement, and support analytical data platform solutions for data-driven decisions and insights
- Design data schemas and operate internal data warehouses and SQL/NoSQL database systems
- Work on data model designs, architecture, implementation, discussions, and optimizations
- Interface with other teams to extract, transform, and load data from a wide variety of sources using AWS big data technologies such as EMR, Redshift, and Elasticsearch
- Work with AWS technologies such as S3, Redshift, Lambda, and Glue, and explore the latest AWS technologies to provide new capabilities and increase efficiency
- Work on the data lake platform and its components, such as Hadoop and Amazon S3
- Work with SQL technologies on Hadoop such as Spark, Hive, and Impala
- Help continually improve ongoing analysis processes, optimizing or simplifying self-service support for customers
- Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation
- Enjoy working closely with your peers in a group of talented engineers and gain knowledge
- Be enthusiastic about building deep domain knowledge of Amazon's various business domains
- Own the development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A3013333
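To make the Hadoop/Spark SQL part of this role concrete, below is a minimal sketch of querying a Hive-registered table with Spark SQL. The database, table, and column names are assumptions for illustration, not details from the posting.

```python
# Illustrative Spark SQL over a Hive-registered table; database/table/column names are assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("weekly-active-users")
    .enableHiveSupport()  # lets Spark SQL resolve tables registered in the Hive metastore
    .getOrCreate()
)

weekly = spark.sql("""
    SELECT date_trunc('week', event_time) AS week_start,
           region,
           COUNT(DISTINCT user_id)        AS active_users
    FROM analytics_db.user_events
    WHERE event_time >= date_sub(current_date(), 90)
    GROUP BY date_trunc('week', event_time), region
    ORDER BY week_start, region
""")

weekly.show(truncate=False)
```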

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world.

Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence.

We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started. We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, or related technical discipline, AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
- Experience in data integration or migrations or ELT or ETL tooling is mandatory.

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.NET, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles

Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
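The engine-layer work above touches data access technologies such as ODBC and query languages such as T-SQL. As a rough client-side illustration of that stack, here is a minimal Python sketch using pyodbc; the driver, connection string, table, and authentication details are placeholders to adapt to your environment, not anything specified in the posting.

```python
# Minimal ODBC + T-SQL sketch (pyodbc); server, database, credentials, and table are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=SalesDb;"
    "Encrypt=yes;"
    "UID=report_reader;PWD=<secret>;"  # assumption: SQL auth; use whatever your environment mandates
)

conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    # A hand-written T-SQL query; in an engine-layer component this text would more likely
    # be generated from a higher-level query representation than typed by hand.
    cursor.execute(
        "SELECT TOP (10) CustomerId, SUM(Amount) AS Total "
        "FROM dbo.Orders GROUP BY CustomerId ORDER BY Total DESC"
    )
    for row in cursor.fetchall():
        print(row.CustomerId, row.Total)
finally:
    conn.close()
```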

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world.

Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence.

We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started. We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, or related technical discipline, AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
- Experience in data integration or migrations or ELT or ETL tooling is mandatory.

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.NET, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles

Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
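One of the engine-layer responsibilities named above is query translation. As a toy illustration only — the mini-spec format and generated SQL below are invented for this sketch, not part of the posting or any Microsoft API — translating a small structured filter description into SQL text could look like this.

```python
# Toy query-translation sketch: render a tiny, invented filter spec into SQL text.
# Real engine-layer translators work from full query plans and handle quoting, dialects, and folding.
from typing import List, Tuple

def to_sql(table: str, columns: List[str], filters: List[Tuple[str, str, object]]) -> str:
    """Render a trivial projection/filter spec into a SELECT statement (no escaping handled)."""
    ops = {"eq": "=", "gt": ">", "lt": "<"}
    where = " AND ".join(f"{col} {ops[op]} {value!r}" for col, op, value in filters)
    sql = f"SELECT {', '.join(columns)} FROM {table}"
    return f"{sql} WHERE {where}" if where else sql

print(to_sql("dbo.Orders", ["OrderId", "Amount"], [("Status", "eq", "Shipped"), ("Amount", "gt", 100)]))
# -> SELECT OrderId, Amount FROM dbo.Orders WHERE Status = 'Shipped' AND Amount > 100
```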

Posted 1 week ago

Apply

4.0 - 7.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Are you passionate about transforming data into actionable insights? Join our team at Infineon Technologies as a Staff Engineer in Data Engineering & Analytics! In this role, you'll be at the forefront of harnessing the power of data to drive innovation and efficiency. Collaborate with experts, design robust data ecosystems, and support digitalization projects. If you have a strong background in data engineering, database concepts, and a flair for turning complex business needs into solutions, we want to hear from you. Elevate your career with us and be part of shaping the future!

Job Description
In your new role you will:
- Identify and understand the different needs and requirements of consumers and data providers (e.g. transaction processing, data warehousing, big data, AI/ML) and translate business digitalization needs into technical system requirements
- Team up with our domain, IT, and process experts to assess the status quo, capture the full value of our data, and derive target data ecosystems based on business needs
- Design, build, deploy, and maintain scalable and reliable data assets, pipelines, and architectures
- Team up with domain, IT, and process experts, and especially with your key users, to validate the effectiveness and efficiency of the designed data solutions and contribute to their continuous improvement and future-proofing
- Support data governance (data catalogue, data lineage, metadata, data quality, roles and responsibilities) and enable analytics use cases with a focus on data harmonization, connection, and visualization
- Drive and/or contribute to digitalization projects in cross-functional coordination with IT and business counterparts (e.g. data scientists, domain experts, process owners)
- Act as first point of contact for data solutions in the ATV QM organization to consult and guide stakeholders in leveraging the full value from data, and to cascade knowledge of industry trends and technology roadmaps for the major market players (guidelines, principles, frameworks, industry standards and best practice, upcoming innovation, new features and technologies)

Your Profile
You are best equipped for this task if you have:
- A degree in Information Technology, Business Informatics, Computer Science, or a related field of study
- At least 5 years of relevant work experience related to Data Engineering and/or Analytics with a strong data engineering focus
- Ability to translate complex business needs into concrete actions
- Excellent expertise in database concepts (e.g. DWH, Hadoop/Big Data, OLAP) and related query languages (e.g. SQL, Scala, Java, MDX)
- Expertise in data virtualization (e.g. Denodo)
- Working knowledge of the latest toolsets for data analytics, reporting, and data visualization (e.g. Tableau, SAP BO), as well as Python, R, and Spark, is a plus
- Ability to work both independently and within a team

#WeAreIn for driving decarbonization and digitalization. As a global leader in semiconductor solutions in power systems and IoT, Infineon enables game-changing solutions for green and efficient energy, clean and safe mobility, as well as smart and secure IoT. Together, we drive innovation and customer success, while caring for our people and empowering them to reach ambitious goals. Be a part of making life easier, safer and greener. Are you in?

We are on a journey to create the best Infineon for everyone. This means we embrace diversity and inclusion and welcome everyone for who they are. At Infineon, we offer a working environment characterized by trust, openness, respect and tolerance, and are committed to giving all applicants and employees equal opportunities. We base our recruiting decisions on the applicant's experience and skills. Please let your recruiter know if they need to pay special attention to something in order to enable your participation in the interview process. Click here for more information about Diversity & Inclusion at Infineon.
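Since the role stresses data quality as part of data governance and lists Python among the useful tools, here is a minimal, generic data-quality check sketch in pandas. The file name, columns, and rules are assumptions made up for illustration; they are not taken from the posting.

```python
# Minimal data-quality check sketch (pandas); file name, columns, and rules are illustrative assumptions.
import pandas as pd

df = pd.read_csv("lot_measurements.csv")  # hypothetical extract from an upstream measurement system

checks = {
    "no_duplicate_lot_ids": df["lot_id"].is_unique,
    "no_missing_yield":     df["yield_pct"].notna().all(),
    "yield_within_0_100":   df["yield_pct"].between(0, 100).all(),
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```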

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Indeed logo

Senior Software Engineer
Bangalore, Karnataka, India + 1 more location
Date posted: Jun 19, 2025
Job number: 1830832
Work site: Up to 50% work from home
Travel: None
Role type: Individual Contributor
Profession: Software Engineering
Discipline: Software Engineering
Employment type: Full-Time

Overview
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world.

Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence.

We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started. We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Qualifications
Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, or related technical discipline, AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.
- Experience in data integration or migrations or ELT or ETL tooling is mandatory.

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Preferred/Additional Qualifications
- BS degree in Computer Science
- Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.NET, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
- UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
- Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
- Full-stack role: a mix of the qualifications for the UX/service/backend roles

Equal Opportunity Employer (EOE)
#azdat #azuredata #microsoftfabric #dataintegration

Responsibilities
- Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
- Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
- UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
- Embody our culture and values

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies