
6 BI Solutions Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 - 4.0 years

3 - 5 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office

Source: Naukri

Dear Candidates,

We are pleased to announce a walk-in interview for the role of Business Intelligence Analyst (Goregaon West) at Rentokil PCI, a leading organization committed to delivering excellence. This is a great opportunity to join a dynamic team and grow your career in a fast-paced, technology-driven environment.

Walk-in Interview Details:
Position: Business Intelligence Analyst
Experience Required: 2 to 4 years of proven experience in a Business Intelligence Analyst role
Date: Monday, 16th June 2025
Time: 11:00 AM to 2:00 PM
Venue: Rentokil PCI Pest Control Pvt. Ltd., 3rd Floor, 'Narayani', Ambabai Temple Compound, Aarey Rd, near Bank of Maharashtra, Goregaon West, Mumbai, Maharashtra 400062

Important Information:
Candidates with strong English communication skills will be preferred, especially those currently based on the Western line of Mumbai.
A minimum of 2 to 4 years of experience as a Business Intelligence Analyst is required.
We are looking for immediate joiners or candidates with a short notice period.
Please carry your updated resume and attend the interview in formal attire.

About the Role:
The Business Intelligence Analyst is responsible for working within the BI team to deliver reporting and dashboard solutions that meet the needs of the organisation. The analyst must work well in a team setting and have excellent organisational, prioritisation, communication, and time management skills. The successful candidate will demonstrate accountability, flexibility, and adaptability to handle multiple and changing priorities, and will collaborate successfully with development teams, technology groups, consultants, and key stakeholders. The person will report to the Manager - Application Support and will work as part of a multi-functional team, collaborating with the internal team and external stakeholders.

Job Responsibilities:
Develop and manage BI solutions
Analyse business processes and requirements
Create and maintain documentation, including requirements, design, and user manuals
Conduct unit testing and troubleshooting
Develop and execute database queries and conduct analyses
Identify development needs in order to improve and streamline operations
Identify opportunities to improve processes and strategies with technology solutions

Key Result Areas:
Ensure quality and accuracy of data assets and analytic deliverables
Troubleshoot business intelligence modelling issues and develop solutions within the agreed timelines
Query resolution
Enhance application knowledge to implement new solutions
On-time deployment of different projects as per the business requirements
On-time creation and analysis of visualisations and reports

Competencies (skills essential to the role):
Strong analytical skills
Ability to solve practical problems and deal with a variety of concrete variables in situations where only limited standardisation exists
Ability to think logically and troubleshoot issues
Excellent interpersonal (verbal and written) communication skills to support working in project environments that include internal, external, and customer teams

Requirements / Educational Qualification:
Graduate degree in Computer Science or Information Technology
1 to 2 years of experience working on a BI platform, Data Studio, any cloud platform, or Qlik
Strong SQL development skills, with in-depth knowledge of complex SQL queries and a good understanding of QlikSense
Good working knowledge of SSIS, SSRS, and SSAS, with proper workflow design and exception management
Experience in data warehouse, ETL, cube, and report design and development

Role Type / Key Working Relationships:
Individual contributor
Internal team
External stakeholders

Benefits:
Attractive base salary
Annual performance-based bonus
Group mediclaim insurance policy
Travel reimbursement
Equal opportunities

What can you expect from RPCI?
Our values lie at the core of our mission and vision. We believe that it's our people who make our company what it is. We believe in:
Safety
Integrity
Innovation
Learning & Development
Open & Transparent
Performance Orientation

Why Join Rentokil PCI?
Rentokil PCI is a recognized leader in the pest control and hygiene industry, committed to delivering excellence and ensuring customer satisfaction. By joining our team, you will have the opportunity to advance your career in a dynamic, fast-paced environment, with continuous learning and development at the forefront of our culture.

If you meet the requirements and are interested, we would be delighted to meet you at the walk-in interview. For any questions or further details, please feel free to contact us:
Contact Person: Hitesha Patel
Contact Number: 8828018709
Email ID: hiteshav.patel@rentokil-pci.com
We look forward to meeting you at the interview!
Visit us: www.rentokil-pestcontrolindia.com
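For context on the SQL skills this posting emphasises, here is a minimal sketch of a data-quality profiling query, run through Python's built-in sqlite3 so it is self-contained; the table and column names are hypothetical and not taken from the posting.

```python
# Illustrative only: a data-quality profiling query of the kind a BI analyst
# might run. Table and column names are hypothetical; sqlite3 is used purely
# so the sketch is self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE service_visits (
        visit_id   INTEGER PRIMARY KEY,
        branch     TEXT,
        visit_date TEXT,
        revenue    REAL
    );
    INSERT INTO service_visits VALUES
        (1, 'Goregaon West', '2025-06-01', 4500.0),
        (2, 'Goregaon West', '2025-06-02', NULL),
        (3, 'Andheri',       '2025-06-02', 3200.0);
""")

# Profile each branch: row counts, missing revenue values, and totals.
profile_sql = """
    SELECT branch,
           COUNT(*)                                          AS visits,
           SUM(CASE WHEN revenue IS NULL THEN 1 ELSE 0 END)  AS missing_revenue,
           ROUND(SUM(COALESCE(revenue, 0)), 2)               AS total_revenue
    FROM service_visits
    GROUP BY branch
    ORDER BY total_revenue DESC;
"""
for row in conn.execute(profile_sql):
    print(row)
```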

Posted 2 weeks ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Development of BI solutions delivered through Power BI and SSAS Tabular
Education and training of internal users on BI solutions
Technical user support and updating of user documentation
Participation in the data modelling process for BI deliverables
Participation in the delivery of a new BI self-service strategy and its roll-out to the different global functions
Leading role in the design and development of new BI solutions, primarily using Power BI and SSAS Tabular
Evaluation and improvement of existing BI solutions and applications
Technical implementation of BI solutions within assigned projects
Sizing of work items within assigned projects
Additional responsibilities as assigned
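As an illustration of routine Power BI administration work of the kind described above, here is a minimal sketch that triggers a dataset refresh through the Power BI REST API; it assumes an Azure AD access token is already available, and the dataset ID is a placeholder.

```python
# Minimal sketch: triggering a Power BI dataset refresh via the Power BI
# REST API. Assumes an Azure AD access token is already in hand (e.g. via
# MSAL, not shown); the dataset ID below is a placeholder, not a real resource.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
DATASET_ID = "00000000-0000-0000-0000-000000000000"

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes"
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()
print("Refresh accepted:", resp.status_code)  # 202 Accepted on success
```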

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

What you will do
The R&D Precision Medicine team is responsible for data standardization, data searching, cohort building, and knowledge management tools that provide the Amgen scientific community with access to Amgen's wealth of human datasets, projects and study histories, and knowledge across various scientific findings. These data include clinical data, omics, and images. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Data Engineer will be responsible for full-stack development of enterprise analytics and data mastering solutions leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and advanced AI pipelines. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and will be exceptionally skilled at data analysis and profiling. You will collaborate closely with partners, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a solid background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities:
Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data management tools
Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
Break down features into work that aligns with the architectural direction runway
Participate hands-on in pilots and proofs-of-concept for new patterns
Create robust documentation from data analysis and profiling, and from proposed designs and data logic
Develop advanced SQL queries to profile and unify data
Develop data processing code in SQL, along with semantic views, to prepare data for reporting
Develop Power BI models and reporting packages
Design robust data models and processing layers that support both analytical processing and operational reporting needs
Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
Collaborate with partners to define data requirements, functional specifications, and project goals
Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.

Basic Qualifications:
Master's degree with 1 to 3 years of experience in Data Engineering, OR
Bachelor's degree with 1 to 3 years of experience in Data Engineering

Must-Have Skills:
Minimum of 1 year of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
Minimum of 1 year of hands-on experience building change-data-capture (CDC) ETL pipelines, designing and building data warehouses, and enterprise-level data management
Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
Experience using cloud platforms (AWS), data lakes, and data warehouses
Working knowledge of ETL processes, data pipelines, and integration technologies
Good communication and collaboration skills to work with cross-functional teams and senior leadership
Ability to assess business needs and design solutions that align with organizational goals
Exceptional hands-on capabilities in data profiling and data analysis

Good-to-Have Skills:
Experience with human data, ideally human healthcare data
Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred):
ITIL Foundation or other relevant certifications
SAFe Agile Practitioner (6.0)
Microsoft Certified: Data Analyst Associate (Power BI) or related certification
Databricks Certified Professional or similar certification

Soft Skills:
Excellent analytical and troubleshooting skills
Deep intellectual curiosity
Highest degree of initiative and self-motivation
Strong verbal and written communication skills, including presenting complex technical/business topics to varied audiences
Confident technical leader
Ability to work effectively with global, virtual teams, including the use of tools and artifacts to ensure clear and efficient collaboration across time zones
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources
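A minimal sketch of a change-data-capture (CDC) upsert of the kind this role describes, assuming a Databricks/Delta Lake runtime where `spark` is predefined; the table names and the `op` column are hypothetical placeholders, not taken from the posting.

```python
# Minimal sketch of a CDC upsert into a Delta table. Assumes it runs inside a
# Databricks notebook where `spark` is predefined and Delta Lake is available;
# table names (patients_master, patients_cdc_feed) and the `op` column are
# hypothetical.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "analytics.patients_master")
updates = spark.read.table("staging.patients_cdc_feed")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.patient_id = s.patient_id")
    .whenMatchedDelete(condition="s.op = 'DELETE'")        # apply deletes from the feed
    .whenMatchedUpdateAll(condition="s.op = 'UPDATE'")     # apply updates
    .whenNotMatchedInsertAll(condition="s.op = 'INSERT'")  # apply inserts
    .execute()
)
```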

Posted 2 weeks ago

Apply

10.0 - 14.0 years

35 - 40 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Source: Naukri

Experience: 10 to 14 years
Work from office, all days
Presales experience required in the Data Engineering, Data Analytics, and BI domains
Job Location: Hyderabad, Indore, Ahmedabad

Posted 3 weeks ago

Apply

10.0 - 14.0 years

10 - 16 Lacs

Pune

Work from Office

Source: Naukri

Role Overview:
The Senior Tech Lead - GCP Data Engineering leads the design, development, and optimization of advanced data solutions. The jobholder has extensive experience with GCP services, data architecture, and team leadership, with a proven ability to deliver scalable and secure data systems.

Responsibilities:
Lead the design and implementation of GCP-based data architectures and pipelines
Architect and optimize data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage
Provide technical leadership and mentorship to a team of data engineers
Collaborate with stakeholders to define project requirements and ensure alignment with business goals
Ensure best practices in data security, governance, and compliance
Troubleshoot and resolve complex technical issues in GCP data environments
Stay updated on the latest GCP technologies and industry trends

Key Technical Skills & Responsibilities:
Overall 10+ years of experience with GCP and data warehousing concepts, including coding, reviewing, testing, and debugging
Experience as an architect on GCP implementation or migration data projects
Solid understanding of data lakes and data lake architectures, and of best practices for storing, loading, and retrieving data from data lakes
Experience developing and maintaining pipelines on the GCP platform, with an understanding of best practices for bringing on-prem data to the cloud: file loading, compression, parallelization of loads, optimization, etc.
Working knowledge of and/or experience with Google Data Studio, Looker, and other visualization tools
Working knowledge of Hadoop and Python/Java is an added advantage
Experience in designing and planning BI solutions; debugging, monitoring, and troubleshooting BI solutions; creating and deploying reports; and writing relational and multidimensional database queries
Any experience in a NoSQL environment is a plus
Must be proficient with Python and PySpark for building data pipelines
Must have experience working with streaming data sources and Kafka
GCP services: Cloud Storage, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, Datastore/Firestore, Dataflow, Dataproc, Data Fusion, Dataprep, Pub/Sub, Data Studio, Looker, Data Catalog, Cloud Composer, Cloud Scheduler, Cloud Functions

Eligibility Criteria:
Bachelor's degree in Computer Science, Data Engineering, or a related field
Extensive experience with GCP data services and tools
GCP certification (e.g., Professional Data Engineer, Professional Cloud Architect)
Experience with machine learning and AI integration in GCP environments
Strong understanding of data modeling, ETL/ELT processes, and cloud integration
Proven leadership experience in managing technical teams
Excellent problem-solving and communication skills
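To illustrate the streaming requirement named above (PySpark with Kafka), here is a minimal sketch assuming a Spark runtime with the Kafka connector available; the broker address and topic name are hypothetical placeholders.

```python
# Minimal sketch: reading a Kafka topic with PySpark Structured Streaming.
# Assumes a Spark runtime with the spark-sql-kafka connector on the classpath;
# broker and topic names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-events-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "service-events")               # placeholder topic
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"))
)

# Write the raw events to the console for inspection; a real pipeline would
# land them in Cloud Storage or BigQuery instead.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```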

Posted 1 month ago

Apply

15.0 - 20.0 years

30 - 40 Lacs

Pune, Bengaluru, Vadodara

Work from Office

Source: Naukri

Overview:
We're seeking an exceptional Power BI Solutions Architect to lead client engagements and design enterprise-scale BI solutions. You'll translate business requirements into technical architectures while guiding a team of Power BI developers. This role requires deep Microsoft data platform expertise combined with outstanding client communication skills, with particular emphasis on embedded analytics and role-based access control (RBAC).

Key Responsibilities:
Lead client discovery sessions and translate business needs into technical solutions
Design scalable, secure, and high-performance Power BI architectures
Mentor and direct a team of Power BI developers
Implement complex features including Power BI Embedded, row-level security, and enterprise deployments
Architect data models and optimize DAX for maximum performance
Design integrations between Power BI and diverse enterprise data sources
Create migration strategies from legacy BI to modern cloud architectures

Required Skills & Experience

Leadership & Consulting:
Proven experience leading client-facing technical engagements
Ability to communicate complex concepts to diverse stakeholders
Experience managing technical teams on complex BI implementations

Power BI Expertise (4+ years):
Power BI Embedded: advanced experience with multi-tenant implementations, app ownership, and JavaScript API integration
Security Architecture: implementing row-level security (RLS), object-level security, and Azure AD integration
Advanced DAX: complex calculation development and performance optimization
Power BI Premium: capacity management, large dataset handling, and deployment pipelines
Fabric Integration: experience with Microsoft Fabric components including Data Factory, Dataflows, and Synapse integration
Composite Models: implementation of mixed storage mode solutions for optimal performance
Paginated Reports: design and deployment within the Power BI service

Data Platform Skills:
SQL Server: performance tuning, high availability configurations, and clustering
Cloud Services: Azure Synapse, Azure SQL DB, and Azure Data Factory implementation
Data Modeling: dimensional modeling for analytics workloads

Technical Proficiencies

Required:
1. Power BI Embedded application design and implementation
2. Complex DAX formula development with performance optimization
3. Row-level security (RLS) across organizational hierarchies
4. Reporting database setup and optimization for analytics workloads

Highly Desirable:
1. Microsoft Fabric experience
2. DirectQuery and Composite Model optimization
3. Power BI REST API integration
4. AI visual and metrics implementation

What Sets You Apart:
Experience implementing Copilot for Power BI with advanced natural language query capabilities
Experience with Power BI Premium capacity planning, autoscaling, and resource optimization for enterprise workloads
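As a concrete illustration of the embedded-analytics and RLS emphasis above, here is a minimal sketch that requests a Power BI embed token with a row-level-security identity via the Power BI REST API; all IDs, the username, and the role name are hypothetical placeholders, and an Azure AD access token is assumed to be available.

```python
# Minimal sketch: requesting a Power BI embed token that applies row-level
# security (RLS) to an embedded report. IDs, username, and the role name
# ("RegionalManager") are placeholders; the access token is assumed.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
REPORT_ID = "11111111-1111-1111-1111-111111111111"
DATASET_ID = "22222222-2222-2222-2222-222222222222"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/reports/{REPORT_ID}/GenerateToken"
)
body = {
    "accessLevel": "View",
    "identities": [
        {
            # The identity the RLS roles are evaluated against.
            "username": "analyst@example.com",
            "roles": ["RegionalManager"],
            "datasets": [DATASET_ID],
        }
    ],
}
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, json=body)
resp.raise_for_status()
print("Embed token:", resp.json().get("token"))
```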

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
