
681 Dashboarding Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

Key Responsibilities:
- Instrument Angular frontend and Java backend applications with GIL for effective logging and analytics.
- Design and implement client-side and server-side tracking mechanisms to capture key user and system activities.
- Aggregate, store, and process instrumentation data efficiently for reporting and analytics.
- Develop dashboards that summarize usage metrics, engagement patterns, and system health using modern visualization frameworks and tools.
- Create tracking for usage of different data sources (e.g., APIs, databases) and present the metrics to business and technical stakeholders.
- Collaborate closely with product managers, UX designers, backend engineers, and data engineers to identify meaningful metrics and optimize tracking strategies.

Required Skills and Experience:
- Frontend: strong experience with Angular and TypeScript.
- Backend: solid experience with Java (Spring Boot preferred) and REST APIs.
- Familiarity with SQL and basic dashboarding for report generation.
- Experience with instrumentation approaches.
- Understanding of data ingestion pipelines, event tracking, and aggregation techniques.
- Strong problem-solving skills; able to work both independently and collaboratively.
- Excellent verbal and written communication skills.
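The core loop this listing describes (emit tracking events client- and server-side, aggregate them, feed a dashboard) can be sketched compactly. A minimal, illustrative Python sketch of the aggregation step; the posting's actual stack is Angular/Java and "GIL" appears to be an internal logging library, so the event schema and class names here are invented:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TrackingEvent:
    # Hypothetical event schema for client- and server-side trackers.
    name: str      # e.g. "page_view", "api_call"
    source: str    # e.g. "angular_frontend", "java_backend"
    timestamp: datetime

class EventAggregator:
    """In-memory stand-in for the aggregate/store/process layer."""

    def __init__(self):
        self._counts = Counter()

    def ingest(self, event):
        self._counts[f"{event.source}:{event.name}"] += 1

    def usage_summary(self):
        # The per-source usage metrics a dashboard layer would read.
        return dict(self._counts)

agg = EventAggregator()
agg.ingest(TrackingEvent("page_view", "angular_frontend", datetime.now(timezone.utc)))
agg.ingest(TrackingEvent("api_call", "java_backend", datetime.now(timezone.utc)))
print(agg.usage_summary())  # {'angular_frontend:page_view': 1, 'java_backend:api_call': 1}
```

In production the aggregator would be a message queue plus a warehouse table rather than process memory; the shape of the flow is the same.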

Posted 10 hours ago

Apply

10.0 years

10 - 14 Lacs

Hyderābād

On-site

Source: Glassdoor

Lead Data Engineer
Experience: 10+ years | Location: Hyderabad | Type: Hybrid (Contractual) | Contract Duration: 6+ months

Job Summary: We are seeking an experienced and results-oriented Lead Data Engineer to drive the design, development, and optimization of enterprise data solutions. This role requires in-depth expertise in Fivetran, Snowflake, SQL, Python, and data modeling, as well as a demonstrated ability to lead teams and mentor both Data Engineers and BI Engineers. The role plays a critical part in shaping the data architecture, improving analytics readiness, and enabling self-service business intelligence through scalable star schema designs.

Key Responsibilities:
* Lead end-to-end data engineering efforts, including architecture, ingestion, transformation, and delivery.
* Architect and implement Fivetran-based ingestion pipelines and Snowflake data models.
* Create optimized star schemas to support analytics, self-service BI, and KPI reporting (see the sketch below).
* Analyze and interpret existing report documentation and KPIs to guide modeling and transformation strategies.
* Design and implement efficient, scalable data workflows using SQL and Python.
* Review and extend existing reusable data engineering templates and frameworks.
* Provide technical leadership and mentorship to Data Engineers and BI Engineers, ensuring best practices in coding, modeling, performance tuning, and documentation.
* Collaborate with business stakeholders to gather requirements and translate them into scalable data solutions.
* Work closely with BI teams to enable robust reporting and dashboarding capabilities.

Required Skills:
* 7+ years of hands-on data engineering experience, with 2+ years in a technical leadership or lead role.
* Deep expertise in Fivetran, Snowflake, and SQL development.
* Proficiency in Python for data transformation and orchestration.
* Strong understanding of data warehousing principles, including star schema design and dimensional modeling.
* Experience in analyzing business KPIs and reports to influence data model design.
* Demonstrated ability to mentor both Data Engineers and BI Engineers and provide architectural guidance.
* Excellent problem-solving, communication, and stakeholder management skills.

Job Type: Contractual / Temporary
Contract length: 6-12 months
Pay: ₹90,000.00 - ₹120,000.00 per month
Schedule: Day shift
Experience: Python: 8 years (required); Snowflake: 8 years (required); ETL: 8 years (required)
Work Location: In person
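The star schema this listing asks for means one fact table keyed to small, denormalized dimension tables. A hedged sketch using Python's built-in sqlite3 so it runs anywhere; the table and column names are invented, and a real implementation would target Snowflake:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimensions: small, denormalized lookup tables.
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
-- Fact table: one row per sale, foreign keys into the dimensions.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (1, '2024-01-15', '2024-01'), (2, '2024-02-10', '2024-02');
INSERT INTO dim_product VALUES (10, 'Widget'), (20, 'Gadget');
INSERT INTO fact_sales VALUES (1, 10, 120.0), (1, 20, 80.0), (2, 10, 45.0);
""")

# A typical KPI query: monthly revenue by product, joining the fact to both dimensions.
for row in conn.execute("""
    SELECT d.month, p.product_name, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.product_name
    ORDER BY d.month, p.product_name
"""):
    print(row)
```

This layout is what makes self-service BI workable: every report is a join from one fact table out to a handful of well-named dimensions.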

Posted 11 hours ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description: Amazon's Spectrum Analytics team is looking for a Business Intelligence Engineer to help build the next generation of analytics solutions for Selling Partner Developer Services. This is an opportunity to get in on the ground floor as we transform from a reactive, request-directed team to a proactive, roadmap-driven organization that accelerates the business. We need someone who is passionate about data and the insights that large amounts of data can provide. In addition to broad experience with data technologies from ingestion to visualization and consumption (e.g. data pipelines, ETL, reporting and dashboarding), the ideal candidate will have strong analysis skills and an insatiable curiosity to answer the question "why?". You will also be able to articulate the story the data is telling with compelling verbal and written communication.

Key job responsibilities:
- Deliver minimally to moderately complex data analysis, collaborating with Data Science as complexity increases.
- Develop dashboards and reports.
- Develop minimally to moderately complex data processing jobs using appropriate technologies (e.g. SQL, Python, Spark, AWS Lambda), collaborating with Data Engineers as needed; a minimal Lambda sketch follows this listing.
- Collaborate with stakeholders to understand business domains, requirements, and expectations; work with owners of data source systems to understand capabilities and limitations.
- Manage project deliverables, anticipate risks, and resolve issues.
- Adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.

About The Team: Spectrum offers a world-class suite of data products and experiences to empower the creation of innovative solutions on behalf of Partners. Our foundational systems and tools solve for cross-cutting Builder needs in externalizing data, and are easily extensible using federated policy and reusable technology.

Basic Qualifications:
- Experience with data visualization using Tableau, QuickSight, or similar tools.
- Experience with data modeling, warehousing, and building ETL pipelines.
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling.
- 5+ years of relevant professional experience in business intelligence, analytics, statistics, data engineering, data science, or a related field.
- Demonstrated data analysis and visualization skills; highly proficient with SQL.
- Knowledge of AWS products such as Redshift, QuickSight, and Lambda.
- Excellent verbal/written communication and data presentation skills; ability to succinctly summarize key findings and communicate effectively with both business and technical teams.

Preferred Qualifications:
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - Karnataka
Job ID: A3001103
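The "data processing jobs using ... AWS Lambda" responsibility usually reduces to a small handler that transforms batched records. A minimal sketch using Lambda's standard lambda_handler(event, context) entry point; the event shape and field names below are assumptions for illustration, not Amazon's actual schema:

```python
import json

def lambda_handler(event, context):
    """Toy processing job: aggregate metric values from a batch of event records."""
    totals = {}
    for rec in event.get("records", []):   # assumed input shape
        key = rec.get("metric", "unknown")
        totals[key] = totals.get(key, 0) + rec.get("value", 0)
    # Downstream, the result would land in a warehouse table or QuickSight dataset.
    return {"statusCode": 200, "body": json.dumps(totals)}

# Local smoke test; in AWS the runtime supplies event and context.
print(lambda_handler(
    {"records": [{"metric": "api_calls", "value": 3},
                 {"metric": "api_calls", "value": 2}]},
    None,
))
```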

Posted 11 hours ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About Urban Company: Urban Company is a technology platform offering customers a variety of services at home. Customers use our platform to book services such as beauty treatments, haircuts, massage therapy, cleaning, plumbing, carpentry, appliance repair, painting, and more, all delivered in the comfort of their home and at a time of their choosing. We promise our customers a high-quality, standardized, and reliable service experience. To fulfill this promise, we work closely with our hand-picked service partners, enabling them with technology, training, products, tools, financing, insurance, and brand support to help them succeed.

Urban Company started out as UrbanClap in November 2014, when its founders realized that the home services industry was largely unorganized, fragmented, and offline. Customers found it difficult to access quality services in a convenient manner, and service professionals struggled due to reliance on multiple middlemen. The founders launched Urban Company to disrupt the industry with a focus on three key principles:
- Customer Love: build a platform that offers truly delightful and differentiated services.
- Partner Empowerment: build a deep, full-stack partnership with service partners to improve their earnings and livelihoods.
- Technology First: bring innovation and technology to an age-old industry.

About the Content Ops team: The team manages end-to-end in-app content operations, from launching new categories and revamping existing ones to driving timely content updates across geographies. As part of the core Product Design team, we also look after content deployment, testing category flows, and ensuring a seamless user experience. The team works closely with business, marketing, content, and design teams, gaining hands-on experience in a fast-paced, collaborative, and supportive environment.

We are looking for someone with:
- A strong quality mindset: someone who sets a high bar and is only satisfied with flawless content execution.
- Stakeholder management skills: the ability to collaborate with cross-functional teams across the organization.
- Sharp attention to detail: an eye for spotting inconsistencies and discrepancies across creatives, content, and configurations.
- Dashboarding proficiency: comfort with using and adapting to various content management dashboards, trackers, and tools.
- Focus and agility: the ability to multitask effectively and thrive in a fast-paced, dynamic environment.

Responsibilities:
- Ensure timely and efficient execution of in-app content changes across geographies.
- Conduct thorough post-deployment quality checks to maintain accuracy and consistency across creatives, copy, and configurations.
- Own end-to-end deployment of content revamps and new category launches.
- Collaborate with cross-functional stakeholders to gather inputs, align on timelines, and resolve content-related dependencies.
- Share regular progress updates with stakeholders and maintain up-to-date status tracking in JIRA.

Your role as a core team member:
- Be part of Urban Company's core design team, where your work will help shape the user experience journey on the app.
- Demonstrate intent: deliver high-quality work, efficiently and effectively.
- Uphold integrity: commit to deadlines, raise concerns early, and take full ownership of your deliverables.
- Show commitment: ensure top-quality craftsmanship and adherence to timelines, pushing the boundaries of excellence.

Posted 12 hours ago

Apply

3.0 years

0 Lacs

India

Remote

Source: LinkedIn

MetAntz is recruiting a Senior Data Engineer for a leading AI SaaS firm. This is a remote position.

About the Company:
- Leading AI-powered platform serving hundreds of enterprise clients
- High-growth SaaS company
- Processes petabytes of data daily to deliver predictive insights and conversational intelligence
- Trusted by Fortune 500 companies and multi-billion-dollar software organizations

Position Overview: We are seeking a skilled Data Engineer to join our analytics team and help scale our data infrastructure, which handles petabytes of information daily. You'll play a crucial role in building faster, more reliable tools and platforms that enable low-latency, horizontally scalable data solutions for our growing user base.

Key Responsibilities:
- Data Pipeline Development: design and develop real-time event pipelines for data ingestion that support real-time dashboarding and analytics (see the sketch after this listing).
- Platform Innovation: brainstorm and create new platforms that deliver data to users in various formats with low latency and horizontal scalability.
- Data Transformation: develop complex, efficient functions to transform raw data sources into powerful, reliable components of our data lake.
- System Optimization: make changes across the entire technical stack and diagnose problems to ensure optimal performance.
- Technology Implementation: design and implement new components using emerging technologies in the Hadoop ecosystem.
- Project Execution: execute data engineering projects from conception to completion.

Required Qualifications:
- Experience: 3+ years of strong hands-on experience with Spark (preferably PySpark).
- Programming skills: proficiency in Python and scripting languages (Python, Bash, etc.).
- Database knowledge: good exposure to SQL, MongoDB, and other database technologies.
- Cloud technologies: experience with AWS and services such as S3.
- Real-time systems: experience building real-time data pipelines and scalable systems.
- Industry background: SaaS or high-growth startup experience preferred.
- Availability: immediate joiners preferred.

Technical Environment:
- Petabyte-scale data processing systems
- Hadoop ecosystem and emerging big data technologies
- Cloud-native architecture with AWS services
- Real-time streaming and batch processing pipelines
- Modern data lake architecture and analytics platforms

What We Offer:
- Opportunity to work with cutting-edge AI and data science technologies
- High-impact role in a rapidly scaling revenue intelligence platform
- Collaborative environment with experienced data engineering teams
- Competitive compensation and equity package
- Flexible work arrangements and comprehensive benefits

Ideal Candidate Profile:
- Thrives in fast-paced, high-growth startup environments
- Passionate about building scalable data infrastructure
- Experienced with large-scale data processing and real-time analytics
- Strong problem-solving skills and the ability to work across the full technical stack
- Excited about contributing to AI-powered solutions that drive revenue growth

This role offers the opportunity to build foundational data infrastructure that directly powers AI insights for enterprise revenue teams worldwide.
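For the "real-time event pipelines" responsibility above, a compact PySpark Structured Streaming sketch. It assumes pyspark is installed and uses the dependency-free socket source (fed by e.g. `nc -lk 9999`); a production pipeline would read from Kafka or Kinesis instead, and the event field name is invented:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Stream of JSON lines; the socket source keeps the sketch self-contained.
raw = (spark.readStream
       .format("socket")
       .option("host", "localhost")
       .option("port", 9999)
       .load())

# Parse and count events per minute: the feed a real-time dashboard would read.
events = raw.select(
    F.get_json_object("value", "$.event_type").alias("event_type"),
    F.current_timestamp().alias("ts"),
)
counts = events.groupBy(F.window("ts", "1 minute"), "event_type").count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")   # stand-in sink; production would write to a table
         .start())
query.awaitTermination()
```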

Posted 14 hours ago

Apply

2.0 - 5.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

About Us: Nation with NaMo is India's leading political consulting organization, dedicated to providing high-quality professional support for political campaigns. In this capacity, we have been fortunate to be a part of the momentous election campaigns of 2014, 2019, and 2024, as well as various other state elections. Our work includes envisioning and executing innovative electioneering campaigns, facilitating capacity building of grassroots cadre, and shaping governance. We add professional rigor to the strengths of scores of grassroots workers and ensure optimal electoral results.

In today's world, social media plays an important role in shaping the thinking of all individuals. It is a crucial lever in both politics and governance. We are looking for a Data Analyst who is dynamic, enthusiastic, and keen-eyed enough to go beyond the obvious, draw complex insights, and provide actionables to the team and the client.

Primary Responsibilities:
- Understand important metrics and devise processes and methodologies to perform benchmark and competitor analysis.
- Develop algorithms for automating social media analysis tasks, perform statistical analysis of results, and refine models.
- Take end-to-end ownership of projects at all stages, from data extraction and cleaning through exploratory data analysis, dashboarding, and reporting (a small illustration follows this listing).
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results.
- Write efficient code and processes to create easy-to-read, visually representative dashboards and automate report generation.
- Find and fix errors/bugs in processes and dashboards, making existing processes robust.
- Create efficient tools/scripts for data pull, storage, and management.
- Understand the requirements of the business in terms of digital advertising, political relevance, and social media performance measurement.
- Work with other departments to ensure goal and strategy alignment.

Qualifications:
- Degree in Computer Science, Mathematics, Statistics, Economics, or a related discipline from a reputed institute.
- Aptitude and curiosity for solving problems, with the ability to break a problem into smaller parts.
- Penchant for business, curiosity about numbers, and persistence to work with data to tease out insights.
- A self-starter mentality, a strong sense of ownership, and an appetite for learning.
- Ability to work in a fast-paced environment, prioritize tasks, and deliver results with high accuracy.
- Organizational skills with an ability to stay focused on achieving time-bound results by coordinating with internal and external stakeholders.
- 2-5 years of experience in data analysis, statistical modeling, or digital advertising is a plus.

Must-have capabilities:
- Good working knowledge of Python/SQL and MS Excel/Google Sheets. Ability to handle very large, unstructured, multi-dimensional datasets.
- Knowledge of data visualization tools such as Power BI, Tableau, or Google Data Studio is a plus.
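The extraction-cleaning-EDA-reporting ownership described above, in miniature with pandas; the platforms and numbers are invented stand-ins for a social media export:

```python
import pandas as pd

# Toy engagement export (all values invented).
df = pd.DataFrame({
    "platform":    ["X", "X", "Facebook", "Facebook", "Instagram"],
    "post_reach":  [12000, 8500, 22000, 19500, 31000],
    "engagements": [840, 510, 1430, 1210, 2790],
})

# Cleaning / derived-metric step that precedes any dashboarding.
df["engagement_rate"] = df["engagements"] / df["post_reach"]

# Benchmark summary per platform: the kind of table an automated report emits.
summary = (df.groupby("platform")["engagement_rate"]
             .agg(["mean", "count"])
             .sort_values("mean", ascending=False))
print(summary)
```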

Posted 18 hours ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

We are a technology-led healthcare solutions provider, driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that is bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com

What if we told you that you can move to an exciting role in an entrepreneurial organization without the usual risks associated with it? We understand that you are looking for growth and variety in your career at this point, and we would love for you to join us in our journey and grow with us. At Indegene, our roles come with the excitement you require at this stage of your career, along with the reliability you seek. We hire the best and trust them from day one to deliver global impact, handle teams, and be responsible for outcomes, while our leaders support and mentor you. We are a profitable, rapidly growing global organization and are scouting for the best talent for this phase of growth. With us, you are at the intersection of two of the most exciting industries: healthcare and technology. We offer global opportunities and fast-track careers while you work with a team fueled by purpose. The combination will lead to a truly differentiated experience for you. If this excites you, apply below.

Role: Senior Analyst - Data Science

Description: We are looking for a results-driven and hands-on Lead Data Scientist / Analyst with 5-6 years of experience to lead analytical solutioning and model development in the pharmaceutical commercial analytics domain. The ideal candidate will play a central role in designing and deploying Decision Engine frameworks, implementing advanced analytics solutions, and mentoring junior team members.

Key Responsibilities:
• Partner with cross-functional teams and client stakeholders to gather business requirements and translate them into robust ML/analytical solutions.
• Design and implement Decision Engine workflows to support Next Best Action (NBA) recommendations in omnichannel engagement strategies.
• Analyze large, complex datasets from sources such as APLD, sales, CRM, call plans, market share, patient claims, and segmentation data.
• Perform ad hoc and deep-dive analyses to address critical business questions across commercial and medical teams.
• Develop, validate, and maintain predictive models for use cases such as patient journey analytics, HCP targeting, sales forecasting, risk scoring, and marketing mix modeling (see the sketch after this listing).
• Implement MLOps pipelines using Dataiku, Git, and AWS services to support scalable, repeatable deployment of analytics models.
• Ensure data quality through systematic QC checks, test case creation, and validation frameworks.
• Lead and mentor junior analysts and data scientists in coding best practices, feature engineering, model interpretability, and cloud-based workflows.
• Stay up to date with industry trends, regulatory compliance, and emerging data science techniques relevant to life sciences analytics.

Requirements:
• 5+ years of hands-on experience in pharmaceutical commercial analytics, with exposure to cross-functional brand analytics, omnichannel measurement, and ML modeling.
• At least 3 years of experience developing and deploying predictive models and ML pipelines in real-world settings.
• Proven experience with data platforms such as Snowflake, Dataiku, and AWS, and proficiency in PySpark, Python, and SQL.
• Experience with MLOps practices, including version control, model monitoring, and automation.
• Strong understanding of pharmaceutical data assets (e.g., APLD, DDD, NBRx, TRx, specialty pharmacy, CRM, digital engagement).
• Proficiency in ML algorithms (e.g., XGBoost, Random Forest, SVM, logistic regression, neural networks, NLP).
• Experience in key use cases: Next Best Action, recommendation engines, attribution models, segmentation, marketing ROI, collaborative filtering.
• Hands-on expertise in building explainable ML models and using tools for model monitoring and retraining.
• Familiarity with dashboarding tools like Tableau or Power BI is a plus.
• Strong communication and documentation skills to convey findings to both technical and non-technical audiences.
• Ability to work in a dynamic, fast-paced environment and deliver results under tight timelines.

Indegene is proud to be an Equal Employment Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristic. All employment decisions, from hiring to separation, will be based on business requirements and the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristic.
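The predictive-modeling work described above (HCP targeting, risk scoring) typically trains a classifier on engagement features and validates it with a ranking metric. A minimal sketch with scikit-learn and one of the algorithms the listing names (Random Forest); the synthetic data is a stand-in for APLD/CRM features:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for commercial-analytics features (e.g., Rx history,
# call-plan activity, digital engagement); real work would use APLD/CRM data.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           weights=[0.85], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=42)

# Random Forest: one of the algorithm families named in the listing.
model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_tr, y_tr)

# AUC is a common validation metric for targeting/risk-scoring models.
print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```

In the Dataiku/MLOps setup the posting describes, the same fit-and-score step would run inside a versioned pipeline with monitoring and retraining hooks around it.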

Posted 19 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Overview: TekWissen is a global workforce management provider operating in India and many other countries. Our client is a global company with shared ideals and a deep sense of family. From its earliest days as a pioneer of modern transportation, it has sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Data Scientist with Statistics
Location: Chennai
Work Type: Onsite

Position Description: We are seeking an experienced and highly analytical Senior Data Scientist with a strong statistical background to join our dynamic team. You will be instrumental in leveraging our rich datasets to uncover insights, build sophisticated predictive models, and create impactful visualizations that drive strategic decisions.

Responsibilities:
- Lead the end-to-end lifecycle of data science projects, from defining the business problem and exploring data to developing, validating, deploying, and monitoring models in production.
- Apply advanced statistical methodologies and machine learning algorithms to analyze large, complex datasets (structured and unstructured) and extract meaningful patterns and insights.
- Develop and implement robust, scalable, automated processes for data analysis and model pipelines, leveraging cloud infrastructure.
- Collaborate closely with business stakeholders and cross-functional teams to understand their analytical needs, translate them into technical requirements, and communicate findings effectively.
- Create compelling, interactive dashboards and data visualizations to clearly present complex results and insights to both technical and non-technical audiences.
- Stay up to date with the latest advancements in statistics, machine learning, and cloud technologies, and advocate for the adoption of best practices.

Skills Required: Statistics, Machine Learning, Data Science, Problem Solving, Analytical and Communication Skills
Skills Preferred: GCP (Google Cloud Platform), Mechanical Engineering, Cost Analysis

Experience Required:
- 5+ years of progressive professional experience in a Data Scientist, Machine Learning Engineer, or similar quantitative role, with a track record of successfully delivering data science projects.
- Bachelor's or Master's degree in Statistics. A strong foundation in statistical theory and application is essential for this role. (Highly related quantitative fields such as Applied Statistics, Econometrics, or Mathematical Statistics may be considered if they have a demonstrably strong statistical core, but Statistics is the primary focus.)
- Proven hands-on experience applying a variety of machine learning techniques (e.g., regression, classification, clustering, tree-based models, potentially deep learning) to real-world business problems.
- Strong proficiency in Python and its data science ecosystem (e.g., pandas, NumPy, scikit-learn, potentially TensorFlow or PyTorch).
- Hands-on experience with cloud computing platforms (e.g., AWS, Azure, GCP) for data storage, processing, and deploying analytical solutions.
- Extensive experience creating data visualizations and dashboards to communicate insights effectively. You know how to tell a story with data!
- Solid understanding of experimental design, hypothesis testing, and statistical inference (a minimal example follows this listing).
- Excellent problem-solving skills, attention to detail, and the ability to work with complex data structures.
- Strong communication, presentation, and interpersonal skills, with the ability to explain technical concepts clearly to diverse audiences.

Experience Preferred:
- Experience in the automotive industry or with related data such as vehicle telematics, manufacturing quality, supply chain, or customer behavior in an automotive context.
- Experience with GCP services such as BigQuery, GCS, Cloud Run, Cloud Build, Cloud Source Repositories, and Cloud Workflows.
- Proficiency with dashboarding and visualization tools such as Looker Studio, Power BI, Qlik, or Tableau.
- Experience with SQL for data querying and manipulation.
- Familiarity with big data technologies (e.g., Spark, Hadoop).
- Experience with MLOps practices and tools for deploying and managing models in production.
- Advanced degree (PhD) in Statistics or a related quantitative field.

Education Required: Bachelor's Degree
Education Preferred: Master's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
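The "experimental design, hypothesis testing, and statistical inference" requirement, in its smallest useful form: a two-sample Welch's t-test with SciPy on synthetic data (all numbers invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic A/B outcome data, e.g. a process change's effect on a quality metric.
control   = rng.normal(loc=50.0, scale=5.0, size=200)
treatment = rng.normal(loc=51.2, scale=5.0, size=200)

# Welch's t-test: does the treatment mean differ from the control mean?
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# Conventional reading: reject the no-difference hypothesis at p < 0.05,
# judged alongside effect size and the design of the experiment.
```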

Posted 1 day ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Position: Program Associate - Innovation Management
Reporting Manager: Director, Innovation Management

Who We Are: T-Hub is India's leading innovation ecosystem, enabling and empowering an innovation-led economy. As a catalyst for innovation, T-Hub has supported over 3,000 startups and facilitated 600+ corporate innovation engagements across diverse sectors including deep tech, sustainability, mobility, health, and more. Our mission is to accelerate innovation for startups, corporates, governments, and academia through high-impact programs, partnerships, and ecosystem collaborations. With a strong focus on entrepreneurship, T-Hub has emerged as a preferred innovation partner for leading public and private sector organizations.

What will you do? As a Program Associate - Innovation, you will be responsible for end-to-end execution of innovation-focused programs, ensuring seamless coordination with internal and external stakeholders while managing reporting, feedback, and process improvement.
- Support day-to-day operations and execution of innovation engagements (for example: mentorship, workshops, office hours, post-program support).
- Engage effectively with internal and external stakeholders to ensure smooth program delivery.
- Maintain program tracking via project management tools (Zoho Projects, Asana, or similar).
- Create and manage program reports, dashboards, communication templates, and knowledge assets.
- Collect, analyze, and act upon feedback to drive continuous program improvement.

Expected Deliverables:
- Effective execution of programs (startups/corporates/academia/government) with no major escalations.
- Consistent tracking, reporting, and improvement of program KPIs and stakeholder engagement metrics.
- Coordination with multiple stakeholders to facilitate high-impact corporate events.

Requirements:
- A postgraduate degree with a specialization in Innovation, Entrepreneurship, or a related discipline preferred.
- 2-4 years of relevant experience in program coordination or similar roles, preferably in startup ecosystems, incubators, or corporate innovation teams.
- Proficiency in Microsoft Office and familiarity with project management tools such as Zoho Projects, Jira, or Asana.
- Experience in managing stakeholder relationships and driving structured program operations.

Desired Skills & Competencies:
- Entrepreneurial mindset and ability to take initiative.
- Strong sense of ownership and accountability.
- Adaptability and learning agility in a dynamic environment.
- Effective collaboration and teamwork with cross-functional stakeholders.
- Understanding of innovation ecosystems, startup dynamics, and incubation processes.
- Clear documentation, reporting, and dashboarding capabilities.
- Strong presentation and analytical communication skills.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

We're Hiring: SAP Analytics Cloud (SAC) Lead
Location: Gurgaon / Delhi NCR
Experience: 5-8 Years
CTC: Up to ₹25 LPA
Apply at: jennifer.thomas@sgnsoftware.com (subject line: SAP (SAC))

About the Role
SGN Software is expanding its advanced analytics practice under the RISE with SAP transformation programs. We're looking for a highly skilled SAP Analytics Cloud (SAC) Lead to shape and execute analytics strategies for top-tier clients across industries. In this role, you'll design and lead SAC implementations, drive business value through intelligent dashboards, and collaborate with CXOs and decision-makers across finance, operations, and strategy teams.

Key Responsibilities
- SAC Implementation Leadership: lead full-lifecycle SAC implementations for RISE with SAP clients.
- Dashboarding & Storytelling: build interactive CXO dashboards, KPI scorecards, and self-service BI.
- SAC Planning: deploy planning models for P&L, sales, workforce planning, etc.
- Data Modeling: build scalable SAC data models via live/import connections.
- Stakeholder Collaboration: engage business and IT leaders to align on KPIs and data strategies.
- System Integration: connect SAC with SAP S/4HANA, BW/4HANA, and non-SAP systems.
- Governance & Standards: define SAC best practices for modeling, security, and UX consistency.

Must-Have Skills
✅ 5-8 years hands-on SAC experience (reporting + planning)
✅ Strong grip on CXO dashboarding and data storytelling
✅ Experience with RISE with SAP environments
✅ Proficiency in data modeling, integration, and analytics governance
✅ Excellent communication and stakeholder engagement skills
✅ Familiarity with SAP BW, S/4HANA, and hybrid data connectivity

Good to Have
- SAP SAC certification (Planning and/or Reporting)
- Knowledge of Predictive Scenarios and Advanced Formulas
- Experience in multi-country/global delivery projects

Why Join SGN Software?
- SAP Growth Champion Partner of the Year, India 2025
- One of SAP's fastest-growing partners in India
- 250+ SAP consultants delivering across industries
- Experts in SAC, Data Archiving, OpenText, BTP, and AI
- Career growth into Analytics COE or Product Advisory roles
- Work with visionary clients and global SAP transformation projects

Interested? Send your resume to jennifer.thomas@sgnsoftware.com with the subject line SAP (SAC). Let's shape the future of enterprise analytics together.

#Hiring #SAPJobs #SAC #AnalyticsCloud #S4HANA #RISEwithSAP #Dashboards #SAPCareers #DelhiNCRJobs #SGNSoftware

Posted 1 day ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

We are looking for an experienced Data Analyst with at least 6 years of professional expertise in dashboarding, data visualization, and analytics. The ideal candidate will be highly skilled in tools like Power BI, Tableau, SQL, and Excel and have hands-on experience in ETL processes. You will play a key role in transforming raw data into actionable insights and building compelling dashboards to support business decision-making.

Job Description:

Key Responsibilities:
- Dashboard Development: design, develop, and maintain interactive, visually appealing dashboards using Power BI and Tableau; collaborate with stakeholders to gather requirements and ensure dashboards meet business needs.
- Data Analytics & Visualization: analyze large datasets to identify trends, patterns, and insights; create meaningful data visualizations and reports that communicate insights effectively to business users.
- ETL and Data Management: perform data extraction, transformation, and loading (ETL) to consolidate and structure data for analysis; ensure data accuracy and integrity through validation and regular audits.
- Reporting: develop custom reports using SQL and Excel to address specific business questions; automate recurring reports to improve efficiency and accuracy.
- Collaboration and Communication: work closely with cross-functional teams, including business units, IT, and data engineers, to align data solutions with business objectives; present findings and insights to stakeholders in a clear and concise manner.
- Continuous Improvement: identify opportunities for process automation and optimization; stay up to date with the latest tools, technologies, and best practices in data analytics and visualization.

Required Skills and Qualifications:
- Experience: minimum 6 years of experience in data analytics, dashboarding, and reporting.
- Tools: expertise in Power BI, Tableau, SQL, and Excel.
- ETL knowledge: strong understanding of ETL processes and tools.
- Analytical skills: ability to analyze complex data and provide actionable insights.
- Data modeling: knowledge of data modeling, database design, and data warehousing concepts.
- Problem-solving: strong problem-solving skills and attention to detail.
- Communication: excellent verbal and written communication skills, with the ability to convey technical concepts to non-technical audiences.

Preferred Qualifications:
- Experience in Python or R for data analysis.
- Knowledge of cloud-based data platforms (e.g., AWS, Azure, GCP).
- Certifications in Power BI or Tableau.
- Familiarity with DAX and advanced SQL queries.

Location: Chennai
Brand: Paragon
Time Type: Full time
Contract Type: Permanent

Posted 1 day ago

Apply

0 years

3 - 5 Lacs

Hyderābād

On-site

Source: Glassdoor

JOB TITLE: Principal Data Scientist

Technical Skills / Delivery:
- Ability to prepare model training data in BigQuery and GCS.
- Strong SQL skills working with complex queries (aggregate and window functions, CASE statements); a minimal example follows this listing.
- Excellent scripting and coding skills; able to code using well-defined patterns.
- Strong machine learning modeling skills and MLOps; able to build, deploy, and support end-to-end machine learning systems (data ingestion/prep, model training, dashboarding).
- Expertise with machine learning frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Query optimization to improve data preparation performance.
- Hands-on experience with Google data and ML products (e.g., BigQuery, Vertex AI Notebooks and Pipelines).
- Experience in data processing using SQL and PySpark.
- Ability to lead machine learning projects end to end (data ingestion, data prep, model training, deployment, dashboarding).
- Working with Jira/Confluence.

Best Practices:
- Strong knowledge of DevOps/MLOps best practices.
- Experience using multiple ML libraries in cross-platform implementations (Python/R).
- Experience in machine learning training/prediction/deployment implementation.

Business Domain:
- Good understanding of internal data domains (e.g., Empower, Oracle, Kronos, Employee, NICE).
- Good understanding of contact center switch data.

Soft Skills:
- Ability to write clear, concise code.
- Strong organizational skills.
- Able to work with cross-functional teams to solve problems and add business value with AI/ML.
- Strong team player who contributes innovative ideas that enhance our AI/ML capabilities.
- Good verbal and written communication skills; strong documentation skills, including documenting processes across the machine learning lifecycle.
- Fast learner who builds strong relationships with business partners.

About TTEC: Our business is about making customers happy. That's all we do. Since 1982, we've helped companies build engaged, pleased, profitable customer experiences powered by our combination of humanity and technology. On behalf of many of the world's leading iconic and hypergrowth brands, we talk, message, text, and video chat with millions of customers every day. These exceptional customer experiences start with you.

TTEC is proud to be an equal opportunity employer where all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. TTEC embraces and is committed to building a diverse and inclusive workforce that respects and empowers the cultures and perspectives within our global teams. We aim to reflect the communities we serve, by not only delivering amazing service and technology, but also humanity. We make it a point to make sure all our employees feel valued, belonging, and comfortable being their authentic selves at work. As a global company, we know diversity is our strength because it enables us to view things from different vantage points and for you to bring value to the table in your own unique way.

Primary Location: India-Telangana-Hyderabad
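The aggregate/window/CASE combination called out above, in a runnable form; sqlite3 stands in for BigQuery (window functions need SQLite 3.25+, bundled with recent Python builds), and the schema and values are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE calls (agent TEXT, day TEXT, handled INTEGER);
INSERT INTO calls VALUES
  ('a1','2024-01-01',30), ('a1','2024-01-02',42),
  ('a2','2024-01-01',55), ('a2','2024-01-02',48);
""")

# Window function + CASE: per-agent running total and a simple workload flag.
rows = conn.execute("""
    SELECT agent, day, handled,
           SUM(handled) OVER (PARTITION BY agent ORDER BY day) AS running_total,
           CASE WHEN handled >= 50 THEN 'high' ELSE 'normal' END AS load_band
    FROM calls
    ORDER BY agent, day
""").fetchall()
for r in rows:
    print(r)
```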

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

On-site

Source: LinkedIn

Overview: We are seeking a talented mid-level Quality Assurance Engineer to join our team. As a mid-level QA, you will play a pivotal role in ensuring the quality and performance of our products through both automated and manual testing. This position offers an opportunity to thrive in a collaborative environment and make significant contributions to the success of our rapidly evolving solutions.

Roles & Responsibilities:
- Review functional requirements and delineate acceptance criteria.
- Develop and automate test cases, adhering to best practices and emphasizing speed and reusability.
- Construct efficient and scalable automation solutions.
- Implement comprehensive automation strategies, taking into account interconnections and dependencies across multiple teams.
- Expand automation to cover service provisioning, test data management, dashboarding, and business process validation.
- Design and execute test plans through manual testing when necessary.
- Identify, assess, and document issues with appropriate priority and severity.
- Assess existing automated and performance test coverage, identifying any testing gaps.
- Collaborate with team members to troubleshoot and resolve defects promptly.
- Regularly review and update automation tests and the framework to ensure dependable results.
- Explore and implement initiatives and tools to enhance automation and performance testing.
- Maintain clear communication and close collaboration with other engineers within the team and across teams.
- Fulfil other tasks and responsibilities as delegated.

Required Skills & Experience:
- 3 to 5 years of experience in a software testing or quality assurance role.
- Minimum 2+ years of experience developing automation test scripts.
- Strong understanding of software testing principles and methodologies.
- Strong knowledge of JavaScript.
- Experience with Cypress and Nightwatch.js.
- Solid analytical and troubleshooting skills.
- Experience with Agile/Scrum methodology or similar development processes.
- Excellent communication and teamwork skills.
- Detail-oriented with a commitment to quality.
- Hands-on experience using Jira.
- Hands-on experience using a test management tool such as Zephyr or similar.
- Familiarity with version control systems (e.g., Git, SVN).
- Proficiency in SQL.

Preferred Skills:
- Familiarity with CI/CD pipelines and associated tools (e.g., Jenkins, GitLab CI).
- Experience with containerization technologies such as Docker.
- Familiarity with developing performance test scripts.
- Knowledge of other programming languages such as Python or Java.
- Knowledge of performance testing tools like JMeter or Gatling.
- Understanding of cloud platforms (e.g., AWS, Azure, Google Cloud Platform).

Posted 1 day ago

Apply

1.0 - 5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Experience Required: 1-5 years

- Must have knowledge of Production Application Support; an e-commerce/retail background is a plus.
- Solid Level 1/Level 2 support experience in eCommerce platforms such as Shopify, Blue Yonder, MAO, ATG, or Salesforce Commerce.
- Hands-on experience with monitoring, logging, alerting, dashboarding, and report generation in monitoring tools such as AppDynamics, Splunk, Dynatrace, Datadog, CloudWatch, ELK, Prometheus, or New Relic, plus PagerDuty, Slack, and Opsgenie.
- Knowledge of the ITIL framework, specifically alerts, incident and change management, CAB, production deployments, and risk and mitigation planning.
- Able to lead P1 calls, brief the customer on the P1, and proactively bring the right leads and customers into P1 calls through to RCA.
- Experience working with Postman.
- Knowledge of building and executing SOPs and runbooks, and of handling ITSM platforms (Jira/ServiceNow/BMC Helix).
- Comfortable working with the dev team and cross-functional teams across time zones, and willing to work rotational shifts.

Posted 1 day ago

Apply

1.0 - 5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Experience Required: 1-5 years

- Must have knowledge of Production Application Support; an e-commerce/retail background is a plus.
- Solid Level 1/Level 2 support experience in eCommerce platforms such as Shopify, Blue Yonder, MAO, ATG, or Salesforce Commerce.
- Hands-on experience with monitoring, logging, alerting, dashboarding, and report generation in monitoring tools such as AppDynamics, Splunk, Dynatrace, Datadog, CloudWatch, ELK, Prometheus, or New Relic, plus PagerDuty, Slack, and Opsgenie.
- Knowledge of the ITIL framework, specifically alerts, incident and change management, CAB, production deployments, and risk and mitigation planning.
- Able to lead P1 calls, brief the customer on the P1, and proactively bring the right leads and customers into P1 calls through to RCA.
- Experience working with Postman.
- Knowledge of building and executing SOPs and runbooks, and of handling ITSM platforms (Jira/ServiceNow/BMC Helix).
- Comfortable working with the dev team and cross-functional teams across time zones, and willing to work rotational shifts.

Posted 1 day ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

We are Reckitt
Home to the world's best loved and trusted hygiene, health, and nutrition brands. Our purpose defines why we exist: to protect, heal and nurture in the relentless pursuit of a cleaner, healthier world. We are a global team united by this purpose. Join us in our fight to make access to the highest quality hygiene, wellness, and nourishment a right and not a privilege.

Information Technology & Digital
In IT and D, you'll be a force for good, whether you're championing cyber security, defining how we harness the power of technology to improve our business, or working with data to guide the innovation of consumer-loved products. Working globally across functions, you'll own your projects and processes from start to finish, with the influence and visibility to achieve what needs to be done. And if you're willing to bring your ideas to the table, you'll get the support and investment to make them happen. Your potential will never be wasted. You'll get the space and support to take your development to the next level. Every day, there will be opportunities to learn from peers and leaders through working on exciting, varied projects with real impact. And because our work spans so many different businesses, from Research and Product Development to Sales, you'll keep learning exciting new approaches.

About The Role
The Reckitt Data and Analytics (D&A) organisation owns and delivers the Data and AI platform that better equips markets to improve their decisions and optimise their actions through improved data, insights, and advanced analytics. The D&A team consists of six key areas of expertise: Platform & Architecture, Data Science, Data Engineering, Data Visualisation, Data Governance, and Strategy & Operations.

Your responsibilities
- Take ownership of the global Reckitt data platform, aligning the roadmap with technical advancements and with business requirements from IT&D product owners, to deliver the required capabilities and latest technical standards enabling Analytics and AI.
- Set the policy and standards for the AI platform so that it operates according to Reckitt's responsible AI principles, and act as a member of the responsible AI governance team.
- Evangelise use of the common AI platform for global and local Reckitt business AI use cases, ensuring common tooling, capabilities, and guardrails so that Reckitt continues to operate to the highest ethical standards.
- Document the approach: develop a written vision and strategy outlining how we advance our data capabilities, aligned with key stakeholders across the global Reckitt organisation.
- Define the objectives and KPIs for the data platform, measuring success against business outcomes and against the data platform as a product: its adoption and FinOps.
- Provide technical leadership on proofs of concept to establish architectural patterns and blueprints for new products and service offerings.

The experience we're looking for
- A technical degree in computer science or a related quantitative field, more than 10 years' experience working on data platforms, and 15+ years of overall IT experience.
- Demonstrated experience delivering data product capabilities on cloud platforms (Azure/GCP), with a particular focus on Databricks.
- Extensive hands-on technical experience with data platforms (e.g., running SQL queries, operating monitoring services), FinOps (e.g., cost monitoring, dashboarding), data observability (data quality, data lineage), and MLOps.
- Line management experience (5 people or more) and experience with agile development methodology, Scrum in particular.

What we offer
With inclusion at the heart of everything we do, working alongside our four global Employee Resource Groups, we support our people at every step of their career journey, helping them to succeed in their own individual way. We invest in the wellbeing of our people through parental benefits, an Employee Assistance Program to promote mental health, and life insurance for all employees globally. We have a range of other benefits in line with the local market. Through our global share plans we offer the opportunity to save and share in Reckitt's potential future successes. For eligible roles, we also offer short-term incentives to recognise, appreciate and reward your work for delivering outstanding results. You will be rewarded in line with Reckitt's pay-for-performance philosophy.

Equality
We recognise that in real life, great people don't always 'tick all the boxes'. That's why we hire for potential as well as experience. Even if you don't meet every point on the job description, if this role and our company feel like a good fit for you, we still want to hear from you. All qualified applicants will receive consideration for employment without regard to age, disability or medical condition; colour, ethnicity, race, citizenship, and national origin; religion, faith; pregnancy, family status and caring responsibilities; sexual orientation; sex, gender identity, gender expression, and transgender identity; protected veteran status; size or any other basis protected by appropriate law.

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

TCS Hiring for Senior Dynatrace Cloud Dashboard Engineer
Experience: 6 to 10 Years Only
Job Location: Delhi, Kolkata, Chennai, Hyderabad, Bangalore, Pune

Required Technical Skill Set: Power Dashboards in Dynatrace, including widgets (tiles, graphs, heatmaps).

Primary Responsibilities:
- Design and implement Power Dashboards in Dynatrace Cloud for complex enterprise environments.
- Gather requirements from stakeholders to build meaningful, actionable, and visually compelling dashboards.
- Translate monitoring and business KPIs into effective Dynatrace visualizations using DQL (Dynatrace Query Language).
- Leverage Dynatrace Notebooks, metrics, logs, traces, and events to provide deep insights into application and infrastructure health.
- Create reusable dashboard templates and scalable dashboarding strategies.
- Ensure dashboards adhere to UX best practices and performance optimization guidelines.
- Integrate Dynatrace with external data sources and visualization tools if needed.
- Provide guidance and training to internal teams on Power Dashboard usage and customization.
- Collaborate with SREs, developers, and product owners to promote observability best practices.

Required Qualifications:
- 5 to 7 years of hands-on experience with Dynatrace Cloud, with a focus on custom dashboarding.
- Deep expertise in building Power Dashboards in Dynatrace, including widgets (tiles, graphs, heatmaps), filtering and tagging strategies, and Dynatrace Query Language (DQL).
- Strong experience working with metrics, logs, traces, and business events in Dynatrace.
- Understanding of application performance monitoring (APM) and observability concepts.
- Experience in visual storytelling: structuring dashboards to communicate insights effectively.
- Proficiency in dashboard performance tuning and optimization.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with REST APIs and automation of dashboard deployment (via Terraform, Dynatrace APIs, or configuration-as-code).
- Strong problem-solving skills and attention to detail.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Dynatrace certifications (e.g., Dynatrace Professional, Dynatrace Associate).
- Experience integrating Dynatrace with ITSM tools (e.g., ServiceNow), alerting systems, or CI/CD pipelines.
- Background in Site Reliability Engineering, DevOps, or Application Support.
- Familiarity with other observability platforms such as Grafana, New Relic, or Datadog.

Kind Regards,
Priyankha M

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

TCS Hiring for Senior Dynatrace Cloud Dashboard Engineer
Experience: 6 to 10 Years Only
Job Location: Delhi, Kolkata, Chennai, Hyderabad, Bangalore, Pune

Required Technical Skill Set: Power Dashboards in Dynatrace, including widgets (tiles, graphs, heatmaps).

Primary Responsibilities:
- Design and implement Power Dashboards in Dynatrace Cloud for complex enterprise environments.
- Gather requirements from stakeholders to build meaningful, actionable, and visually compelling dashboards.
- Translate monitoring and business KPIs into effective Dynatrace visualizations using DQL (Dynatrace Query Language).
- Leverage Dynatrace Notebooks, metrics, logs, traces, and events to provide deep insights into application and infrastructure health.
- Create reusable dashboard templates and scalable dashboarding strategies.
- Ensure dashboards adhere to UX best practices and performance optimization guidelines.
- Integrate Dynatrace with external data sources and visualization tools if needed.
- Provide guidance and training to internal teams on Power Dashboard usage and customization.
- Collaborate with SREs, developers, and product owners to promote observability best practices.

Required Qualifications:
- 5 to 7 years of hands-on experience with Dynatrace Cloud, with a focus on custom dashboarding.
- Deep expertise in building Power Dashboards in Dynatrace, including widgets (tiles, graphs, heatmaps), filtering and tagging strategies, and Dynatrace Query Language (DQL).
- Strong experience working with metrics, logs, traces, and business events in Dynatrace.
- Understanding of application performance monitoring (APM) and observability concepts.
- Experience in visual storytelling: structuring dashboards to communicate insights effectively.
- Proficiency in dashboard performance tuning and optimization.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with REST APIs and automation of dashboard deployment (via Terraform, Dynatrace APIs, or configuration-as-code).
- Strong problem-solving skills and attention to detail.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Dynatrace certifications (e.g., Dynatrace Professional, Dynatrace Associate).
- Experience integrating Dynatrace with ITSM tools (e.g., ServiceNow), alerting systems, or CI/CD pipelines.
- Background in Site Reliability Engineering, DevOps, or Application Support.
- Familiarity with other observability platforms such as Grafana, New Relic, or Datadog.

Kind Regards,
Priyankha M

Posted 1 day ago

Apply

7.0 years

0 Lacs

India

On-site

Source: LinkedIn

At Global Analytics, we’re driving HEINEKEN’s transformation into the world’s leading data-driven brewer. Our innovative spirit flows through the entire company, promoting a data-first approach in every aspect of our business. From brewery operations and logistics to IoT systems and sustainability monitoring, our smart data products are instrumental in accelerating growth and operational excellence. As we scale our analytics and observability solutions globally, we are seeking a Grafana Developer to join our dynamic Global Analytics team.

About the Team: The Global Analytics team at HEINEKEN is a diverse group of Data Scientists, Data Engineers, BI Specialists, and Translators, collaborating across continents. Our culture promotes experimentation, agility, and bold thinking. Together, we transform raw data into impactful decisions that support HEINEKEN’s vision for sustainable, intelligent brewing.

Grafana Developer

We are looking for a Grafana Developer to build and maintain real-time dashboards that support our IoT monitoring, time-series analytics, and operational excellence initiatives. This is a hands-on technical role where you will collaborate with multiple teams to bring visibility to complex data across global operations.

If you are excited to:
• Build real-time dashboards and monitoring solutions using Grafana.
• Work with InfluxDB, Redshift, and other time-series and SQL-based data sources.
• Translate complex system metrics into clear visual insights that support global operations.
• Collaborate with engineers, DevOps, IT Operations, and product teams to bring data to life.
• Be part of HEINEKEN’s digital transformation journey focused on data and sustainability.

And if you like:
• A hybrid, flexible work environment with access to cutting-edge technologies.
• Working on impactful projects that monitor and optimize global brewery operations.
• A non-hierarchical, inclusive, and innovation-driven culture.
• Opportunities for professional development, global exposure, and knowledge sharing.

Your Responsibilities:
• Design, develop, and maintain Grafana dashboards and visualizations for system monitoring and analytics.
• Work with time-series data from InfluxDB, Prometheus, Elasticsearch, and relational databases like MySQL, PostgreSQL, and Redshift.
• Optimize dashboard performance by managing queries, data sources, and caching mechanisms.
• Configure alerts and notifications to support proactive operational monitoring.
• Collaborate with cross-functional teams, including DevOps, IT Operations, and Data Analytics, to understand and address their observability needs.
• Utilize Power BI (optional) to supplement dashboarding with additional reports.
• Customize and extend Grafana using plugins, scripts, or automation tools as needed.
• Stay current with industry trends in data visualization, real-time analytics, and the Grafana/Power BI ecosystem.

We Expect:
• 4–7 years of experience developing Grafana dashboards and time-series visualizations.
• Strong SQL/MySQL skills and experience working with multiple data sources.
• Hands-on experience with Grafana and common data backends such as InfluxDB, Prometheus, PostgreSQL, Elasticsearch, or Redshift.
• Understanding of time-series data vs. traditional data warehouse architecture.
• Familiarity with scripting languages (e.g., JavaScript, Python, Golang) and query languages like PromQL.
• Experience configuring alerts and automating monitoring workflows.
• Exposure to Power BI (nice-to-have) for report building.
• Experience with DevOps/IT Ops concepts (monitoring, alerting, and observability tooling).
• Knowledge of version control (Git) and working in Agile/Scrum environments.
• Strong problem-solving mindset, clear communication skills, and a proactive attitude.

Why Join Us:
• Be part of a globally recognized brand committed to innovation and sustainability.
• Join a team that values data transparency, experimentation, and impact.
• Shape the future of brewing by enabling data-driven visibility across all operations.
• Work in an international, collaborative environment that encourages learning and growth.

If you are passionate about monitoring systems, making time-series data actionable, and enabling real-time decision-making, we invite you to join Global Analytics at HEINEKEN. Your expertise will help shape the future of our digital brewery operations.
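
To give a flavor of the PromQL skills listed under "We Expect", here are two expressions of the kind that commonly drive Grafana panels and alert rules (a sketch: the metric names follow standard Prometheus instrumentation conventions and depend on how services are actually instrumented):

```
# 95th-percentile request latency per service over 5 minutes
histogram_quantile(0.95, sum by (le, service) (rate(http_request_duration_seconds_bucket[5m])))

# Alert-style condition: 5xx error ratio above 5%
sum(rate(http_requests_total{status=~"5.."}[5m]))
  / sum(rate(http_requests_total[5m])) > 0.05
```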

Posted 1 day ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


JOB_POSTING-3-71879-1

Job Description

Role Title: AVP, Enterprise Logging & Observability (L11)

Company Overview
Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles.

Organizational Overview
Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, spanning Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.

Role Summary/Purpose
The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. The role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. It leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. The position bridges the gap between technology teams (applications, platforms, cloud, cybersecurity, infrastructure, DevOps), governance, audit, and risk teams, and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence.

Key Responsibilities

Splunk Development & Platform Management
• Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions.
• Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform.
• Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness.

Splunk ITSI Implementation & Management
• Develop and configure ITSI services, entities, and correlation searches.
• Implement notable events aggregation policies and automate response actions.
• Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches.
• Help identify patterns and anomalies in logs and metrics.
• Develop ML models for anomaly detection, capacity planning, and predictive analytics.
• Utilize Splunk MLTK to build and train models for IT operations monitoring.

Security & Compliance Enablement
• Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI).
• Enable visibility for encryption events, access anomalies, secrets management, and audit trails.
• Support security control mapping and automation through observability.

Stakeholder Engagement
• Act as a strategic advisor and point of contact for business units and for application, infrastructure, and security stakeholders leveraging Splunk.
• Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment.
• Maintain clear and timely communications across all levels of the organization.

Process & Governance
• Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies.
• Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness.
• Ensure alignment with enterprise architecture and data classification models.
• Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools.
• Mentor junior team members and guide engineering teams on secure, standardized logging practices.

Required Skills/Knowledge
• Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
• Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms.
• Splunk Subject Matter Expert (SME): strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations.
• Experience supporting security use cases, encryption visibility, secrets management, and compliance logging.
• Experience across Splunk development and platform management, security and compliance enablement, stakeholder engagement, and process and governance.
• Experience with Splunk Premium Apps: ITSI and Enterprise Security (ES) at a minimum.
• Experience with data streaming platforms and tools such as Cribl or Splunk Edge Processor.
• Proven ability to work in Agile environments using tools such as JIRA or JIRA Align.
• Strong communication, leadership, and stakeholder management skills.
• Familiarity with security, risk, and compliance standards relevant to BFSI.
• Proven experience leading product development teams and managing cross-functional initiatives using Agile methods.
• Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud.
• Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking.
• Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm.
• Develop scripts (Python, JavaScript, etc.) as needed in support of data collection or integration.
• Develop new applications leveraging Splunk’s analytic and Machine Learning tools to maximize performance, availability and security, improving business insight and operations.
• Support senior engineers in analyzing system issues and performing root cause analysis (RCA).

Desired Skills/Knowledge
• Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations.
• Exposure to SIEM integration, security orchestration, or SOAR platforms.
• Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging).
• Experience in BFSI or regulated industries with high-volume data handling.
• Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging.
• Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling.
• Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage.
• Awareness of data classification, retention, and masking/anonymization strategies.
• Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty).
• Experience with version control tools: Git, Bitbucket.

Eligibility Criteria
• Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology.
• Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms.
• Demonstrated success in managing large-scale logging platforms in regulated environments.
• Excellent communication, leadership, and cross-functional collaboration skills.
• Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes.
• Prior experience in large-scale, security-driven logging or observability platform development.
• Excellent problem-solving skills and the ability to work independently or as part of a team.
• Strong communication and interpersonal skills to interact effectively with team members and stakeholders.
• Knowledge of IT Service Management (ITSM) and monitoring tools.
• Knowledge of other data analytics tools or platforms is a plus.

WORK TIMINGS: 01:00 PM to 10:00 PM IST
This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.

For Internal Applicants
• Understand the criteria or mandatory skills required for the role before applying.
• Inform your manager and HRM before applying for any role on Workday.
• Ensure that your professional profile is updated (fields such as education, prior experience, other skills); uploading your updated resume (Word or PDF format) is mandatory.
• There must not be any corrective action plan (First Formal/Final Formal, PIP) in place.
• Only L9+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply.

Level/Grade: 11
Job Family Group: Information Technology
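
For a sense of the hands-on Splunk work described above, here is a minimal SPL search of the kind that feeds dashboards and alerts (illustrative: the index and sourcetype names are assumptions about local naming conventions):

```
index=app_logs sourcetype=access_combined status>=500
| stats count AS error_count BY host
| where error_count > 100
| sort -error_count
```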

Posted 1 day ago

Apply

6.0 years

0 Lacs

Greater Bengaluru Area

On-site


This job is with Capco, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year in the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Description
Background Verification (BGV) Specialist
Location: Bangalore
Department: Human Resources
Reports To: HR Operations Manager
Employment Type: Full-time

Job Summary
We are seeking a detail-oriented and process-driven BGV Specialist to manage and oversee the end-to-end background verification process. This role involves handling employee and TA queries, managing vendor relationships, ensuring policy compliance, and driving process improvements. The ideal candidate will be proactive, organized, and capable of working cross-functionally with internal and external stakeholders to ensure timely and accurate BGV execution.

Key Responsibilities
• Employee & TA Query Management: Respond to and resolve employee and TA queries related to BGV processes and delays, with 100% timely closures and proper documentation.
• Stakeholder Management: Collaborate with Leaders and TA teams to prioritize critical onboardings and communicate changes in BGV scope.
• Escalation Handling: Investigate and resolve escalated BGV cases within defined SLAs.
• Vendor Management: Conduct weekly calls with vendors, streamline processes, and ensure 100% compliance with SLAs and process standards.
• Invoice Validation: Review and validate vendor invoices for accuracy and coordinate with Finance for timely processing.
• SOP Maintenance: Update Standard Operating Procedures (SOPs) to reflect changes in BGV scope and ensure audit readiness.
• Audit Support: Prepare and present documentation for internal and external audits, ensuring 100% compliance.
• Process Improvement: Identify and implement at least one process improvement per quarter to enhance efficiency and compliance.
• Policy Compliance Monitoring: Ensure all BGV activities align with internal policies and legal requirements.
• Vendor Performance Review: Evaluate vendor performance weekly based on SLA, TAT, and quality metrics.
• Documentation & Record Keeping: Maintain accurate and retrievable BGV records for compliance and audit purposes.
• Reporting & Dashboarding: Generate weekly dashboards and MIS reports to track BGV metrics and SLA adherence.
• Training & Onboarding: Train new HR and TA team members on BGV processes and tools within onboarding timelines.
• Tool/Portal Management: Manage access, configurations, and updates for BGV tools and portals.
• Confidentiality & Data Privacy: Ensure secure handling and storage of sensitive candidate data.
• Verification Risk Oversight: Monitor and resolve risks in the verification lifecycle, including delays and discrepancies.
• Vendor Onboarding: Facilitate the onboarding of new vendors in line with policy and operational timelines.

Qualifications
• Bachelor’s degree in Human Resources, Business Administration, or a related field.
• 4–6 years of experience in background verification or HR operations.
• Strong understanding of compliance, data privacy, and audit requirements.
• Proficiency in Excel, HRMS tools, and BGV portals.
• Excellent communication and stakeholder management skills.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Posted 2 days ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


The role involves and covers the following support from a Business Analyst (BA) perspective:
• Work as a Business Analyst with an understanding of data and data architecture, metrics, and the importance of these in the design and delivery of new dashboards.
• Able to identify and engage data owners.
• Ability to translate business requirements into technical requirements.
• Ability to do process mapping, with experience in Risk Transformation.
• Knowledge of requirement gathering and BRD/FRD documentation.
• Relevant experience of the controls required to assure quality and completeness of the solution and its data.
• The ability to challenge what we are doing, to take our dashboard designs to the next level in terms of design and technology use.
• Experience of working with cross-line-of-business working groups to design dashboards and approve solutions.

Dashboard Delivery
• Demonstrable experience of designing dashboards to meet the requirements of a diverse group of users.
• A solid understanding of risk management.
• Knowledge of the latest dashboard technologies and trends.

Other Desirable Experience:
• Programme delivery on strategic programmes.
• Dealing with exec-level management.
• TOM development.
• Regulatory requirements mapping and gap analysis.
• Governance and reporting, and MI/dashboarding delivery.
• Knowledge of Counterparty Credit Risk and exposure calculation methodologies (simulation, aggregation, limit monitoring).
• Experience of implementing both modelled and non-modelled calculation algorithms.
• Previous experience of capturing and analyzing the daily movement of EAD numbers for financing products, calculating counterparty credit risk.
• Previous experience of validating counterparty exposure on a daily, monthly, and quarterly basis using various metrics, including exposure metrics (PFE, EPE, EEPE, EAD, etc.) and VaR computation using both the Internal Model Method (IMM) and standardized approaches such as CEM.
• Hands-on experience of exposure calculation (EAD/PFE) at portfolio level for both modelled (IMM) and non-modelled (CEM/SA-CCR, Credit VaR, CEF) transactions.
• Working knowledge of calculating and reporting default risk for traded products.
• Understanding of adjustments at the counterparty level where traded product exposure (derivatives, debt and equity financing) was found to be erroneous and material, to mitigate impact on risk monitoring, CVA, and RWA.
• Some exposure to credit risk reporting platforms and risk engines.

Skills and Qualifications
• CFA/FRM certification is a plus.
• Strong analytical skills and statistical background.
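
For orientation on the exposure metrics named above: under the Basel standardized approach (SA-CCR), exposure at default at the netting-set level combines replacement cost (RC) and potential future exposure (PFE). This is the standard Basel formulation; a given bank's internal implementation may differ in detail:

```latex
% Basel SA-CCR exposure at default (netting-set level)
EAD = \alpha \times (RC + PFE), \qquad \alpha = 1.4
```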

Posted 2 days ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Pune

Remote


Role: Retool Developer (Data Engineer)
Location: Remote (Anywhere in India only)
Shift: US - CST Time
Department: Data Engineering

Job Summary:
We are looking for a highly skilled and experienced Retool expert to join our team. In this role, you will be responsible for designing, developing, and maintaining internal tools and dashboards using the Retool platform. You will work closely with various teams to understand their needs and build effective solutions that improve operational efficiency and data visibility.

Job Responsibilities:
1. Design and build custom internal tools, dashboards, and applications using Retool to meet specific business requirements.
2. Connect Retool applications to various data sources, including SQL databases, real-time queues, data lakes, and APIs.
3. Write and optimize SQL queries to retrieve, manipulate, and present data effectively within Retool.
4. Utilize basic JavaScript to enhance Retool application functionality, create custom logic, and interact with data.
5. Develop interactive data visualizations and reports within Retool to provide clear insights and support data-driven decision-making.
6. Collaborate with business stakeholders and other technical teams to gather requirements, provide technical guidance, and ensure solutions align with business goals.
7. Troubleshoot, debug, and optimize Retool applications for performance and scalability.
8. Maintain clear documentation of Retool applications, including design, data connections, and logic.
9. Stay up to date with the latest Retool features and best practices to continually improve our internal tools.

Qualifications:
1. Strong proficiency in SQL for data querying, manipulation, and database management.
2. Solid understanding of basic JavaScript for scripting, custom logic, and enhancing user experience within Retool.
3. Demonstrated expertise in data visualization, including the ability to create clear, insightful, and user-friendly charts and graphs.
4. Ability to translate business needs into technical solutions using Retool.
5. Excellent problem-solving skills and attention to detail.
6. Strong communication and collaboration skills to work effectively with technical and non-technical teams.

Preferred Qualifications (Bonus Points):
1. Experience with other low-code/no-code platforms.
2. Familiarity with UI/UX principles for building intuitive interfaces.

Why Join Atidiv?
• 100% Remote | Flexible Work Culture
• Opportunity to work with cutting-edge technologies
• Collaborative, supportive team that values innovation and ownership
• Work on high-impact, global projects
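
A minimal sketch of the Retool pattern this role describes: a SQL resource query whose parameters are bound to UI components via Retool's double-curly template syntax. The binding syntax is real Retool; the table, column, and component names here are hypothetical:

```sql
-- Retool SQL resource query; {{ ... }} bindings reference app components
SELECT order_id,
       customer_name,
       status,
       total_amount
FROM orders                                -- hypothetical table
WHERE status = {{ statusDropdown.value }}  -- bound to a Retool dropdown component
ORDER BY created_at DESC
LIMIT 100;
```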

Posted 2 days ago

Apply

0 years

0 Lacs

India

On-site


Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.

Adaptive Planning and GL

Description
Veradigm is seeking an Adaptive Planning Administrator to join our applications-management IT team. In this role you will be responsible for managing the Workday Adaptive Planning platform as part of our Workday FIN, HCM and Adaptive estate. You will be part of our Workday IT team supporting the Workday environment, responsible for managing the Adaptive Planning platform to maximize the effectiveness of financial planning and reporting in collaboration with the FP&A and Accounting teams. This role also includes some responsibilities in Workday FIN, such as maintaining organization and worktag members and hierarchies, as well as supporting Workday GL processes. Veradigm is a United States-based company, therefore working in US hours may be required.

Responsibilities
• Manage the Adaptive Planning model, including the management of Levels, Dimensions & Attributes.
• Manage the creation and updating of Workday Organizations and Worktags, as well as the Adaptive Planning Levels, Dimensions, Accounts and attributes.
• Develop metric accounts and Adaptive Planning formulas/calculations per business requirements.
• Manage the input sheets and models to align with the FP&A planning requirements and facilitate efficient and effective forecasting.
• Monitor GL/HCM Adaptive integrations for timely and proper execution, and take corrective actions.
• Collaborate with the FP&A team to align Adaptive Planning functionality with business requirements.
• Manage Workday ledger-related configurations, collaborating with the Accounting and FP&A teams.
• Manage Adaptive user access rules and permission sets, as well as assisting in support of Workday FIN user/group security.
• Provide user support for Adaptive Planning and OfficeConnect.
• Ability to maintain Adaptive Planning reporting and dashboards would be an added advantage.

Qualifications
• Financial systems support and development experience with Adaptive Planning, including model building & management, data integrations, dashboarding and reporting, as well as access and permission-set security.
• OfficeConnect experience.
• Data analysis skills and understanding of financial data and terminology.
• Experience working closely with FP&A and Accounting teams.
• Excellent communication skills.
• Candidates should be willing to work the US afternoon shift (2:30 to 11:30 PM IST) or evening shift (5 PM to 2 AM IST).

We are an Equal Opportunity Employer. No job applicant or employee shall receive less favorable treatment or be disadvantaged because of their gender, marital or family status, color, race, ethnic origin, religion, disability or age; nor be subject to less favorable treatment or be disadvantaged on any other basis prohibited by applicable law. Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce.

Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!

Posted 2 days ago

Apply

2.0 years

0 Lacs

India

On-site


About Ascendeum: We provide AdTech strategy consulting to leading internet websites and apps hosting over 200 million monthly audiences worldwide. Since 2015, our consultants and engineers have consistently delivered intelligent solutions that enable enterprise-level websites and apps to maximize their digital advertising returns.

About the Role: We are seeking a highly analytical and detail-oriented Offshore Marketing & Data Analyst to support our growing analytics team. This role will focus on performance reporting, campaign analysis, and dashboard development across marketing channels. You will be responsible for transforming complex data into actionable insights and automated reporting for internal and client stakeholders.

Key Responsibilities
• Collect, analyze, and interpret marketing performance data across paid media, website, and CRM platforms.
• Build and maintain dashboards in tools like Tableau or Looker for internal teams and client reporting.
• Use SQL to query structured data sources and generate custom views or data extracts.
• Work with Google Analytics 4 (GA4) and understand user journey behavior, conversion paths, and attribution logic.
• Interpret and analyze media metrics.
• Collaborate with internal teams to support campaign tracking implementation and QA of data tags across platforms like Google Tag Manager.
• Assist in performance audits, pacing analysis, and campaign optimization recommendations.
• Build data pipelines or transformations using Python (basic scripting and automation).
• Support ad hoc requests for data and analysis.

Required Skills and Qualifications
• 2+ years in a marketing analytics, business intelligence, or data analyst role.
• Proficiency in GA4 and understanding of media buying platforms (Google Ads, Meta Ads, DSPs, etc.).
• Hands-on experience with dashboarding tools such as Tableau, Looker, or Power BI.
• Strong understanding of media performance metrics and digital KPIs.
• Proficient in SQL for data extraction, joins, and aggregations.
• Familiarity with Python for data wrangling and automation.
• Understanding of tagging and tracking methodologies, including UTM parameters, pixels, and tag managers.
• Ability to QA marketing tracking setups and identify discrepancies in data.
• Strong communication and time management skills, with the ability to work autonomously.

CTC bracket: up to 25 LPA

Thank you for your interest in joining Ascendeum.
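
As an illustration of the SQL work this role involves, here is a campaign roll-up by UTM source of the kind that typically feeds a Tableau or Looker dashboard (a sketch: the table and column names are hypothetical, and date-interval syntax varies by SQL dialect):

```sql
-- 30-day spend, conversions, and CPC by UTM source/campaign (hypothetical schema)
SELECT
    utm_source,
    utm_campaign,
    SUM(spend)                           AS total_spend,
    SUM(conversions)                     AS total_conversions,
    SUM(spend) / NULLIF(SUM(clicks), 0)  AS avg_cpc
FROM marketing_performance_daily
WHERE event_date >= CURRENT_DATE - INTERVAL '30' DAY
GROUP BY utm_source, utm_campaign
ORDER BY total_spend DESC;
```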

Posted 2 days ago

Apply

Exploring Dashboarding Jobs in India

Dashboarding has become an essential skill in the data analytics and business intelligence domain, with increasing demand for professionals who can create visually appealing and insightful dashboards. Job seekers in India have a plethora of opportunities in this field, with companies across various industries looking to hire talented dashboarding professionals.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The salary range for dashboarding professionals in India varies based on experience and location. Entry-level positions typically start at INR 3-5 lakhs per annum, while experienced professionals can earn between INR 8 and 15 lakhs per annum.

Career Path

In the field of dashboarding, a typical career path may involve starting as a Junior Dashboard Developer, progressing to a Senior Dashboard Developer, and eventually becoming a Dashboard Tech Lead. With experience and expertise, professionals can also explore roles such as Dashboard Architect or Business Intelligence Manager.

Related Skills

In addition to proficiency in dashboarding tools like Tableau, Power BI, or Qlik Sense, professionals in this field are often expected to have strong data visualization skills, knowledge of SQL and database management, and a good understanding of business intelligence concepts.

Interview Questions

  • How would you design a dashboard to track key performance indicators for a sales team? (medium)
  • Explain the difference between a filter and a parameter in Tableau. (basic)
  • Can you walk us through a time when you had to optimize a dashboard for better performance? (advanced)
  • What are some best practices for creating interactive dashboards for end users? (medium)
  • How do you handle missing data in your dashboard visualizations? (basic)
  • Describe a challenging dashboard project you worked on and how you overcame obstacles. (advanced)
  • What role does storytelling play in dashboard design? (medium)
  • How do you ensure that your dashboards are user-friendly and intuitive for non-technical audiences? (basic)
  • Explain the concept of dashboard drill-down and how it can be implemented in Tableau. (advanced)
  • What methods do you use to validate the accuracy and reliability of data in your dashboards? (medium)
  • How do you stay updated with the latest trends and features in dashboarding tools? (basic)
  • Can you discuss a time when you had to collaborate with a cross-functional team to deliver a dashboard project? (advanced)
  • What are some common pitfalls to avoid when designing dashboards? (medium)
  • How would you approach dashboard performance tuning in Power BI? (advanced)
  • What is your experience with mobile-responsive dashboard design? (basic)
  • How do you handle conflicting requirements from different stakeholders when designing a dashboard? (medium)
  • Explain the importance of color theory in dashboard design. (basic)
  • How do you ensure data security and confidentiality in your dashboard visualizations? (medium)
  • What metrics would you include in a dashboard to measure customer retention and churn rate? (advanced)
  • How do you incorporate user feedback to improve the usability of your dashboards? (medium)
  • Can you demonstrate your experience with creating calculated fields in Tableau? (basic) (see the example sketch after this list)
  • How would you approach data cleaning and preprocessing before building a dashboard? (medium)
  • Describe a time when you had to troubleshoot and resolve technical issues in a dashboard project. (advanced)
  • What are your thoughts on the future of dashboarding and data visualization technologies? (medium)
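
For the calculated-field question above, here is a small Tableau calculation of the sort candidates are often asked to write (the [Sales] field name is hypothetical; IF/ELSEIF/END is standard Tableau calculation syntax):

```
// Tableau calculated field: bucket records by sales value
IF [Sales] >= 10000 THEN "High"
ELSEIF [Sales] >= 1000 THEN "Medium"
ELSE "Low"
END
```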

Closing Remark

As you explore opportunities in the dashboarding job market in India, remember to showcase your skills, experience, and passion for creating impactful data visualizations. Prepare diligently for interviews, stay updated with the latest trends in dashboarding tools, and apply confidently to pursue a rewarding career in this dynamic field. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies