
11086 Visualization Jobs - Page 7

JobPe aggregates these listings for easy access, but applications are submitted directly on the original job portal.

7.0 years

0 Lacs

India

On-site


At Global Analytics, we’re driving HEINEKEN’s transformation into the world’s leading data-driven brewer. Our innovative spirit flows through the entire company, promoting a data-first approach in every aspect of our business. From brewery operations and logistics to IoT systems and sustainability monitoring, our smart data products are instrumental in accelerating growth and operational excellence. As we scale our analytics and observability solutions globally, we are seeking a Grafana Developer to join our dynamic Global Analytics team.
About the Team: The Global Analytics team at HEINEKEN is a diverse group of Data Scientists, Data Engineers, BI Specialists, and Translators, collaborating across continents. Our culture promotes experimentation, agility, and bold thinking. Together, we transform raw data into impactful decisions that support HEINEKEN’s vision for sustainable, intelligent brewing.
Grafana Developer: We are looking for a Grafana Developer to build and maintain real-time dashboards that support our IoT monitoring, time series analytics, and operational excellence initiatives. This is a hands-on technical role where you will collaborate with multiple teams to bring visibility to complex data across global operations.
If you are excited to: Build real-time dashboards and monitoring solutions using Grafana. Work with InfluxDB, Redshift, and other time-series and SQL-based data sources. Translate complex system metrics into clear visual insights that support global operations. Collaborate with engineers, DevOps, IT Operations, and product teams to bring data to life. Be part of HEINEKEN’s digital transformation journey focused on data and sustainability.
And if you like: A hybrid, flexible work environment with access to cutting-edge technologies. Working on impactful projects that monitor and optimize global brewery operations. A non-hierarchical, inclusive, and innovation-driven culture. Opportunities for professional development, global exposure, and knowledge sharing.
Your Responsibilities: Design, develop, and maintain Grafana dashboards and visualizations for system monitoring and analytics. Work with time-series data from InfluxDB, Prometheus, Elasticsearch, and relational databases like MySQL, PostgreSQL, and Redshift. Optimize dashboard performance by managing queries, data sources, and caching mechanisms. Configure alerts and notifications to support proactive operational monitoring. Collaborate with cross-functional teams, including DevOps, IT Operations, and Data Analytics, to understand and address their observability needs. Utilize Power BI (optional) to supplement dashboarding with additional reports. Customize and extend Grafana using plugins, scripts, or automation tools as needed. Stay current with industry trends in data visualization, real-time analytics, and the Grafana/Power BI ecosystem.
We Expect: 4–7 years of experience developing Grafana dashboards and time-series visualizations. Strong SQL/MySQL skills and experience working with multiple data sources. Hands-on experience with Grafana and common data backends such as InfluxDB, Prometheus, PostgreSQL, Elasticsearch, or Redshift. Understanding of time-series data vs. traditional data warehouse architecture. Familiarity with scripting languages (e.g., JavaScript, Python, Golang) and query languages like PromQL. Experience configuring alerts and automating monitoring workflows. Exposure to Power BI (nice-to-have) for report building.
Experience with DevOps/IT Ops concepts (monitoring, alerting, and observability tooling). Knowledge of version control (Git) and working in Agile/Scrum environments. Strong problem-solving mindset, clear communication skills, and a proactive attitude. Why Join Us: Be part of a globally recognized brand committed to innovation and sustainability. Join a team that values data transparency, experimentation, and impact. Shape the future of brewing by enabling data-driven visibility across all operations. Work in an international, collaborative environment that encourages learning and growth. If you are passionate about monitoring systems, making time-series data actionable, and enabling real-time decision-making, we invite you to join Global Analytics at HEINEKEN. Your expertise will help shape the future of our digital brewery operations.
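As a rough illustration of the alerting logic this role describes (evaluating a PromQL expression against a threshold, which is what a Grafana alert rule encodes), here is a minimal Python sketch; the Prometheus endpoint and metric name are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: evaluate a PromQL expression against a Prometheus data
# source and flag a breach, mirroring what a Grafana alert rule would do.
# The endpoint URL, metric name, and threshold below are placeholders.
import requests

PROMETHEUS_URL = "http://prometheus.example.internal:9090"  # hypothetical endpoint
QUERY = "avg_over_time(brewline_tank_temperature_celsius[5m])"  # hypothetical metric
THRESHOLD = 8.0

def check_alert() -> None:
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": QUERY},
        timeout=10,
    )
    resp.raise_for_status()
    for result in resp.json()["data"]["result"]:
        labels = result["metric"]
        _, value = result["value"]  # [unix_timestamp, value_as_string]
        if float(value) > THRESHOLD:
            print(f"ALERT: {labels} at {value} exceeds threshold {THRESHOLD}")

if __name__ == "__main__":
    check_alert()
```

In practice the same expression and threshold would live in a Grafana alert rule; the sketch simply shows the evaluation step in plain code.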

Posted 20 hours ago

Apply

2.0 years

0 Lacs

India

On-site


Location: In-Person | Novel Tech Park, HSR Layout, Bengaluru Type: Full-Time What We’re Building At Sapience1, we’re on a mission to transform how families discover and access youth services from academics and enrichment to life skills and care using behavioral AI, intelligent design, and seamless technology. We’re not building just another app. We’re engineering the future of Human Experience Tech where services feel smart, personal, and human. We’re looking for a builder — someone who thrives in fast-moving environments and loves turning complex challenges into real-world products. Job Description We are seeking a versatile MERN Stack Developer to build and maintain the web-based dashboards for Sapience1's internal teams: the Support Web Dashboard (Admin/Operator) and the Business Web Dashboard (Product Owner). You will be responsible for developing robust and efficient web applications using MongoDB, Express.js, React.js, and Node.js. Your work will empower our operations and business teams with critical data, analytics, and administrative controls. Responsibilities Develop full-stack web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js) for the Support and Business Dashboards. Design and implement responsive and intuitive user interfaces for complex data visualization and administrative functions. Build secure and scalable backend APIs using Node.js and Express.js to interact with MongoDB. Develop modules for user management, session monitoring, trust & safety flagging, ticketing, and reporting for the Support Dashboard. Implement features for platform metrics, financial oversight, promotions management, and AI performance monitoring for the Business Dashboard. Ensure seamless integration with other backend services and APIs, including those related to Hello Edison™ AI. Collaborate with UI/UX designers to translate wireframes and mockups into functional code. Implement robust authentication, authorization (role-based access control), and security measures for the dashboards. Write clean, well-documented, and testable code, adhering to coding standards. Qualifications Bachelor's degree in Computer Science, Engineering, or a related field. 2+ years of experience in MERN Stack development. Strong proficiency in JavaScript, Node.js, Express.js, and React.js. Experience with MongoDB and Mongoose (or other ORM/ODM). Understanding of RESTful APIs and asynchronous programming. Familiarity with MVC/MVVM architecture patterns. Experience with version control systems (Git). Ability to work in an agile development environment. Knowledge of data visualization libraries is a plus. Compensation Competitive, based on experience. 12,00,000 LPA

Posted 20 hours ago

Apply

5.0 years

0 Lacs

India

On-site


Digital Analyst – Web & Experience Analytics
Job Summary: The Digital Analyst will drive insights into user behavior and site performance across the digital landscape. This role supports experience optimization through advanced analytics, journey analysis, and actionable reporting for stakeholders across UX, Product, and Marketing.
Key Responsibilities
● Analyze user journeys, funnels, traffic patterns, and engagement metrics to inform experience strategy.
● Develop dashboards and performance reports using tools like Tableau or Power BI.
● Translate complex analytics into clear business insights for cross-functional teams.
● Collaborate with optimization and marketing teams to support A/B testing and personalization analysis.
● Ensure data quality through validation of tagging and analytics setup (via GTM or Adobe Launch).
● Identify user behavior trends to surface actionable UX and content insights.
Required Skillset & Experience
● Minimum 5 years of experience in digital analytics or web data analysis.
● Strong knowledge of GA4, Adobe Analytics, or other enterprise analytics tools.
● Experience with visualization tools such as Tableau, Power BI, or Looker.
● Ability to translate data into clear, actionable insights.
● Good understanding of customer journeys and digital experience KPIs.
● Familiarity with tagging frameworks and GTM/Adobe Launch.
Technology Platforms & Tools
● GA4
● Adobe Analytics
● GTM / Adobe Launch
● Tableau / Power BI / Looker Studio
● Hotjar
● Excel/Sheets
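For a flavour of the funnel analysis this role describes, here is a minimal pandas sketch over an exported event table; the file name, column names, and funnel steps are assumptions for illustration, not specifics from the posting.

```python
# Minimal funnel sketch on an exported analytics event table (for example a
# GA4 export flattened to CSV). Column names and steps are assumptions.
import pandas as pd

events = pd.read_csv("events.csv")  # assumed columns: user_id, event_name, event_ts

FUNNEL = ["page_view", "add_to_cart", "begin_checkout", "purchase"]

users_at_step = {}
remaining = set(events["user_id"].unique())
for step in FUNNEL:
    step_users = set(events.loc[events["event_name"] == step, "user_id"])
    remaining &= step_users          # users who reached every step so far
    users_at_step[step] = len(remaining)

funnel = pd.Series(users_at_step, name="users")
print(funnel)
print((funnel / funnel.iloc[0]).rename("conversion_rate"))
```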

Posted 20 hours ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are seeking a skilled Data Engineer with 6+ years of experience to design, build, and maintain scalable data pipelines and perform advanced data analysis to support business intelligence and data-driven decision-making. The ideal candidate will have a strong foundation in computer science principles, extensive experience with SQL and big data tools, and proficiency in cloud platforms and data visualization tools.
Key Responsibilities: Design, develop, and maintain robust, scalable ETL pipelines using Apache Airflow, DBT, Composer, Control-M, Cron, Luigi, and similar tools. Build and optimize data architectures including data lakes and data warehouses. Integrate data from multiple sources ensuring data quality and consistency. Collaborate with data scientists, analysts, and stakeholders to translate business requirements into technical solutions. Analyze complex datasets to identify trends, generate actionable insights, and support decision-making. Develop and maintain dashboards and reports using Tableau, Power BI, and Jupyter Notebooks for visualization and pipeline validation. Manage and optimize relational and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, and DynamoDB. Work with big data tools and frameworks including Hadoop, Spark, Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Utilize cloud data services and warehouses like AWS Glue, GCP Dataflow, Azure Data Factory, Snowflake, Redshift, and BigQuery. Support CI/CD pipelines and DevOps workflows using Git, Docker, Terraform, and related tools. Ensure data governance, security, and compliance standards are met. Participate in Agile and DevOps processes to enhance data engineering workflows.
Required Qualifications: 6+ years of professional experience in data engineering and data analysis roles. Strong proficiency in SQL and experience with database management systems such as MySQL, PostgreSQL, Oracle, and MongoDB. Hands-on experience with big data tools like Hadoop and Apache Spark. Proficient in Python programming. Experience with data visualization tools such as Tableau, Power BI, and Jupyter Notebooks. Proven ability to design, build, and maintain scalable ETL pipelines using tools like Apache Airflow, DBT, Composer (GCP), Control-M, Cron, and Luigi. Familiarity with data engineering tools including Hive, Kafka, Informatica, Talend, SSIS, and Dataflow. Experience working with cloud data warehouses and services (Snowflake, Redshift, BigQuery, AWS Glue, GCP Dataflow, Azure Data Factory). Understanding of data modeling concepts and data lake/data warehouse architectures. Experience supporting CI/CD practices with Git, Docker, Terraform, and DevOps workflows. Knowledge of both relational and NoSQL databases, including PostgreSQL, BigQuery, MongoDB, and DynamoDB. Exposure to Agile and DevOps methodologies. Experience with Amazon Web Services (S3, Glue, Redshift, Lambda, Athena).
Preferred Skills: Strong problem-solving and communication skills. Ability to work independently and collaboratively in a team environment. Experience with service development, REST APIs, and automation testing is a plus. Familiarity with version control systems and workflow automation.
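To make the pipeline orchestration concrete, here is a minimal Apache Airflow DAG sketch (a recent Airflow 2.x release is assumed) showing the extract → transform → load pattern the role describes; the DAG name and task bodies are placeholders.

```python
# Minimal Airflow DAG sketch for a daily extract -> transform -> load run.
# Task bodies only print placeholders; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw data from source systems")

def transform(**_):
    print("clean and conform the extracted data")

def load(**_):
    print("write curated data to the warehouse")

with DAG(
    dag_id="example_daily_etl",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```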

Posted 21 hours ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Role: Associate
Exp: 3-6 Years
Location: Kolkata
Technical Skills:
· Strong expertise in .Net Core & MVC, Entity Framework along with ADO.Net
· Experience with Angular v6+ and TypeScript
· Experience with HTML/UI/CSS
· Experience with REST APIs
· Knowledge of SQL Server
· Understanding of OOP concepts, LINQ/Lambda
Roles and Responsibilities:
· Creating UI/HTML design as per the requirement.
· Bind data with custom controls with custom validation & logic building as per requirement.
· API integration and changes as per requirement.
· Understand client requirements.
· Prepare JSON data objects using models & create REST APIs as per requirement.
· Develop complex logic as per business requirement using Entity Framework.
· Handle exceptions for both API & DB and create/modify stored procedures/functions, etc.
Mandatory skill sets: .Net Core / Angular v6+ / Fullstack
Preferred skill sets: .Net Core / Angular v6+ / Fullstack
Years of experience required: 3-6 Years
Education qualification: B.E. (B.Tech) / M.E / M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: .NET Core, AngularJS
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date

Posted 21 hours ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


About Company: BDO India LLP (or BDO India) is the India member firm of BDO International. BDO India offers advisory, accounting, tax & digital services for both domestic and international organizations across a range of industries. BDO India is led by more than 350 Partners & Directors with a team of over 10000 professionals operating across 14 cities. We expect to grow sizably in the coming 3-5 years, adding various dimensions to our business and multiplying the current team size multi-fold.
About Technology Services: In the world of business today, the words ‘disrupt, innovate, and transform’ have become daily goals. For companies to gain recognition, digital transformation is key. It increases profitability & retention, enhances governance, controls & compliance, and overall organisation growth. However, most companies struggle to meet their digital transformation goals. At BDO in India, we assist clients in their digital transformation journey, enabling them to realign the way they do business and create more value. Leveraging our deep industry knowledge and extensive experience, our industry experts provide a holistic approach to clients looking for a successful digital transformation. Whether the focus is on automating tax compliance, enhancing customer interaction, increasing profitability through operational excellence, or improving employee experience and retention, we apply a proven framework and methodology that includes professionals with knowledge and experience.
Overall objective of the position: The Data/Report Analyst is responsible for the continuous improvement of situational awareness and data maturity via creation of analytics and reporting capability. This includes: Identifying analytics requirements – working with the Business leadership to determine their data and information use cases and their data and information requirements, capturing and developing business rules/logic and relevant metrics. Sourcing and modelling data. This may involve: creating suitable target-state data models for analysis; integrating source systems or extracting data from databases as required (in conjunction with DBAs & dev team); performing any data transformations necessary to create target-state datasets. Developing analytics – creating reports and information artefacts to provide real time intelligence and predictive capability. Creation of reports and dashboards in Power BI in accordance with business requirements. Finding and optimizing relations between various data requirements for reporting.
Key Accountabilities: Day-to-day responsibilities and expectations for this position: Developing well documented and defined use cases and requirements. Modelling and populating appropriate datasets for analysis. Creating information artefacts to support operational management and oversight. Creation of interactive reporting in a Power BI environment. Administration of data visualization services across a variety of Microsoft cloud technologies. Creation, publication, configuration, operation, and maintenance of data visualization / solution dashboards. Standardisation and documentation of common procedures that cannot be easily automated, for execution by other resources or teams. Proactive monitoring of the data visualization infrastructure and associated services. Identifying analytics and reporting requirements – working with the Business leadership to determine and document their reporting requirements.
Data engineering by designing, deploying, and maintaining ETL/ELT logic in a modern cloud-based data estate. Experienced in designing and implementing data platform, reporting, and analytics solutions in the Microsoft Azure ecosystem. Provide customer support in addition to general duties and contribution to team objectives. Automation of common procedures used to resolve Incidents, Problems and Change Requests.
Qualifications, Experience, Skills: Relevant qualifications in IT or a related discipline are desirable but not essential. Basic, proven (1-3 years) experience in Power BI. Knowledge of M language and DAX is required. Certifications are desirable but not essential. Skills in data extraction and systems integration highly desirable. Appropriate communication and inter-personal skills, particularly the ability to work across international cultures. Strong written communication skills are also required to produce reports and technical documentation. Experience in business analysis, data modelling and reporting highly regarded. Experience with Azure or other cloud environments and/or IT Service Management functions desirable. Demonstrate intermediate knowledge in the following technologies: Azure SQL Database / MySQL Database; Azure Cosmos DB / MongoDB. Demonstrate intermediate-to-advanced knowledge in the following technologies: ETL techniques and principles, data modelling, data visualization, Microsoft Power BI, Power Query, DAX, T-SQL. Experience working with custom internally developed applications and interacting with development teams preferred. Effectively demonstrates leadership, teamwork, problem solving, initiative and integrity. Excellent interpersonal and written communication skills; technical writing experience preferred. Knowledge of the following technologies will be an advantage: Azure Data Factory / Azure Synapse Pipelines, Azure Data Lake, Azure Analysis Services.

Posted 21 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Summary: The Digital Business Analyst will be a key actor in the creation and implementation of a new Business Intelligence (BI) system that will enhance the visibility of sales, forecasts, and market share details across various departments, including Finance, Marketing, Sales, and other areas. This role involves working closely with business stakeholders to understand their needs, developing and configuring BI solutions, and ensuring the effective use of data for decision-making.
Key Responsibilities: Collaborate with cross-functional teams and business stakeholders to gather, analyze, and challenge business requirements for BI solutions. Work with IT and Data Analyst teams to develop, configure, and model BI solutions. Ensure the BI system is scalable and capable of handling a large number of users effectively. Conduct data validation and quality assurance to ensure the accuracy and reliability of BI outputs. Continuously monitor and optimize the BI system for performance and usability. Participate in the maintenance and evolution of the BI solution in support of the Product Owner. Coordinate with Data Owners, Business Data Stewards, and other stakeholders to align the BI product with the company's data strategy. Provide training and support to end-users to maximize the utilization of the BI system and effective data-driven decision-making. Assist and animate the BI Champions and support requests for the development or enhancement of BI solutions. Provide end-to-end data and business expertise to ensure high-quality processes and outcomes. Occasionally handle data visualization tasks related to value creation. Supervise and guide junior business analysts as needed.
Key Achievements Expected: Ensure the build, deployment, and evolution of a new BI system according to the project plan: successfully design, implement, and maintain a comprehensive BI solution that delivers key business indicators linked to sales, forecasts, market share, and other critical metrics. Assist the Product Owner in planning and prioritizing developments and ensuring the availability of new data and functionalities. Develop/design BI reports and dashboards: create or design detailed reports and dashboards to track important metrics and provide valuable insights to answer the business needs; write the user stories required. Collaborate with cross-functional teams and business stakeholders: work closely with various departments to gather and analyze business requirements for BI solutions. User training: train users on data utilization and visualization to ensure effective data-driven decision-making. Ensure scalability and user management: make sure the BI system is scalable and capable of handling a large user base effectively. Conduct data validation and quality assurance: ensure the accuracy and reliability of BI outputs through rigorous data validation and quality assurance processes. Optimize system performance: continuously monitor and optimize the BI system for performance and usability, and ensure data availability and functionality within the required timelines and quality standards. Ensure the lifecycle management of BI solutions, including collecting evolution needs, improvements, and problem resolution. Support BI Champions: assist BI Champions and support requests for the development or enhancement of BI solutions.
Technologies: Power BI, Power Apps, Power Automate, SharePoint, SQL, Azure Databricks (PySpark).
Good to have

Posted 21 hours ago

Apply

3.0 years

0 Lacs

Jharkhand, India

On-site


WHO ARE WE: Tata Steel Foundation is a wholly owned subsidiary of Tata Steel Limited and was instituted on August 16, 2016. With over 1500 members spread over ten units and six states of Jharkhand, West Bengal, Odisha, Uttarakhand, Maharashtra, and Punjab, the Foundation is a CSR implementing organization focused upon co-creating solutions, with tribal and excluded communities, to address their development challenges. The organization is committed to playing a larger role in India’s sustainable development by embedding wider economic, social, and environmental objectives through its programmes and underscored by a vision to create an enlightened and equitable society.
Location of Posting: Jharkhand / Odisha. Employment Type: Contractual
Purpose of the Role: As a Data Coordinator at TSF, you will collect, collate, analyze, and disseminate data and information related to our programmes, thereby enabling informed decision-making and impact measurement. This position requires attention to detail, excellent organizational skills, and a passion for using data to drive positive change.
Key Deliverables: Develop data collection systems for the different programmes and ensure timely data collection. Manage, organize, and maintain databases. Conduct regular data audits and visit the field to identify and address data quality issues or discrepancies and ensure the veracity of data. Generate reports, dashboards, and visualizations, and conduct data analysis on a regular basis to present findings and insights to stakeholders. Ensure timely submission of MIS reports and presentations, in collaboration with programme teams. Provide training and support to programme staff in data management and report generation. Ensure compliance with data protection, privacy, and security regulations and policies of TSF. Support internal and external audits related to data management and reporting.
Minimum Qualification: Bachelor's degree in a relevant field such as Management Information Systems, Data Science, Computer Science, or a related discipline. Work experience: Candidates with working experience in the development sector will be given preference.
Technical Skill Sets: Proven experience (3 years) in data management, coordination, reporting, and analysis. Experience in the development sector / CSR data management is preferred. Expertise in MS Excel, including data cleaning, advanced pivots, Power Query, macros, Visual Basic, statistical analysis, inter alia. Proficiency in MIS software applications, databases, and statistical software (e.g. SPSS) is desirable. Familiarity with data visualization tools (e.g. Power BI) is desirable. Strong analytical and problem-solving skills with attention to detail. Effective communication skills with the ability to present findings in a clear and concise manner.
Behavioral Skill Sets: Aligned to TATA Values (Integrity, Responsibility, Excellence, Pioneering, Unity, and Respect). Ability to engage across stakeholder groups, peers, and communities. Strong interpersonal skills and a collaborative approach. Self-motivated, result-oriented, sensitive to cultures and diversity. Good written and verbal communication skills.
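As an illustration of the Excel-style summarisation this role mentions, here is a minimal pandas sketch of a pivot-table MIS summary; the file name and column names are hypothetical placeholders.

```python
# Minimal sketch: a monthly MIS-style summary with pandas, analogous to an
# Excel pivot table. File and column names are illustrative placeholders.
import pandas as pd

df = pd.read_excel("programme_data.xlsx")  # assumed columns: programme, district, month, beneficiaries

summary = pd.pivot_table(
    df,
    values="beneficiaries",
    index=["programme", "district"],
    columns="month",
    aggfunc="sum",
    fill_value=0,
    margins=True,          # adds grand totals, like an Excel pivot
)
summary.to_excel("mis_summary.xlsx")
print(summary.head())
```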

Posted 21 hours ago

Apply

8.0 years

0 Lacs

Bhilai, Chhattisgarh, India

On-site


Job Description We are seeking a highly skilled and self-driven Senior GIS Analyst with deep expertise in ArcGIS Pro to join our team. The ideal candidate will have 5–8 years of experience working with spatial data, delivering geospatial solutions, and engaging directly with clients to understand requirements, present findings, and drive project execution. Roles & Responsibilities Design, develop, and manage GIS-based solutions using ArcGIS Pro and related Esri tools. Conduct spatial data analysis, geoprocessing, and visualization in both 2D and 3D environments. Automate workflows and support advanced geospatial modeling using ArcPy/Python. Lead end-to-end project execution, including requirement gathering, solution design, implementation, and delivery. Serve as a primary point of contact for clients – participate in meetings, understand business problems, and translate them into technical GIS workflows. Collaborate cross-functionally with data engineers, developers, and business analysts to integrate geospatial intelligence into broader business solutions. Stay updated with Esri product releases and GIS industry trends. Skills Qualifications and Skills 5–8 years of professional experience working with ArcGIS Pro, ArcGIS Online/Enterprise, and related Esri tools. Strong proficiency in spatial analysis, data management, and map creation. Hands-on experience with Python scripting (ArcPy) for automating tasks and performing advanced geoprocessing. Experience working with geodatabases, feature classes, shapefiles, and spatial data formats. Proven experience in client interactions, stakeholder communication, and project ownership. Strong problem-solving skills with attention to detail. Excellent verbal and written communication skills. Preferred Qualifications Experience with ArcGIS Enterprise or ArcGIS Online publishing and administration. Familiarity with web GIS, dashboards, and StoryMaps. Experience integrating GIS with business intelligence tools or relational databases. Knowledge of spatial databases (PostGIS, SQL Server Spatial, etc.) is a plus. Experience 5-8 Years
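To illustrate the ArcPy automation the role calls for, here is a minimal sketch that would run inside an ArcGIS Pro Python environment; the geodatabase path, layer names, and buffer distance are hypothetical.

```python
# Minimal ArcPy sketch (requires an ArcGIS Pro Python environment) of a
# simple geoprocessing automation. Dataset names and distance are placeholders.
import arcpy

arcpy.env.workspace = r"C:\data\project.gdb"   # hypothetical file geodatabase
arcpy.env.overwriteOutput = True

# Buffer all facility points by 500 metres, then count the resulting polygons.
arcpy.analysis.Buffer("facilities", "facilities_buffer_500m", "500 Meters")
count = int(arcpy.management.GetCount("facilities_buffer_500m")[0])
print(f"Created {count} buffer polygons")
```

In a real project this kind of snippet would typically sit inside a larger ArcPy workflow or a scheduled script rather than run standalone.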

Posted 21 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


General Skills & Experience: Minimum 10-18 yrs of Experience • Expertise in Spark (Scala/Python), Kafka, and cloud-native big data services (GCP, AWS, Azure) for ETL, batch, and stream processing. • Deep knowledge of cloud platforms (AWS, Azure, GCP), including certification (preferred). • Experience designing and managing advanced data warehousing and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse). • Proven experience with building, managing, and optimizing ETL/ELT pipelines and data workflows for large-scale systems. • Strong experience with data lakes, storage formats (Parquet, ORC, Delta, Iceberg), and data movement strategies (cloud and hybrid). • Advanced knowledge of data modeling, SQL development, data partitioning, optimization, and database administration. • Solid understanding and experience with Master Data Management (MDM) solutions and reference data frameworks. • Proficient in implementing Data Lineage, Data Cataloging, and Data Governance solutions (e.g., AWS Glue Data Catalog, Azure Purview). • Familiar with data privacy, data security, compliance regulations (GDPR, CCPA, HIPAA, etc.), and best practices for enterprise data protection. • Experience with data integration tools and technologies (e.g. AWS Glue, GCP Dataflow , Apache Nifi/Airflow, etc.). • Expertise in batch and real-time data processing architectures; familiarity with event-driven, microservices, and message-driven patterns. • Hands-on experience in Data Analytics, BI & visualization tools (PowerBI, Tableau, Looker, Qlik, etc.) and supporting complex reporting use-cases. • Demonstrated capability with data modernization projects: migrations from legacy/on-prem systems to cloud-native architectures. • Experience with data quality frameworks, monitoring, and observability (data validation, metrics, lineage, health checks). • Background in working with structured, semi-structured, unstructured, temporal, and time series data at large scale. • Familiarity with Data Science and ML pipeline integration (DevOps/MLOps, model monitoring, and deployment practices). • Experience defining and managing enterprise metadata strategies.

Posted 21 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


you’ll be our: Vehicle Service Partnerships Manager (RSA)
you’ll be based at: Bangalore
you’ll be aligned with: Lead - Vehicle Service Strategy
you’ll be a member of: Service & Support
What will I be doing at Ather? Vendor Management: Negotiate and maintain contracts with RSA vendors to secure competitive pricing. Monitor vendor operations and ensure adherence to agreed-upon Service Level Agreements (SLAs). Operational Oversight: Train and monitor the RSA contact center on business rules and processes. Serve as the primary point of contact for the contact center and vendors for real-time support on live RSA incidents, requiring availability and responsiveness. Track and monitor overall RSA incident rates, analyzing data for insights into vehicle failure trends. Financial Management: Verify the legitimacy of vendor billing, ensuring accurate amounts are invoiced. Collect and maintain comprehensive documentation for all RSA cases for audit purposes. Manage the end-to-end invoice process, including uploading bills into internal systems, obtaining necessary approvals, correctly attributing expenses across internal Profit & Loss (P&L) centers, and ensuring timely vendor payments. Customer Focus: Collect and analyze customer feedback related to RSA experiences to identify areas for improvement. Strategic Improvement: Proactively plan and implement enhancements to RSA systems and processes to optimize efficiency and customer satisfaction. Drive profitability of the RSA portfolio by ensuring the achievement of plan sales targets. Technical Understanding: Leverage basic automobile technology knowledge to effectively manage and understand the context of RSA incidents.
What kind of experience & skills do I need for this role? Domain Skills/Knowledge: Understanding of roadside assistance operations and best practices. Basic knowledge of automobile technology to comprehend the nature of RSA incidents. Experience in vendor management, including negotiation and contract management. Familiarity with financial processes such as invoicing, payment processing, and budget management. Knowledge of customer service principles and feedback analysis. Understanding of Service Level Agreements (SLAs) and their importance in vendor relationships. Technical Skills: Proficiency in data analysis and reporting tools (e.g., Excel, data visualization software). Experience with CRM or similar systems for managing vendor and case information. Familiarity with internal financial systems for invoice processing and expense attribution. Work Experience: 3-5 years of experience in vendor management, partnerships, or operations, preferably within the automotive or service industry. Experience in managing external service providers and monitoring their performance. Exposure to financial processes and budget management.
Behavioral Competencies: Delivering Results, Influencing People, Relationship Management.
Some attributes that you bring to this role: Strong negotiation and communication skills for effective vendor management. Excellent analytical and problem-solving abilities to identify trends and resolve operational issues. Ability to be responsive and make decisions under pressure, given the 24x7 nature of the service. Strong organizational skills and attention to detail for managing contracts, invoices, and documentation. Proactive approach to identifying and implementing process improvements. Ability to work independently and take ownership of the RSA portfolio's performance.
What should I have graduated in?
A Bachelor's / Master’s degree in Business Administration, Operations Management, Engineering (preferably Mechanical or Automotive), or a related field.

Posted 21 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Role: Data Engineer
YOE: 8 to 10 years
Locations: Bangalore, Hyderabad, Pune, Mumbai
Job Description: We are seeking an experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have over 8 years of experience in designing and implementing large-scale data processing pipelines. The role requires extensive hands-on expertise in Python and SQL scripting, with a strong understanding of modern data frameworks like Apache Airflow and Apache Spark. Exposure to cloud technologies, specifically AWS (S3, Lambda, Glue), is essential. The successful candidate will be instrumental in building, maintaining, and optimizing data systems that power data-driven decisions across the organization.
Key Responsibilities: Design, develop, and optimize data pipelines and workflows to process large-scale datasets efficiently. Implement and manage Apache Airflow for orchestrating ETL pipelines and ensuring data availability. Use Apache Spark for distributed data processing, optimization, and ensuring high-performance analytics. Write clean, efficient, and scalable Python scripts for data extraction, transformation, and loading (ETL). Develop and optimize complex SQL queries for data analysis and reporting. Manage cloud data storage (primarily AWS S3) and compute services (AWS Lambda, AWS Glue) to support data operations. Collaborate with cross-functional teams (Data Analysts, Data Scientists, Business Analysts) to ensure data availability, integrity, and quality. Troubleshoot and optimize existing systems to ensure smooth data flow and minimal downtime. Lead the technical direction for the development of data engineering practices across the team.
Required Skills and Qualifications: 8+ years of professional experience in data engineering, data analysis, or related fields. Expertise in Python for building and optimizing data processing workflows. Advanced proficiency in SQL for querying and manipulating large datasets. Experience with Apache Airflow for orchestrating complex data workflows and managing dependencies. Solid experience in Apache Spark for distributed data processing and optimization of performance. Exposure to AWS technologies, specifically S3 (for storage), Lambda (for serverless compute), and Glue (for ETL and data cataloging). Strong understanding of data modeling, data pipelines, and ETL best practices. Experience with version control tools like Git. Ability to work with large, complex datasets and troubleshoot performance bottlenecks. Strong problem-solving skills and attention to detail. Ability to work independently and in collaborative team environments.
Preferred Skills (Nice to Have): Experience with additional cloud services such as AWS Redshift, AWS Athena, or AWS EMR. Familiarity with data visualization tools like Tableau, Power BI, or similar. Experience with Docker or containerization for building deployable environments. Knowledge of CI/CD pipelines for automating data processing workflows.
Thanks & Regards, Yuvaraj U, Client Engagement Executive | Vy Systems | Mobile: 9150019640 | Email: yuvaraj@vysystems.com | www.vysystems.com
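As a concrete example of the Spark-plus-S3 processing described above, here is a minimal PySpark sketch that reads raw events from S3, aggregates them, and writes partitioned Parquet; the bucket paths and column names are placeholders, not details from the posting.

```python
# Minimal PySpark sketch: read raw JSON events from S3, roll them up by day
# and event type, and write partitioned Parquet. Paths/columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

events = spark.read.json("s3://example-raw-bucket/events/2024-06-01/")  # hypothetical path

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/event_rollup/"
)
spark.stop()
```

A job like this would typically be scheduled from an Airflow DAG rather than run by hand.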

Posted 21 hours ago

Apply

0 years

0 Lacs

India

Remote


Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
About WebBoost Solutions by UM: WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.
Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.
Requirements
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.
Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.
How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 27th June 2025
Equal Opportunity: WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
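For context on the model-development work mentioned above, here is a minimal scikit-learn sketch of a train/test split, fit, and evaluation; it uses a bundled toy dataset so it runs as-is and is purely illustrative.

```python
# Minimal predictive-modeling sketch with scikit-learn: split, fit, evaluate.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```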

Posted 21 hours ago

Apply

5.0 years

0 Lacs

India

On-site


Our Company: We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.
The Role: We are seeking a highly skilled and experienced Business Intelligence and Reporting Technical Lead to join our team. The ideal candidate will be responsible for leading the design, development, and implementation of BI solutions and reporting frameworks. This role requires a deep understanding of data analytics, strong technical skills, and the ability to collaborate with cross-functional teams to deliver actionable insights.
Key Responsibilities: Lead the development and maintenance of BI solutions, including data warehouses, dashboards, and reporting tools. Collaborate with business stakeholders to understand their data needs and translate them into technical requirements. Design and implement data models, ETL processes, and data integration solutions. Ensure data accuracy, integrity, and security across all BI platforms. Develop and maintain documentation for BI processes, standards, and best practices. Mentor and guide junior team members, fostering a culture of continuous learning and improvement. Stay up-to-date with the latest BI technologies and trends, recommending improvements and innovations. Troubleshoot and resolve BI-related issues, ensuring minimal disruption to business operations.
What You’ll Bring: Bachelor’s degree in Computer Science, Information Systems, or a related field. 5+ years of experience in business intelligence, data analytics, or a related role. Deep knowledge of Looker or Tableau for data visualization and reporting. Proficiency in SnapLogic ETL for data integration and transformation. Experience with Google Cloud Platform (GCP) and BigQuery for cloud-based data solutions. Strong SQL skills and experience with data warehousing concepts. Excellent problem-solving skills and attention to detail. Strong communication and interpersonal skills, with the ability to work effectively with both technical and non-technical stakeholders. Experience with other BI tools such as QlikView is a plus.
About Us: We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.
Championing diversity, equity, and inclusion Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team. How We Look After You We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 21 hours ago

Apply

0 years

0 Lacs

India

Remote


Job Title: Business Analyst Intern Company: Unified Mentor Location: Remote Duration: 3 months Opportunity: Full-time based on performance, with Certificate of Internship Application Deadline: 27th June 2025 About Unified Mentor Unified Mentor empowers aspiring business analysts by providing practical, project-based learning to bridge the gap between academic theory and industry expectations. Responsibilities ✅ Analyze business processes, gather requirements, and identify areas for improvement. ✅ Prepare reports, dashboards, and visualizations to support decision-making. ✅ Assist in creating and documenting business cases, workflows, and use cases. ✅ Collaborate with stakeholders, developers, and teams to understand business needs. ✅ Support in market and competitor analysis as required. Requirements 🎓 Enrolled in or recent graduate of a Business, Analytics, or related program. 📊 Familiar with Excel, PowerPoint, and data visualization tools (Power BI/Tableau preferred). 🧠 Basic understanding of business analysis techniques and tools. 🗣 Excellent communication, analytical, and problem-solving skills. Benefits 💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid) ✔ Practical exposure to real-world business problems and solutions. ✔ Certificate of Internship and Letter of Recommendation. ✔ Opportunity to contribute to live business analysis projects. Equal Opportunity Unified Mentor welcomes applicants from all backgrounds.

Posted 21 hours ago

Apply

5.0 years

0 Lacs

India

On-site


Purpose of the position: The Group Finance Controller manages the accounting, financial reporting, and compliance activities for a group of companies with subsidiaries in various locations. This role supports the company in all accounting activities of both financial and management accounting, preparation of monthly, quarterly, and annual financial and performance reports, assisting departments with annual budgets and reviews, internal & external audits, and tax compliance.
Responsibilities & duties:
Bookkeeping: Assist with data entry, and record and maintain accurate and complete financial records covering all group companies. Ensure accurate and timely closure of accounting activities and the consolidation of financial statements. Address external and internal stakeholders' queries and issues across countries.
Reporting: Prepare financial reports, such as balance sheets and income statements, and schedules, working with reporting and bookkeeping software.
Compliance: Ensure compliance with appropriate regulations, legislation and accounting standards and group accounting policies. Ensure tax accounting and record maintenance is in compliance with statutory regulations and tax laws across countries.
Forecasting: Lead and oversee the annual/quarterly/monthly planning processes, including, but not limited to, budgeting and forecasts of the income statement, balance sheet, cashflow, etc.
Improving controls, processes and practices: Lead the standardization and improvement of accounting processes, policies, procedures, internal controls, and systems to enhance accuracy and efficiency of the accounting department globally. Drive implementation and adoption of new-age technology accounting systems and work with various stakeholders to drive SOPs.
Collaboration: Collaborate across functions to recommend solutions, identify opportunities for improvement, and implement projects to increase productivity and automation across countries.
Additional responsibilities that arise: Take initiative and/or direction on projects and responsibilities that go beyond formal responsibilities and that may arise as circumstances change.
Academic & Formal Qualifications: Bachelor's degree in Accounting, Business Administration or related field required, or equivalent experience. CPA or other finance-related certification demonstrating mastery of accounting, e.g. CMA, CA, MBA. At least 5 years’ experience in managing accounting activities. Exposure to SAP, Exact Online or similar ERP, Excel, Power BI and other data analytics and visualization tools.

Posted 21 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Data Visualization & UI/UX Designer (Power BI & Power Solutions)
About VOIS: VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.
VOIS India: In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more.
Location: Pune / Bangalore
Working Persona: Hybrid
Role Purpose: A talented and experienced Data Visualization and UI/UX Expert to join our dynamic team. In this role, you will play a pivotal role in creating compelling, user-friendly data visualizations and ensuring an exceptional user experience across our digital platforms. As a key member of our team, you will collaborate with various stakeholders to translate complex data into visually engaging and informative designs.
Key Responsibilities:
Data Visualization: Create interactive and visually appealing data visualizations using tools such as Power BI, Power Solutions, or other relevant platforms. Transform complex data sets into easy-to-understand charts, graphs, and dashboards. Ensure data accuracy, consistency, and integrity in visualizations.
UI/UX Design: Design and implement user interfaces for web and mobile applications that prioritize user experience and usability. Conduct user research, usability testing, and gather feedback to iterate on designs. Collaborate with front-end developers to ensure seamless integration of UI/UX designs.
Collaboration: Work closely with cross-functional teams, including data analysts, developers, and product managers, to understand project requirements and objectives. Communicate design concepts and rationale effectively to both technical and non-technical stakeholders.
Continuous Improvement: Stay updated with industry trends and best practices in data visualization and UI/UX design. Propose and implement improvements to existing visualizations and designs.
Qualifications:
• Bachelor's degree in Graphic Design, HCI, Computer Science, or related field (Master's degree preferred).
• Proven experience in data visualization and UI/UX design, with a strong portfolio showcasing your work.
• Proficiency in data visualization tools (e.g., Power BI) and design tools (e.g., Adobe Creative Suite, Sketch, Figma).
• Strong understanding of usability principles, user-centered design, and information architecture.
• Familiarity with HTML, CSS, and JavaScript for UI implementation.
• Excellent communication and collaboration skills. VOIS Equal Opportunity Employer Commitment VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have been also highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 22 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role and responsibilities:
Leadership and Mentorship – Team Leadership: Lead and mentor a team of Data Scientists and Analysts, guiding them in best practices, advanced methodologies, and career development. Project Management: Oversee multiple analytics projects, ensuring they are completed on time, within scope, and deliver impactful results. Innovation and Continuous Learning: Stay at the forefront of industry trends, new technologies, and methodologies, fostering a culture of innovation within the team.
Collaboration with Cross-Functional Teams – Stakeholder Engagement: Work closely with key account managers, data analysts, and other stakeholders to understand their needs and translate them into data-driven solutions. Communication of Insights: Present complex analytical findings clearly and actionably to non-technical stakeholders, helping guide strategic business decisions.
Advanced Data Analysis and Modeling – Develop Predictive Models: Create and validate complex predictive models for risk assessment, portfolio optimization, fraud detection, and market forecasting. Quantitative Research: Conduct in-depth quantitative research to identify trends, patterns, and relationships within large financial datasets. Statistical Analysis: Apply advanced statistical techniques to assess investment performance, asset pricing, and financial risk.
Business Impact and ROI – Performance Metrics: Define and track key performance indicators (KPIs) to measure the effectiveness of analytics solutions and their impact on the firm's financial performance. Cost-Benefit Analysis: Perform cost-benefit analyses to prioritize analytics initiatives that offer the highest return on investment (ROI).
Algorithmic Trading and Automation – Algorithm Development: Develop and refine trading algorithms that automate decision-making processes, leveraging machine learning and AI techniques. Backtesting and Simulation: Conduct rigorous backtesting and simulations of trading strategies to evaluate their performance under different market conditions.
What we're looking for: Advanced Statistical Techniques: Expertise in statistical methods such as regression analysis, time-series forecasting, hypothesis testing, and statistics. Machine Learning and AI: Proficiency in machine learning algorithms and experience with AI techniques, particularly in the context of predictive modeling, anomaly detection, and natural language processing (NLP). Programming Languages: Strong coding skills in languages like Python, commonly used for data analysis, modeling, and automation. Data Management: Experience with big data technologies and relational databases to handle and manipulate large datasets. Data Visualization: Proficiency in creating insightful visualizations that effectively communicate complex data findings to stakeholders. Cloud Computing: Familiarity with cloud platforms like AWS, Azure, or Google Cloud for deploying scalable data solutions. Quantitative Analysis: Deep understanding of quantitative finance, including concepts like pricing models, portfolio theory, and risk metrics. Algorithmic Trading: Experience in developing and backtesting trading algorithms using quantitative models and data-driven strategies.
Requirements: A bachelor's degree in a related field, such as computer science, data science or statistics.
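To ground the backtesting responsibility in something concrete, here is a minimal pandas/NumPy sketch of a moving-average crossover backtest; the price file, window lengths, and the strategy itself are illustrative assumptions, not a recommendation or a detail from the posting.

```python
# Minimal backtest sketch: long when the fast moving average is above the
# slow one, flat otherwise. File name and windows are placeholders.
import numpy as np
import pandas as pd

prices = pd.read_csv("prices.csv", index_col="date", parse_dates=True)["close"]

fast = prices.rolling(20).mean()
slow = prices.rolling(100).mean()
position = (fast > slow).astype(int).shift(1).fillna(0)  # trade on the next bar

daily_returns = prices.pct_change().fillna(0)
strategy_returns = position * daily_returns

cumulative = (1 + strategy_returns).cumprod()
sharpe = np.sqrt(252) * strategy_returns.mean() / strategy_returns.std()
print(f"Cumulative return: {cumulative.iloc[-1]:.2f}x, annualised Sharpe: {sharpe:.2f}")
```

A production backtest would also account for transaction costs, slippage, and out-of-sample validation; this sketch only shows the mechanics.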

Posted 22 hours ago

Apply

6.0 - 8.0 years

0 Lacs

Greater Chennai Area

On-site


About BNP Paribas India Solutions Established in 2005, BNP Paribas India Solutions is a wholly owned subsidiary of BNP Paribas SA, European Union’s leading bank with an international reach. With delivery centers located in Bengaluru, Chennai and Mumbai, we are a 24x7 global delivery center. India Solutions services three business lines: Corporate and Institutional Banking, Investment Solutions and Retail Banking for BNP Paribas across the Group. Driving innovation and growth, we are harnessing the potential of over 10000 employees, to provide support and develop best-in-class solutions. About BNP Paribas Group BNP Paribas is the European Union’s leading bank and key player in international banking. It operates in 65 countries and has nearly 185,000 employees, including more than 145,000 in Europe. The Group has key positions in its three main fields of activity: Commercial, Personal Banking & Services for the Group’s commercial & personal banking and several specialised businesses including BNP Paribas Personal Finance and Arval; Investment & Protection Services for savings, investment, and protection solutions; and Corporate & Institutional Banking, focused on corporate and institutional clients. Based on its strong diversified and integrated model, the Group helps all its clients (individuals, community associations, entrepreneurs, SMEs, corporates and institutional clients) to realize their projects through solutions spanning financing, investment, savings and protection insurance. In Europe, BNP Paribas has four domestic markets: Belgium, France, Italy, and Luxembourg. The Group is rolling out its integrated commercial & personal banking model across several Mediterranean countries, Turkey, and Eastern Europe. As a key player in international banking, the Group has leading platforms and business lines in Europe, a strong presence in the Americas as well as a solid and fast-growing business in Asia-Pacific. BNP Paribas has implemented a Corporate Social Responsibility approach in all its activities, enabling it to contribute to the construction of a sustainable future, while ensuring the Group's performance and stability Commitment to Diversity and Inclusion At BNP Paribas, we passionately embrace diversity and are committed to fostering an inclusive workplace where all employees are valued, respected and can bring their authentic selves to work. We prohibit Discrimination and Harassment of any kind and our policies promote equal employment opportunity for all employees and applicants, irrespective of, but not limited to their gender, gender identity, sex, sexual orientation, ethnicity, race, colour, national origin, age, religion, social status, mental or physical disabilities, veteran status etc. As a global Bank, we truly believe that inclusion and diversity of our teams is key to our success in serving our clients and the communities we operate in. About Business Line/Function The Intermediate Holding Company (“IHC”) program structured at the U.S. level across poles of activities of BNP Paribas provides guidance, supports the analysis, impact assessment and drives adjustments of the U.S. platform’s operating model due to the drastic changes introduced by the Enhanced Prudential Standards (“EPS”) for Foreign Banking Organizations (“FBOs”) finalized by the Federal Reserve in February 2014, implementing Section 165 of U.S. Dodd-Frank Act. 
The IT Transversal Team is part of the Information Technology Group, which works simultaneously on a wide range of projects arising from business, strategic initiatives, and regulatory changes, as well as reengineering of existing applications to improve functionality and efficiency. Job Title: Python Developer. Date: June-25. Department: ITG - Fresh. Location: Chennai, Mumbai. Business Line / Function: Finance Dedicated Solutions. Number of Direct Reports: NA. Directorship / Registration: NA. Position Purpose: The Python Developer will play a critical role in building and maintaining financial applications and tools that support data processing, analysis, and reporting within a fast-paced financial services environment. This position involves developing scalable and secure systems. The developer will collaborate with business analysts and finance users or finance BAs to translate complex business requirements into efficient, high-quality software solutions. A strong understanding of financial concepts, data integrity, and regulatory compliance is essential. The detailed responsibilities are mentioned below. Responsibilities - Direct Responsibilities: Proficient in object-oriented programming, especially Python, with a minimum of 6-8 years of core Python development experience. Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation. Expertise in PySpark for large-scale data processing and loading into databases. Proficiency in data querying and manipulation with Oracle and PostgreSQL. Strong communication skills to effectively collaborate with team members and stakeholders. Familiarity with the Software Development Life Cycle (SDLC) process and its various stages, including experience with JIRA and Confluence. Technical & Behavioral Competencies: Proficient in object-oriented programming, especially Python, with a minimum of 6-8 years of core Python development experience. Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation. Expertise in PySpark for large-scale data processing and loading into databases. Proficiency in data querying and manipulation with Oracle and PostgreSQL. Strong communication skills to effectively collaborate with team members and stakeholders. Familiarity with the Software Development Life Cycle (SDLC) process and its various stages, including experience with JIRA and Confluence. Good analytical, problem-solving, and communication skills. Engage in technical discussions to help improve systems and processes. Nice to Have: Familiarity with Plotly and Matplotlib for data visualization of large datasets. Skilled in API programming, handling JSON, CSV, and other unstructured data from various systems. Familiarity with JavaScript, CSS, and HTML. Experience with cloud architecture applications such as Dataiku or Databricks; competency with ETL tools. Knowledge of regulatory frameworks, RISK, CCAR, and GDPR. Skills Referential - Behavioural Skills: Ability to collaborate / Teamwork; Critical thinking; Ability to deliver / Results driven; Communication skills - oral & written. Transversal Skills: Analytical ability; Ability to develop and adapt a process; Ability to understand, explain and support change; Ability to develop others and improve their skills.
Education Level: Bachelor Degree or equivalent. Experience Level: At least 5 years.
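A minimal sketch of the PySpark-to-database workflow the Python Developer role above calls for: read an extract, apply basic wrangling, and load the result into PostgreSQL over JDBC. The input path, column names, target table, and connection details are hypothetical placeholders; the real schemas and credentials would come from the project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-etl-sketch").getOrCreate()

# Read a hypothetical extract; schema inference is fine for a sketch,
# an explicit schema would be preferred in production.
trades = spark.read.csv("/data/in/trades.csv", header=True, inferSchema=True)

# Basic wrangling: drop incomplete rows and aggregate notional per book and day.
daily_notional = (
    trades
    .filter(F.col("notional").isNotNull())
    .withColumn("trade_date", F.to_date("trade_timestamp"))
    .groupBy("book_id", "trade_date")
    .agg(F.sum("notional").alias("total_notional"),
         F.count("*").alias("trade_count"))
)

# Load the result into PostgreSQL over JDBC (placeholder URL and credentials).
(daily_notional.write
    .format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/finance")
    .option("dbtable", "reporting.daily_notional")
    .option("user", "etl_user")
    .option("password", "***")
    .option("driver", "org.postgresql.Driver")
    .mode("append")
    .save())

spark.stop()
```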

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Exp: 15 to 23 yrs. Location: Chennai / Bangalore. Primary skills: GEN AI Architect, building GEN AI solutions, coding, AI/ML background, data engineering, Azure or AWS cloud. Job Description: The Generative Solutions Architect will be responsible for designing and implementing cutting-edge generative AI models and systems. He/She will collaborate with data scientists, engineers, product managers, and other stakeholders to develop innovative AI solutions for various applications, including natural language processing (NLP), computer vision, and multimodal learning. This role requires a deep understanding of AI/ML theory, architecture design, and hands-on expertise with the latest generative models. Key Responsibilities: GenAI Application Conceptualization and Design - Understand the use cases under consideration, conceptualize the application flow, understand the constraints, and design accordingly to get the most optimized results. Deep knowledge of developing and implementing applications using Retrieval-Augmented Generation (RAG)-based models, which combine the power of large language models (LLMs) with information retrieval techniques. Prompt Engineering - Be adept at prompt engineering and its various nuances, such as one-shot, few-shot, and chain-of-thought prompting; have hands-on knowledge of implementing agentic workflows and be aware of agentic AI concepts. NLP and Language Model Integration - Apply advanced NLP techniques to preprocess, analyze, and extract meaningful information from large textual datasets. Integrate and leverage large language models such as LLaMA2/3, Mistral, or similar offline LLMs to address project-specific goals. Small LLMs / Tiny LLMs - Familiarity with SLMs / Tiny LLMs like phi3, OpenELM, etc., their performance characteristics and usage requirements, and the nuances of how they can be consumed by use-case applications. Collaboration with Interdisciplinary Teams - Collaborate with cross-functional teams, including linguists, developers, and subject matter experts, to ensure seamless integration of language models into the project workflow. Text / Code Generation and Creative Applications - Explore creative applications of large language models, including text/code generation, summarization, and context-aware responses. Skills & Tools: Programming Languages - Proficiency in Python for data analysis, statistical modeling, and machine learning. Machine Learning Libraries - Hands-on experience with machine learning libraries such as scikit-learn, Huggingface, TensorFlow, and PyTorch. Statistical Analysis - Strong understanding of statistical techniques and their application in data analysis. Data Manipulation and Analysis - Expertise in data manipulation and analysis using Pandas and NumPy. Database Technologies - Familiarity with vector databases like ChromaDB and Pinecone, as well as SQL and NoSQL databases, and experience working with relational and non-relational databases. Data Visualization Tools - Proficient in data visualization tools such as Tableau, Matplotlib, or Seaborn. Familiarity with cloud platforms (AWS, Google Cloud, Azure) for model deployment and scaling. Communication Skills - Excellent communication skills with the ability to convey technical concepts to non-technical audiences.
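To illustrate the RAG application flow described above, here is a minimal, hedged Python sketch. Retrieval is done with plain TF-IDF cosine similarity as a stand-in for a vector database such as ChromaDB or Pinecone, the example documents and query are invented, and the call to an offline LLM (LLaMA, Mistral, or similar) is left as a clearly marked placeholder.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical knowledge-base snippets; in a real system these would be
# chunked documents stored in a vector database (e.g. ChromaDB, Pinecone).
documents = [
    "The brewery line reports OEE metrics every five minutes.",
    "Invoices older than 90 days are escalated to the finance team.",
    "The VPN certificate must be renewed annually through the IT portal.",
]

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents most similar to the query (TF-IDF cosine)."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(docs + [query])
    sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    ranked = sims.argsort()[::-1][:top_k]
    return [docs[i] for i in ranked]

def build_prompt(query: str, context: list[str]) -> str:
    """Ground the generation step in the retrieved context."""
    joined = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{joined}\n\nQuestion: {query}\nAnswer:"
    )

query = "How often are OEE metrics reported?"
prompt = build_prompt(query, retrieve(query, documents))
print(prompt)
# response = offline_llm.generate(prompt)  # placeholder: plug in LLaMA/Mistral here
```

The design choice to keep retrieval and prompt construction separate mirrors how the retrieval backend can later be swapped for embeddings plus a vector store without touching the generation logic.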

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Note: Please do not contact the mobile number displayed on our company website. You can reach us through the above-mentioned email id or through this contact number 9790654970. Location: Coimbatore, Tamil Nadu, India. Company: Cogent Automation Private Limited. About Us: We are a leading innovator in automation solutions, dedicated to delivering cutting-edge technology to our clients across various industries. Job Summary: We are seeking a highly skilled and experienced Electrical PLC Control Engineer for Special Purpose Machines (SPM) to design, develop, program, and commission automated control systems for our industrial processes. The ideal candidate will have a strong background in electrical engineering, programmable logic controllers (PLCs), human-machine interfaces (HMIs), and industrial networking, with a keen eye for detail and a commitment to delivering high-quality, reliable solutions. Key Responsibilities: Design & Development: Interpret electrical schematics, P&IDs, and functional specifications to design and develop PLC-based control systems. Select and specify appropriate electrical components, sensors, actuators, and control hardware. Develop detailed electrical drawings, wiring diagrams, and panel layouts using CAD software (e.g., AutoCAD Electrical, EPLAN). PLC Programming: Develop, test, and debug PLC programs from scratch or modify existing code using various PLC platforms (e.g., Siemens TIA Portal, Rockwell Studio 5000/Logix Designer, Mitsubishi GX Works, Omron CX-One). Implement control logic, algorithms, and safety interlocks in accordance with project requirements and industry standards. Optimize PLC programs for efficiency, reliability, and maintainability. HMI/SCADA Development: Design and develop user-friendly HMI/SCADA screens for process visualization, data acquisition, and operator control. Configure alarms, trends, recipes, and reporting functionalities within HMI/SCADA applications. Commissioning & Testing: Perform factory acceptance testing (FAT) and site acceptance testing (SAT) of control systems. Conduct I/O checks, troubleshoot electrical and programming issues, and optimize system performance during commissioning. Collaborate with cross-functional teams, including mechanical engineers, process engineers, and operators, to ensure successful system integration. Troubleshooting & Support: Provide technical support and troubleshooting for existing control systems, identifying and resolving electrical and programming faults. Participate in on-call rotations or provide after-hours support as needed. Documentation: Create and maintain comprehensive technical documentation, including electrical schematics, program comments, user manuals, and project reports. Ensure all documentation is accurate, up-to-date, and meets project standards. Continuous Improvement: Stay updated with the latest industry trends, technologies, and best practices in industrial automation. Identify opportunities for process improvement, cost reduction, and enhanced system performance. Adhere to all relevant safety regulations and company policies. Qualifications: Bachelor's degree in Electrical Engineering, Automation Engineering, Mechatronics, or a related field. 5+ years of proven experience as an Electrical PLC Control Engineer in an industrial automation environment. Strong proficiency in programming and configuring PLCs from at least one major vendor (e.g., Siemens, Rockwell Automation, Mitsubishi, Omron). Hands-on experience with HMI/SCADA development.
Solid understanding of industrial communication protocols (e.g., Ethernet/IP, Profinet, Modbus TCP/RTU, OPC UA). Familiarity with electrical design software (e.g., AutoCAD Electrical, EPLAN). Knowledge of industrial safety standards (e.g., IEC 61508, IEC 62061, ISO 13849) is a plus. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication (written and verbal) and interpersonal skills. Ability to work independently and as part of a team. Willingness to travel to client sites for commissioning and support. Preferred Skills (Nice to Have): Experience with robotic systems and integration. Knowledge of databases and data acquisition systems. Familiarity with functional safety programming. Certification in relevant PLC platforms. Benefits: Competitive salary, health insurance, paid time off, retirement plan, professional development opportunities, EL, CL, subsidized food. To Apply: Interested candidates are invited to submit their resume to hr@cogentautomation.in.
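The control-logic and safety-interlock responsibilities above would normally be written in vendor PLC languages (ladder logic or structured text). Purely as an illustration of the interlock pattern, here is a hedged, language-agnostic sketch in Python; the input names and the single simulated scan are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MachineInputs:
    """Hypothetical digital inputs read from the SPM's I/O."""
    emergency_stop: bool   # True when the E-stop is pressed
    guard_closed: bool     # True when the safety guard is closed
    start_button: bool     # momentary start command
    stop_button: bool      # momentary stop command

def evaluate_interlock(inputs: MachineInputs, motor_running: bool) -> bool:
    """Return the new motor state after applying start/stop and safety interlocks.

    Mirrors typical PLC scan behaviour: safety conditions always override,
    stop has priority over start, and the start command latches.
    """
    safety_ok = (not inputs.emergency_stop) and inputs.guard_closed
    if not safety_ok or inputs.stop_button:
        return False               # trip immediately on any unsafe condition
    if inputs.start_button:
        return True                # latch the run command
    return motor_running           # otherwise hold the previous state

# One simulated scan: opening the guard while running must stop the motor.
state = True
state = evaluate_interlock(MachineInputs(False, False, False, False), state)
assert state is False
```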

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About INDmoney: INDmoney is a fast-growing super finance app revolutionizing the way users manage their investments and finances. As we scale our digital lending vertical, we seek dynamic professionals with a strong background in ML and data science to create, enhance, and monitor our risk models and ensure robust underwriting decisions to scale the business. Job Summary: We are looking for a Data Scientist with a hybrid background in data science, credit risk model development, machine learning, and data analysis. The ideal candidate will build intelligent, data-driven credit risk models with predictive modeling expertise using machine learning and AI, and will play a key role in refining credit strategies, reducing defaults, and driving healthy loan book growth. Key Responsibilities: Develop, validate, and implement credit risk models using credit bureau data, demographic indicators, transactional behavior, and alternative data sources. Leverage machine learning, mathematical, and statistical techniques to predict borrower behavior, default probability, and fraud risk. Design and refine scorecards and decision engines to automate and scale credit decisions. Monitor loan portfolio performance, identify risk trends, and recommend optimization strategies. Detect and mitigate fraud by spotting suspicious patterns using advanced analytics. Collaborate with Product and Tech teams to ensure seamless integration of risk models into the lending journey. Stay updated on RBI digital lending guidelines and ensure regulatory compliance across risk models. Qualifications & Skills: Bachelor’s/Master’s degree in Finance, Economics, Data Science, Statistics, or a related field. 3–5 years in Data Science and Credit Risk Analysis. Must have: Experience in using data science to build lending credit risk models. Hands-on experience building predictive models for credit risk using tools like Python, R, or similar. Skills: Machine Learning · Data Science · Data Analytics · Python (Programming Language) · Data Visualization · SQL · Statistical Modeling. Knowledge of credit bureau reports, financial statement analysis, and alternative credit scoring techniques. Experience with fraud detection models and risk-based pricing strategies is a plus. Strong problem-solving skills and a keen eye for data patterns and business impact.
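As a hedged illustration of the scorecard and default-probability modeling described above (not INDmoney's actual models), here is a minimal Python sketch: a logistic regression trained on synthetic borrower features, with predicted probabilities mapped to a score using the conventional points-to-double-the-odds scaling. The feature names and the scaling constants (base score 600 at 50:1 odds, 20 points to double the odds) are hypothetical choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic borrower data: bureau score, utilisation, income (placeholders).
rng = np.random.default_rng(7)
n = 5000
X = np.column_stack([
    rng.normal(700, 60, n),      # bureau_score
    rng.uniform(0, 1, n),        # credit_utilisation
    rng.lognormal(10, 0.5, n),   # monthly_income
])
logit = -0.01 * (X[:, 0] - 700) + 2.5 * X[:, 1] - 0.00002 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))       # 1 = default

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

pd_hat = model.predict_proba(X_te)[:, 1]            # probability of default
print("AUC:", round(roc_auc_score(y_te, pd_hat), 3))

# Map PD to a scorecard score: base 600 at 50:1 good:bad odds, PDO = 20.
pdo, base_score, base_odds = 20, 600, 50
factor = pdo / np.log(2)
offset = base_score - factor * np.log(base_odds)
odds_good = (1 - pd_hat) / pd_hat
scores = offset + factor * np.log(odds_good)
print("Example scores:", np.round(scores[:5]).astype(int))
```

A production version would add bureau and alternative-data features, weight-of-evidence binning, population stability monitoring, and validation against portfolio performance.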

Posted 22 hours ago

Apply

20.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Latinum: Latinum is seeking a seasoned and visionary Transformation Leader – SCM for one of its esteemed clients. This role is ideal for a dynamic leader looking to drive large-scale transformation initiatives within the Supply Chain Management domain. It requires deep SCM expertise, strategic leadership, and the ability to conceptualize and deliver end-to-end digital transformation solutions for global clients. About the Role: This role involves driving and owning end-to-end transformation programs across the SCM domain, collaborating with various teams, and leveraging deep SCM expertise to identify transformation opportunities. Responsibilities: Drive and own end-to-end transformation programs across the SCM domain. Collaborate with Delivery and Account teams to ensure compliance with productivity commitments and contractual SLAs. Leverage deep SCM domain expertise to identify transformation opportunities and design roadmap strategies tailored to client needs. Conceptualize and deploy digital assets and analytics solutions to enhance supply chain performance. Lead the design, change management, and implementation of digital initiatives aligned with business goals. Review and refine transformation proposals using in-depth SCM knowledge. Act as a strategic partner to clients, engaging with executive leadership (CIO, CPO, CSCO) to align business needs and transformation objectives. Guide project/program leaders and ensure the quality and efficiency of output. Identify risks, track project health, and implement effective governance and value realization frameworks. Collaborate with Delivery Excellence, Innovation, and Support teams to drive operational efficiency. Be a trusted advisor and thought leader in SCM transformations and process innovation. Qualifications: Education: Bachelor’s/Master’s degree in Supply Chain, Operations Management, or Digital Transformation. Experience: 15–20 years of experience in SCM outsourcing with at least 9–10 years in transformation leadership. Hands-on experience in Plan, Buy, Make, Deliver & Enable SCM areas with strong domain depth in at least one. Proven record of managing large-scale transformation projects from conceptualization to realization. Strong track record of client relationship management at the executive level. Experience working in matrix organizations and across cross-functional/global teams. Required Skills: Strong project management, analytical, and problem-solving skills. Exceptional communication and stakeholder management abilities. Self-starter with an eye for detail and the ability to work independently. Exposure to and passion for operating model design, process reengineering, digital technologies, intelligent automation, and AI/ML. Proficiency in Microsoft Office tools – Excel, PowerPoint, Word. Ability to influence executive leadership and lead transformation change programs across the organization. Deep understanding of value chain mapping, data & analytics, and customer experience design. Preferred Skills: Working knowledge of ERP systems like SAP, Oracle, JDE. Experience or certifications in: Lean Six Sigma (Black Belt / Master Black Belt); Agile / Scrum methodologies; Project Management (PMP, Prince2); RPA / AI / ML solutions; Process Mining tools (e.g., Celonis) and visualization tools like Power BI / Tableau. Exposure to benchmarking, analytics, consulting, and cross-functional transformation projects. What We Offer: An opportunity to lead high-impact SCM transformation programs globally.
A collaborative, innovation-driven work culture. Visibility and engagement with CXO-level stakeholders. Competitive compensation and a comprehensive benefits package.

Posted 22 hours ago

Apply

1.0 - 2.0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site


Company Description Gencosys Technologies Pvt. Ltd. is a premier provider of comprehensive Information Technology solutions, catering to top business segments across various geographies. The company has a strong presence and customer base in regions such as South Asia, the Middle East, Africa, Asia Pacific, the Kingdom of Saudi Arabia, and North America. Gencosys Technologies is dedicated to supporting its clients in overcoming IT challenges and achieving their business goals. Role Description This is a full-time on-site role for a PPC Expert, located in Lucknow. The PPC Expert will be responsible for managing and optimizing pay-per-click advertising campaigns across multiple platforms such as Google Ads, Bing Ads, and social media channels. Day-to-day tasks will include keyword research, campaign creation, performance analysis, bidding strategies, and budget management. The PPC Expert will also be responsible for creating reports, conducting A/B testing of ads and landing pages, and staying updated with industry trends to ensure the success of PPC campaigns. Qualifications Proficiency in Google Ads, Meta Ads, and social media advertising platforms Strong analytical skills and experience with tools like Google Analytics, Excel, and data visualization tools 1-2 Years of Experience with keyword research, campaign setup, bid management, and performance analysis Knowledge of A/B testing methodologies for ads and landing pages Excellent written and verbal communication skills Ability to work independently and collaboratively in a team environment Experience with budget management and performance optimization Bachelor's degree in Marketing, Business, or a related field
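The A/B testing requirement in the listing above can be illustrated with a small, hedged Python sketch: a two-proportion z-test comparing the click-through (or conversion) rates of two ad variants. The click and impression counts are hypothetical.

```python
from math import sqrt
from scipy.stats import norm

# Hypothetical campaign results: clicks and impressions per ad variant.
conv_a, n_a = 230, 11500   # variant A
conv_b, n_b = 290, 11800   # variant B

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)

# Two-proportion z-test under the pooled null hypothesis p_a == p_b.
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))   # two-sided

print(f"CTR A: {p_a:.3%}  CTR B: {p_b:.3%}")
print(f"z = {z:.2f}, p-value = {p_value:.4f}")
print("Significant at 5%" if p_value < 0.05 else "Not significant at 5%")
```

In day-to-day campaign work this check would be complemented by minimum-sample-size planning before the test and by guarding against peeking at results mid-flight.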

Posted 22 hours ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About the Company They balance innovation with an open, friendly culture and the backing of a long-established parent company, known for its ethical reputation. We guide customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society. About the Client Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting a 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations. Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines. Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering, reflecting its strategic commitment to driving innovation and value for clients across industries. Job Title: Java Developer. Key Skills: AWS EKS and/or Lambda, Java, Kafka, Kotlin, etc. Job Locations: PAN India. Experience: 6+ Years. Education Qualification: Any Graduation. Work Mode: Hybrid. Employment Type: Contractual. Notice Period: Immediate - 10 Days. Job Description: Java Developer. Key Responsibilities: Develop and maintain Domain APIs on AWS EKS and/or Lambda. Create high- and low-level designs and implement capabilities using microservices and Domain-Driven Design principles. Troubleshoot technical issues with in-depth knowledge of technology and functional aspects. Understand and build to non-functional requirements such as authorization, access, performance, etc. Assist in tracking and showcasing performance metrics and help optimize performance. Provide innovative solutions for API versioning strategies. Assist in the creation of automated test suites which are interoperable across APIs. Provide technical leadership to a team of developers, ensuring adherence to best practices, security standards, and scalability. Work closely with BA, PO, SM, and other stakeholders to understand the requirements and ensure successful delivery of product features on time. Participate in defect triage and analysis. Support go-live activities. Required Skills & Qualifications: 6+ years of experience in developing Domain APIs using any distributed programming languages like Java, Kotlin, etc. Must have a good understanding of SOAP- and REST-based integration patterns. Knowledge of JSON and XML data structures. Knowledge of SQL and NoSQL databases like MongoDB, DynamoDB, S3, PostgreSQL, etc. Knowledge of various messaging services like AWS SQS, AWS SNS, RabbitMQ, Kafka, etc.
Knowledge of AWS, Lambda, and Terraform. Experience creating logs, alerts, and dashboards for visualization and troubleshooting. Excellent problem-solving and troubleshooting skills. Ability to lead technical teams and mentor junior developers. Communication skills, both verbal and written, and the ability to interact with stakeholders. Knowledge of API Management best practices and experience with API Gateways. Understanding of security standards, including OAuth, SSO, and encryption for integration and API security. Work closely with business stakeholders to understand their needs and requirements and translate them into technical solutions. Participate in end-of-iteration demos to showcase the key deliverables to IT and business stakeholders. Tech Stack: AWS EKS and Lambda. Programming Language: Java, Kotlin. Framework: Spring Boot. DB: MongoDB, DynamoDB, Redis Cache, PostgreSQL, S3. Messaging Interface: AWS SQS, AWS SNS, Kafka, RabbitMQ. Terraform for infrastructure provisioning. Knowledge of Open API Specs (OAS) and ACORD NGDS is an added advantage.

Posted 22 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies