4.0 - 8.0 years
0 Lacs
vadodara, gujarat
On-site
You are a highly skilled SQL Developer with over 4 years of experience in SQL Database and database technologies. Your expertise includes proficiency in Extract Transform Load (ETL) processes and writing complex T-SQL queries, stored procedures, views, and functions to support application features and data transformations. You will be responsible for working on data migration projects, including data mapping, cleansing, transformation, and validation from legacy systems to modern platforms. Hands-on experience with MS Excel is a must for this role.

In this role, you will play a key part in designing, developing, and optimizing enterprise-grade applications that handle large volumes of structured data. You will integrate these applications with multiple systems and support complex data transformation logic. Collaboration with analysts and stakeholders to understand data requirements and deliver accurate, high-performance solutions is a crucial aspect of this position. You will also be responsible for optimizing existing queries and processes for performance and scalability. Additionally, you will perform unit testing and assist in QA for data accuracy and system reliability.

Your skills in Data Integration and ensuring data quality and consistency will be essential for the success of data processing and support of the overall data architecture. If you have familiarity with Accounting applications, it would be considered a plus. Excellent problem-solving and analytical skills, as well as the ability to work independently, are key attributes for this role.
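As a loose illustration of the migration validation work this posting describes (data mapping, cleansing, and validation from legacy systems), the sketch below compares a legacy table with its migrated target using Python's pandas and pyodbc. The connection string, table names, and key column are placeholders rather than details from the role.

```python
import pandas as pd
import pyodbc  # assumes an ODBC driver for SQL Server is installed

# Placeholder connection string -- replace with real server/database credentials.
CONN_STR = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=legacy-host;DATABASE=erp;Trusted_Connection=yes;"

def validate_migration(source_table: str, target_table: str, key_column: str) -> pd.DataFrame:
    """Compare row counts, distinct keys, and null keys between source and target tables."""
    with pyodbc.connect(CONN_STR) as conn:
        src = pd.read_sql(f"SELECT {key_column} FROM {source_table}", conn)
        tgt = pd.read_sql(f"SELECT {key_column} FROM {target_table}", conn)

    report = pd.DataFrame({
        "metric": ["row_count", "distinct_keys", "null_keys"],
        "source": [len(src), src[key_column].nunique(), int(src[key_column].isna().sum())],
        "target": [len(tgt), tgt[key_column].nunique(), int(tgt[key_column].isna().sum())],
    })
    report["match"] = report["source"] == report["target"]
    return report

if __name__ == "__main__":
    # Hypothetical table and column names for illustration only.
    print(validate_migration("dbo.Customers_Legacy", "dbo.Customers", "CustomerID"))
```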
Posted 2 days ago
8.0 - 15.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Data Engineering Manager at Amgen, you will play a pivotal role in leading the end-to-end data strategy and execution for regulatory product submissions, lifecycle management, and compliance reporting within the Biotech or Pharmaceutical domain. Your primary responsibilities will revolve around ensuring the timely and accurate delivery of regulatory data assets across global markets by collaborating with cross-functional Regulatory Integrated Product Teams (IPT).

Your key responsibilities will include:
- Leading the engineering strategy for regulatory operations, encompassing data ingestion, transformation, integration, and delivery across regulatory systems.
- Serving as the data engineering Subject Matter Expert (SME) within the Integrated Product Team to facilitate regulatory submissions, agency interactions, and lifecycle updates.
- Collaborating with various departments such as global regulatory affairs, clinical, CMC, quality, safety, and IT teams to translate submission data requirements into data engineering solutions.
- Overseeing the development of data pipelines, models, and metadata frameworks that adhere to submission data standards.
- Enabling integration and reporting across regulatory information management systems and other relevant platforms.
- Implementing data governance, lineage, validation, and audit trails to ensure regulatory compliance.
- Guiding the development of automation solutions, dashboards, and analytics to enhance visibility into submission timelines and regulatory KPIs.
- Ensuring interoperability between regulatory data platforms and enterprise data lakes for cross-functional reporting and insights.
- Driving innovation by evaluating emerging technologies in data engineering and AI for regulatory intelligence.
- Leading and mentoring a team of data engineers and analysts to foster a culture of excellence and innovation.
- Implementing Agile methodologies to enhance team velocity and project delivery.

The ideal candidate for this role should possess:
- 12+ years of experience in data engineering, with at least 3 years in a managerial capacity, preferably within the biotech or pharmaceutical industry.
- Proven experience in supporting regulatory functions and familiarity with ETL/ELT tools and cloud-based data platforms.
- Deep understanding of regulatory standards, data compliance, and submission processes.
- Strong project management, communication, and leadership skills.
- Ability to translate technical capabilities into business outcomes and effectively work in cross-functional environments.

While not mandatory, prior experience in integrated product teams or regulatory transformation programs, knowledge of Regulatory Information Management Systems, and familiarity with Agile methodologies are considered advantageous. In addition to technical expertise, soft skills such as analytical thinking, communication, teamwork, and self-motivation are highly valued in this role. A degree in Computer Science or related field, along with relevant certifications, is preferred. Amgen is an equal opportunity employer committed to diversity and inclusion in the workplace.
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
The Opportunity:
You have the exciting opportunity to join Hitachi Energy in the TTL Global Processes Continuous Improvement & Systems Team as a skilled and experienced SAP TM (Transportation Management Module) & S4HANA Business Partner Data Management Expert. This role goes beyond just a job, offering you a chance to be part of a vibrant and collaborative environment where your contributions are highly valued. You will be an integral part of a well-structured team comprising 3 Continuous Improvement Experts and two Team Leads, with a total team size of 40+ individuals. Strong collaboration and communication skills are vital in this role to facilitate exchanges with cross-business stakeholders, comprehend business requirements, and suggest optimal solutions. It is essential that you work from the office at least one day a week to ensure effective collaboration and communication within the team. If you are passionate about master data and SAP TM and seeking career growth in a supportive and innovative company, we are eager to hear from you.

How You'll Make an Impact:
- Perform regular Master Data SAP TM updates globally, regionally, and locally in alignment with the RFQ Calendar
- Conduct cleansing of the TTL Business Partners Master Data in S4HANA
- Manage TTL Business Partners Master Data in S4HANA
- Offer functional expertise in SAP TM Master Data Rate Management to support Operational Teams and drive increased SAP TM usage in CoEs across different regions
- Develop and implement the Strategy for the TTL Master Data
- Contribute to Continuous Improvement Initiatives focusing on enhancing the correctness & completeness of the TTL Master Data
- Collaborate closely with the BU Master Data Teams to improve the accuracy of input for the Shipment Planning & Execution Teams
- Support SAP TM implementation in various countries/regions

Your Background:
- Hold a University degree in Business Administration or Supply Chain
- Possess a minimum of 2 years of experience in SAP TM Master Data Management or any other Transportation Management System
- Have expertise in SAP TM Module in Shipment Planning & Execution and S4HANA Business Partners Creation/Change
- Demonstrate strong functional knowledge of SAP P2P and Q2C modules
- Proficient in data analysis, data cleansing, and data transformation techniques
- Showcase high capability in stakeholder & change management

More About Us:
Hitachi Energy is a global technology leader dedicated to advancing a sustainable energy future for all. We cater to customers in the utility, industry, and infrastructure sectors by providing innovative solutions and services across the value chain. Together with customers and partners, we drive technological advancements and enable the digital transformation necessary to expedite the energy transition towards a carbon-neutral future. With over 40,000 employees in 90 countries, we work purposefully each day, leveraging our diverse backgrounds to challenge the status quo. We invite you to apply today and become part of a global team that values the equation: Diversity + Collaboration = Great Innovation. We take pride in offering a comprehensive range of competitive benefits to support your financial, physical, and mental well-being, as well as personal development. Our aim is for you to thrive both at work and beyond. For more information, we are happy to provide additional details during the recruitment process.
If you are a qualified individual with a disability and need a reasonable accommodation to access the Hitachi Energy career site, please complete a general inquiry form on our website. Be sure to include your contact information and specific details about the required accommodation to assist you during the job application process. Please note that this accommodation request process is exclusively for job seekers with disabilities needing accessibility assistance during the application process. Inquiries for other purposes will not receive a response.,
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
coimbatore, tamil nadu
On-site
The ideal candidate for this role should possess in-depth functional knowledge of the process area and be able to apply it to operational scenarios in order to provide effective solutions. It is essential to have the ability to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. Being open-minded towards inputs and views from team members is crucial, along with the capability to effectively lead, control, and motivate groups towards company objectives. The candidate should be self-directed, proactive, and seize every opportunity to meet internal and external customer needs, ensuring customer satisfaction through auditing processes, implementing best practices and process improvements, and utilizing available frameworks and tools. Clear and concise articulation of goals and thoughts is necessary when communicating with clients, colleagues, subordinates, and supervisors, both verbally and in writing.

As an Associate Process Manager, the responsibilities include executing and managing large-scale data transformation projects while ensuring adherence to timelines and objectives. Overseeing the organization, validation, and maintenance of product data across multiple platforms is also a key part of the role. Collaboration with cross-functional teams to streamline data workflows and enhance overall data quality is essential. Ensuring the delivery of Client Key Performance Indicators (KPIs), such as day-to-day service levels, customer experience, quality measures, and compliance measures, is a priority. Generating innovative ideas to increase productivity and quality, as well as setting and reviewing organizational/productivity objectives in line with commercial contracts, are additional responsibilities. Furthermore, the candidate will audit processes, implement best practices and process improvements, and utilize available frameworks and tools to drive efficiency and effectiveness.

Qualifications and Education Requirements
- Basic: Bachelor's in Mechanical/Electrical/Electronics with 8+ years of experience
- Preferred: Experience in handling Master Data Management, Product data management/enrichment projects in any of the following industry domains - MRO, Core Electrical & Electronics, Plumbing, HVAC, Power Tools, Consumer Electronics, Office Stationery, and PET products

Essential Skills
- Excellent communication skills - verbal, written, etc.
- Ability to work independently with minimal supervision as a self-starter
- Strong skills in reporting and presentation of key business information
- Experienced in managing mid-size (10+) multi-site/multi-skilled teams
- Strong organizational skills
- Excellent analytical thinking and problem solving
Posted 2 days ago
4.0 - 8.0 years
0 - 0 Lacs
pune, maharashtra
On-site
The Keyrus team is looking for a Data Engineer to contribute to a multi-year global Data program for a strategic Banking client. The target start date for this role is September 2025. You will be working in a hybrid model, with 3 days spent at either the Pune or Bangalore Office in India. The salary range for this position is between 25,00,000 and 35,00,000 INR.

As a Data Engineer, your responsibilities will include designing, developing, and maintaining scalable ETL data pipelines using Python, SQL, and Databricks. You will be tasked with implementing ETL processes to extract, transform, and load data into the Snowflake data warehouse. Building modular data workflows connecting various data sources, loading them into data frames, and passing them into downstream Python ML models will be a key aspect of your role. Additionally, you will utilize Databricks Notebooks or Snowpark for developing, deploying, and orchestrating lightweight ML model scoring or transformations. Data cleansing and transformation within Python Notebooks before model execution will also fall under your purview. Managing source code using Git for reproducibility and collaboration is crucial. You will create and optimize data models, tables, and database structures within Snowflake, as well as implement database solutions to support analytical and reporting requirements. Optimizing data pipelines and SQL queries for performance, scalability, and efficiency will be part of your routine. Collaboration with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions is essential. Adhering to data governance, security best practices, and version control will also be expected from you.

We are seeking applicants with a minimum of 4 years of professional experience in Data Engineering or ETL development. Strong experience in building ETL pipelines on Snowflake and Databricks is necessary. Experience integrating Python-based ML models using Databricks or Snowpark notebooks is highly desirable. Hands-on skills in data cleansing, transformation, and lightweight model scoring using Python are important. Proficiency with Git for version control and collaboration is required. Working experience with Snowflake features like Streams, Tasks, DAG, Data Share, UDF, and Time Travel will be an advantage.

Joining Keyrus means becoming part of a market leader in the Data Intelligence field and a prominent player in Management Consultancy and Digital Experience on an international scale. You will be a part of a dynamic and ever-learning enterprise with an established international network of thought-leading professionals committed to bridging the gap between innovation and business. Keyrus offers you the chance to meet specialized and professional consultants in a multicultural ecosystem. You will have the opportunity to showcase your talents and potential, gain experience through client interactions, and grow based on your capabilities and affinities, all within a great working and dynamic atmosphere. Keyrus UK Benefits include a competitive holiday allowance, a comprehensive Private Medical Plan, flexible working patterns, a Workplace Pension Scheme, Sodexo Lifestyle Benefits, a Discretionary Bonus Scheme, Referral Bonus Scheme, and Training & Development opportunities via KLX (Keyrus Learning Experience).
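As a rough sketch of the workflow described above (extracting Snowflake data into a DataFrame, cleansing it, and handing it to a downstream scoring step), the Python below uses the snowflake-connector-python package. The account details, schema, table, and the placeholder scoring function are illustrative assumptions, not details of the client program.

```python
import pandas as pd
import snowflake.connector  # assumes snowflake-connector-python[pandas] is installed

def load_orders(conn) -> pd.DataFrame:
    """Pull raw order rows from a staging schema into a DataFrame."""
    cur = conn.cursor()
    cur.execute("SELECT order_id, customer_id, amount, order_ts FROM staging.orders")
    return cur.fetch_pandas_all()  # Snowflake returns upper-case column names

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleansing before rows are handed to a downstream model."""
    df = df.dropna(subset=["ORDER_ID", "CUSTOMER_ID"])              # drop incomplete keys
    df["AMOUNT"] = pd.to_numeric(df["AMOUNT"], errors="coerce").fillna(0.0)
    df["ORDER_TS"] = pd.to_datetime(df["ORDER_TS"], errors="coerce")
    return df.drop_duplicates(subset=["ORDER_ID"])

def score(df: pd.DataFrame) -> pd.DataFrame:
    """Placeholder for the downstream lightweight ML scoring step."""
    df["high_value"] = df["AMOUNT"] > df["AMOUNT"].quantile(0.9)
    return df

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",   # placeholder credentials
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )
    try:
        print(score(cleanse(load_orders(conn))).head())
    finally:
        conn.close()
```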
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As a Data Engineer at Guidehouse, you will have the opportunity to lead and execute data engineering projects, ensuring timely delivery and high quality. You will be responsible for building and optimizing data architectures for operational and analytical purposes, collaborating with cross-functional teams to gather and define data requirements. Additionally, you will implement data quality, data governance, and data security practices while managing and optimizing cloud-based data platforms such as Azure and AWS.

Your role will involve developing and maintaining Python/PySpark libraries for data ingestion, processing, and integration with both internal and external data sources. You will design and optimize scalable data pipelines using Azure Data Factory and Spark (Databricks), working closely with stakeholders to address data-related technical issues and support their data infrastructure needs. As a mentor to junior data engineers, you will guide best practices in data engineering and evaluate and integrate new technologies and tools to improve data infrastructure. Ensuring compliance with data privacy regulations and monitoring performance across the data ecosystem will also be key responsibilities in this role.

To qualify for this position, you should have a Bachelor's or Master's degree in computer science, information systems, statistics, math, engineering, or a related discipline. A minimum of 10+ years of hands-on experience in data engineering and cloud services is required, along with experience in leading and mentoring team members. Proficiency in Azure Data Factory, Databricks, Python, and PySpark is essential, as well as familiarity with modern data storage concepts like data lake and lake house. Experience in other cloud services such as AWS and data processing technologies will be advantageous, along with the ability to enhance, develop, and resolve defects in ETL processes using cloud services. Strong communication skills, problem-solving abilities, and a self-starter mindset are desirable traits for this role. Additionally, experience in different cloud providers, programming, and DevOps would be considered nice-to-have qualifications.

Guidehouse offers a comprehensive total rewards package, including competitive compensation and a flexible benefits package to create a diverse and supportive workplace environment. Guidehouse is an Equal Opportunity Employer and will consider qualified applicants with criminal histories in accordance with applicable law. If you need accommodation during the application process, please contact Guidehouse Recruiting. Remember to be cautious of unauthorized correspondence related to job opportunities and report any suspicious activities to Guidehouse's Ethics Hotline. Your privacy and security are important to us.
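A minimal PySpark sketch of the kind of pipeline the posting references: reading raw files from a data lake, standardising types, adding a simple data-quality flag, and writing curated output. The storage paths, column names, and file format are assumptions; on Databricks this would typically run in a notebook orchestrated by Azure Data Factory.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() is a no-op there.
spark = SparkSession.builder.appName("claims_ingest").getOrCreate()

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/claims/"        # placeholder lake paths
CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/claims/"

# Read raw CSV drops, standardise types, and keep only rows with a valid key.
raw = (spark.read.option("header", True).csv(RAW_PATH)
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
       .filter(F.col("claim_id").isNotNull()))

# Simple data-quality flag so downstream consumers can spot suspect rows.
curated = raw.withColumn(
    "dq_amount_ok", F.col("claim_amount").isNotNull() & (F.col("claim_amount") >= 0)
)

# Write as partitioned Parquet (Delta would be the usual choice on Databricks).
curated.write.mode("overwrite").partitionBy("claim_date").parquet(CURATED_PATH)
```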
Posted 3 days ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Data Scientist in our team, you will be responsible for analyzing and interpreting data to identify trends and insights. Your role will involve developing and implementing data-driven solutions to address various business challenges. You will be expected to collect, clean, and transform data from multiple sources, as well as build and train machine learning models for prediction, classification, and other tasks. Communication is key in this role, as you will need to effectively communicate your findings and recommendations to stakeholders in a clear and concise manner. It is crucial to stay updated with the latest trends and technologies in the field of data science. You should also be capable of performing exploratory data analysis of raw data and conducting feature engineering when necessary. Proficiency in project tracking tools like JIRA or equivalent is required.

Your competencies and skills should include excellent project management abilities encompassing planning, budgeting, resource allocation, and risk management. Proficiency in programming languages such as Python and R is essential. Basic knowledge of cloud platforms like AWS and Azure is preferred. Experience in forecasting using tools like SAP, Oracle, Power BI, and Qlik would be advantageous. Proficiency in Excel, including Power Pivot, Power Query, Macros, and Charts, is expected.

Specific expertise for this role includes experience working in the automotive industry, particularly with quality assurance methodologies. Strong analytical and problem-solving skills are essential. You should be able to work both independently and collaboratively as part of a team. Excellent presentation and communication skills, both written and verbal, are necessary. The ability to problem-solve in an environment with unclear requirements and experience in Agile environments are also valuable assets for this position.
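To make the workflow concrete, here is a minimal, self-contained sketch of exploratory analysis, light feature engineering, and a classification model in scikit-learn. The synthetic quality-inspection dataset and feature names are invented for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for production data -- real work would start from collected, cleaned sources.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "torque": rng.normal(50, 5, 1_000),
    "temperature": rng.normal(80, 10, 1_000),
    "cycle_time": rng.normal(30, 3, 1_000),
})
df["defect"] = ((df["torque"] < 43) | (df["temperature"] > 95)).astype(int)

# Quick exploratory summary before any modelling.
print(df.describe())
print(df["defect"].value_counts(normalize=True))

# Simple feature engineering: an interaction term.
df["torque_per_cycle"] = df["torque"] / df["cycle_time"]

X = df.drop(columns="defect")
y = df["defect"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

# Train a baseline classifier and report precision/recall on held-out data.
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```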
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
punjab
On-site
Optimum Data Analytics is a strategic technology partner that specializes in delivering reliable turnkey AI solutions. With a streamlined approach to development, we ensure high-quality results and client satisfaction, bringing clarity and experience to organizations. Our diverse team comprising statisticians, computer science engineers, data scientists, and product managers is dedicated to powering every human decision with analytics & AI. We aim to revolutionize how AI/ML is embraced in the service sector and deliver impactful outcomes. We offer best-in-class services that not only increase profits for businesses but also enhance value for customers. Our solutions help businesses grow, transform, and achieve their objectives effectively.

Position: Data Engineer (Azure)
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune or Mohali

**Must Have:**
- Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps
- ETL-SSIS, ADF, Synapse, SQL Server, Azure SQL
- Data Transformation, Modeling, Ingestion, and Integration
- Microsoft Certified: Azure Data Engineer Associate

**Required Skills And Experiences:**
- 5-8 years of experience as a Data Engineer focusing on Azure cloud services
- Bachelor's degree in Computer Science, Information Technology, or related field
- Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and Azure Storage
- Proficient in SQL, including data modeling, complex queries, and performance optimization
- Ability to work independently, managing multiple tasks simultaneously
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines (Azure DevOps)
- Knowledge of Data Lake Architecture, Data Warehousing, and Data Modeling principles
- Experience with RESTful APIs, Data APIs, and event-driven architecture
- Understanding of data governance, lineage, security, and privacy best practices
- Strong problem-solving, communication, and collaboration skills

If you possess the required skills and experiences mentioned above, we encourage you to apply for the position of Data Engineer (Azure) with us to be a part of our dynamic team at Optimum Data Analytics.
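As one hedged illustration of the ingestion and API-integration skills listed above, the sketch below pulls records from a REST endpoint and lands them in an Azure SQL staging table with pandas and SQLAlchemy. The endpoint, payload shape, and connection string are placeholders; a production pipeline would more likely run in Azure Data Factory or Synapse.

```python
import pandas as pd
import requests
from sqlalchemy import create_engine

# Placeholder endpoint and Azure SQL connection -- substitute real values; keep secrets in a key vault.
API_URL = "https://api.example.com/v1/shipments"
ENGINE = create_engine(
    "mssql+pyodbc://etl_user:***@myserver.database.windows.net/warehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

def ingest_shipments() -> int:
    """Pull one page of shipment records, normalise them, and land them in a staging table."""
    payload = requests.get(API_URL, timeout=30).json()
    df = pd.json_normalize(payload["items"])          # flatten nested JSON into columns (assumed shape)

    # Light transformation/typing before load.
    df["shipped_at"] = pd.to_datetime(df["shipped_at"], errors="coerce")
    df = df.dropna(subset=["shipment_id"])

    df.to_sql("stg_shipments", ENGINE, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"Loaded {ingest_shipments()} rows")
```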
Posted 3 days ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As the Chief Architect AI at Honeywell Technology Solutions (HTS), you will play a crucial role in driving breakthrough advancements in software development, system engineering, and emerging technologies such as artificial intelligence (AI), machine learning (ML), and cybersecurity. With a global team of over 5,500 engineers spread across multiple countries, you will lead the development, articulation, and implementation of a comprehensive AI/ML/Data Transformation Roadmap aligned with the overall business objectives of HTS. Your key responsibilities will include providing strategic leadership and direction in AI projects, driving performance output to meet KPI metrics, reviewing and prioritizing projects, collaborating with external partners and industry leaders, staying updated with the latest advancements in AI/ML, and enabling productivity gains across functions with the usage of AI technology tools. Additionally, you will work with engineering teams for cost optimization and cycle time reduction, evaluate AI tools/technologies/frameworks, and drive cross-SBG and virtual community on AI adoption and capability building for HTS. To be successful in this role, you must have proven experience and technology leadership in implementing AI/ML technologies for successful product and project execution. A Doctorate Degree (PhD) in Engineering (Electrical, Computer Science, Aerospace, Mechanical Engineering) and 10 years of experience in innovation and leadership of technology development are highly valued. Experience in developing technology or products utilizing ML and AI, ability to establish a strategy and execution plan in an ambiguous environment, hands-on technical experience in AI/ML/Complex Problem Solving, and strong technology development practices are also important. Furthermore, your ability to lead by example, collaborate effectively, maintain a big picture perspective, demonstrate personal accountability, operate with energy and enthusiasm, display personal courage and commitment, and empower others will be crucial in driving innovation, agility, and speed within the organization. Join Honeywell Technology Solutions and be a part of a global engineering team that is recognized for innovation, execution excellence, and technical leadership in delivering intuitive, reliable, and future-ready solutions across various business segments. Honeywell is committed to helping organizations solve complex challenges in automation, aviation, and energy transition, making the world smarter, safer, and more sustainable through actionable solutions and innovation.,
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly skilled and experienced Senior Tableau Developer to be a part of our expanding data visualization team. You should possess deep expertise in Tableau Desktop and Server, along with a solid analytical mindset to develop engaging and insightful dashboards and visualizations that effectively present statistical data. Your role will be pivotal in converting intricate data into actionable insights for business stakeholders.

Your responsibilities will include designing, developing, and maintaining interactive Tableau dashboards and visualizations, integrating appropriate statistical displays. You will be required to connect to various data sources like databases, spreadsheets, and cloud platforms, perform data modeling and transformation within Tableau, and create complex calculations and fields for dynamic statistical analysis. Additionally, collaborating with business stakeholders to gather requirements and translating them into effective visualizations will be essential. You should be well-versed in selecting and implementing suitable statistical methods and visualizations within Tableau, providing training and support to end-users, and keeping abreast of the latest Tableau features and best practices. Contributing to the development of data visualization standards and guidelines, optimizing Tableau dashboards, troubleshooting Tableau-related issues, and mentoring junior developers will also be part of your role.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Statistics, or related field.
- 5+ years of experience in developing Tableau dashboards with a focus on statistical data representation.
- Advanced proficiency in Tableau Desktop and Server, statistical functions, and chart types.
- Strong understanding of statistical concepts, data warehousing, and relational databases.
- Experience in data modeling, ETL processes, and excellent analytical skills.
- Ability to effectively communicate technical concepts to non-technical audiences.
- Experience with other data visualization tools like Power BI, Qlik Sense is a plus.
- Tableau certification is preferred.

If you require a reasonable accommodation due to a disability to use our search tools or apply for a career opportunity, please review Accessibility at Citi.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
This position requires a minimum of 4 years of experience with Robot Framework and Python. Your responsibilities will include working with libraries such as pandas, openpyxl, numpy, boto3, json, pypyodbc, and sqlite3, and you will need hands-on experience building keywords and test cases in Robot Framework. Additionally, you should be familiar with DevOps activities, pipeline design, and architecture.

Your role will involve implementing data transformation techniques such as filtering, aggregation, enrichment, and normalization. You will also be responsible for configuring and deploying pipelines, managing failures, and tracking pipeline execution through logging practices.

Experience with GitHub and version control concepts is essential: you must have a good understanding of Git commands, branching and merging strategies, remote repositories (e.g., GitHub, GitLab, Bitbucket), and troubleshooting techniques. Solid knowledge of AWS, specifically S3 file upload and download functionality, is also required for this position.
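A small illustrative sketch of a custom Robot Framework keyword library of the kind described above, combining boto3 for S3 upload/download with pandas/openpyxl for an Excel check. The bucket name, file paths, and keyword names are assumptions.

```python
import boto3
import pandas as pd

class DataKeywords:
    """Custom keyword library for Robot Framework test suites (illustrative only)."""
    ROBOT_LIBRARY_SCOPE = "SUITE"

    def __init__(self, bucket: str = "example-test-bucket"):
        self._s3 = boto3.client("s3")      # assumes AWS credentials are configured in the environment
        self._bucket = bucket

    def upload_file_to_s3(self, local_path: str, key: str) -> None:
        """Keyword: Upload File To S3 -- pushes a local artifact to the test bucket."""
        self._s3.upload_file(local_path, self._bucket, key)

    def download_file_from_s3(self, key: str, local_path: str) -> None:
        """Keyword: Download File From S3 -- fetches a file for later validation steps."""
        self._s3.download_file(self._bucket, key, local_path)

    def excel_row_count_should_be(self, path: str, expected: int) -> None:
        """Keyword: Excel Row Count Should Be -- reads the sheet via pandas/openpyxl and asserts."""
        actual = len(pd.read_excel(path))
        if actual != int(expected):
            raise AssertionError(f"Expected {expected} rows, found {actual}")
```

In a `.robot` suite this library would be pulled in with `Library    DataKeywords.py`, after which the methods are callable as keywords such as `Excel Row Count Should Be`.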
Posted 4 days ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As an experienced Data Engineer, you will be responsible for handling data transformation and ETL processes on large datasets. With over 10 years of experience in this field, you will design customer-centric datasets including CRM, Call Center, Marketing, Offline, and Point of Sale data. Your expertise in Data Modeling, including Relational, Dimensional, Columnar, and Big Data models, with 5+ years of experience will be crucial for this role. Your proficiency in complex SQL or NoSQL, along with advanced Data Warehouse concepts, will be essential in ensuring efficient data processing. Experience with industry-standard ETL tools such as Informatica and Unifi will be beneficial. You will also be involved in defining business requirements, structured analysis, process design, and use case documentation. Your experience with Reporting Technologies like Tableau and PowerBI, coupled with professional software development skills, will be utilized in delivering high-quality solutions. Strong organizational skills and the ability to handle multiple customer projects simultaneously are key requirements for this position. Effective verbal and written communication skills are necessary as you will be interacting with the Sales team and leading customers towards successful outcomes. Being self-managed, proactive, and customer-focused are qualities that will be highly valued in this role. A degree in Computer Science, Information Systems, Data Science, or related field is required. Special consideration will be given if you have experience and knowledge of Adobe Experience Cloud solutions, Digital Analytics, Digital Marketing, programming languages like Python, Java, or Bash scripting, and Big Data technologies such as Hadoop, Spark, Redshift, Snowflake, Hive, and Pig. Previous experience as an enterprise technical or engineer consultant will be an advantage in this position.,
Posted 4 days ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
RSM is the leading provider of professional services to the middle market globally, with a purpose to instill confidence in a world of change, empowering our clients and people to realize their full potential. Our exceptional team is the key to our unrivaled, inclusive culture and talent experience, making us compelling to our clients. You will find an environment that inspires and empowers you to thrive both personally and professionally. There is no one like you and that's why there's nowhere like RSM.

We are looking for an experienced Hands-On Technical Manager with expertise in big data technologies and multi-cloud platforms to lead our technical team for the financial services industry. The ideal candidate will possess a strong background in big data architecture, cloud computing, and a deep understanding of the financial services industry. As a Technical Manager, you will be responsible for leading technical projects, hands-on development, delivery management, and sales, ensuring the successful implementation of data solutions across multiple cloud platforms. This role requires a unique blend of technical proficiency, sales acumen, and presales experience to drive business growth and deliver innovative data solutions to our clients.

Responsibilities:
- Provide technical expertise and guidance on the selection, hands-on implementation, and optimization of big data platforms, tools, and technologies across multiple cloud environments (e.g., AWS, Azure, GCP, Snowflake, etc.).
- Architect and build scalable and secure data pipelines, data lakes, and data warehouses to support the storage, processing, and analysis of large volumes of structured and unstructured data.
- Lead and mentor a team of technical professionals in the design, development, and implementation of big data solutions and data analytics projects within the financial services domain.
- Stay abreast of emerging trends, technologies, and industry developments in big data, cloud computing, and financial services, assessing their potential impact on the organization.
- Develop and maintain best practices, standards, and guidelines for data management, data governance, and data security in alignment with regulatory requirements and industry standards.
- Collaborate with the sales and business development teams to identify customer needs, develop solution proposals, and present technical demonstrations and presentations to prospective clients.
- Collaborate with cross-functional teams including data scientists, engineers, business analysts, and stakeholders to define project requirements, objectives, and timelines.

Basic Qualifications:
- Bachelor's degree or higher in Computer Science, Information Technology, Business Administration, Engineering, or related field.
- Minimum of ten years of overall technical experience in solution architecture, design, hands-on development with a focus on big data technologies, multi-cloud platforms, and at least 5 years of experience specifically in financial services.
- Strong understanding of the financial services industry - capital markets, retail and business banking, asset management, insurance, etc.
- In-depth knowledge of big data technologies such as Hadoop, Spark, Kafka, and cloud platforms such as AWS, Azure, GCP, Snowflake, Databricks, etc.
- Experience with SQL, Python, Pyspark, or other programming languages used for data transformation, analysis, and automation.
- Excellent communication, presentation, and interpersonal skills, with the ability to articulate technical concepts to both technical and non-technical audiences.
- Hands-on experience extracting (ETL using CDC, Transaction Logs, Incremental) and processing large data sets for Streaming and Batch data loads.
- Ability to work from the Bengaluru/Hyderabad, India office at least twice a week.

Preferred Qualifications:
- Professional certifications in cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified Azure Solutions Architect, Azure Data Engineer, SnowPro Core) and/or big data technologies.
- Experience with Power BI, Tableau, or other Reporting and Data Visualization tools.
- Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code tools.

Education/Experience:
- Bachelor's degree in MIS, CS, Engineering, or equivalent field.
- Master's degree in CS or MBA is preferred.
- Advanced Data and Cloud Certifications are a plus.

At RSM, a competitive benefits and compensation package is offered for all employees. There is flexibility in your schedule, empowering you to balance life's demands while also serving clients. Learn more about the total rewards at https://rsmus.com/careers/india.html.

Accommodation for applicants with disabilities is available upon request in connection with the recruitment process and/or employment/partnership. RSM is committed to providing equal opportunity and reasonable accommodation for people with disabilities. If you require a reasonable accommodation to complete an application, interview, or otherwise participate in the recruiting process, please send an email to careers@rsmus.com.
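For the incremental/batch loading experience called out in the basic qualifications, here is a hedged PySpark sketch of a watermark-based incremental load from a JDBC source into a data lake. The JDBC URL, table, and lake path are invented; a CDC- or Delta Lake-based approach would be an equally valid implementation.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental_load").getOrCreate()

SOURCE_JDBC = "jdbc:sqlserver://core-banking:1433;databaseName=txn"   # placeholder source
TARGET_PATH = "s3://example-lake/curated/transactions/"               # placeholder lake location

# 1. Find the high-water mark already loaded into the lake (fall back to epoch on the first run).
try:
    last_loaded = (spark.read.parquet(TARGET_PATH)
                   .agg(F.max("updated_at").alias("wm"))
                   .collect()[0]["wm"])
except Exception:                      # no target data yet
    last_loaded = "1970-01-01 00:00:00"

# 2. Pull only rows changed since the watermark; the filter is pushed down to the source as a subquery.
query = f"(SELECT * FROM dbo.transactions WHERE updated_at > '{last_loaded}') AS delta"
delta = (spark.read.format("jdbc")
         .option("url", SOURCE_JDBC)
         .option("dbtable", query)
         .option("user", "etl_user").option("password", "***")
         .load())

# 3. Append the delta, partitioned by load date, so each batch run only adds new data.
(delta.withColumn("load_date", F.current_date())
      .write.mode("append").partitionBy("load_date").parquet(TARGET_PATH))
```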
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
nagpur, maharashtra
On-site
As a Power BI Developer, you will be responsible for understanding business requirements in the BI context and designing data models to transform raw data into meaningful insights. You will create dashboards and interactive visual reports using Power BI, identifying key performance indicators (KPIs) and consistently monitoring them. Your role will involve analyzing data and presenting it through reports that aid decision-making. Additionally, you will be converting business requirements into technical specifications, creating relationships between data, and developing tabular and multidimensional data models. Chart creation and data documentation explaining algorithms, parameters, models, and relations will also be part of your responsibilities.

To excel in this role, you should possess a Bachelor's degree in Computer Science, Business Administration, or a related field, along with a minimum of 6 to 8 years of experience in visual reporting development. You must have at least 6 years of Power BI development experience, expertise in SQL Server, and excellent Microsoft Office skills, including advanced Excel skills. Strong analytical, quantitative, problem-solving, and organizational skills are essential, along with attention to detail and the ability to coordinate multiple tasks, set priorities, and meet deadlines. Apply now for the Power BI Developer position in Nagpur/Pune if you are passionate about creating impactful data visualizations and driving insights through analytics.

If you are an experienced professional with 6 to 8 years in the field of Business Intelligence, consider applying for the Power BI Lead role in Nagpur/Pune. You will be responsible for understanding business requirements, creating dashboards and visual reports, identifying key KPIs, and analyzing data to aid decision-making. Data cleansing, data quality processes, and developing data models will be key aspects of your responsibilities. Your skills in Analysis Services, building Tabular & Multidimensional models, and Power BI development experience will be crucial for success in this role.

For those with 8+ years of experience in Business Intelligence and a proven track record as a Power BI Architect, we have an exciting opportunity in Nagpur/Pune. As a Power BI Architect, your responsibilities will include collaborating with business stakeholders to understand reporting and analytics requirements, designing end-to-end Power BI solutions, developing data integration pipelines, and creating visually appealing reports and dashboards. Performance optimization and enhancing user experiences will be key focus areas in this role.

If you are passionate about innovation, growth, and high-impact careers, and possess the required skills and experience, we invite you to apply for the Power BI Architect position and be part of a dynamic team that thrives on learning and development opportunities. Join us in creating a collaborative work environment that fosters growth and success for all team members.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
haryana
On-site
As an Assistant Vice President, Data Engineering Expert at Analytics & Information Management (AIM) in Gurugram, you will play a crucial role in leading the Data/Information Management Team. Your responsibilities will include driving the development and implementation of data analytics solutions to support key business objectives for Legal Operations as part of the COO (Chief Operating Office). You will be expected to build and manage high-performing teams, deliver impactful insights, and foster a data-driven culture within the organization. In this role, you will be responsible for supporting Business Execution, Legal Data & Reporting activities for the Chief Operating Office by implementing data engineering solutions to manage banking operations. This will involve establishing monitoring routines, scorecards, and escalation workflows, as well as overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. Additionally, you will be required to enable proactive issue detection, implement a governance framework, and interface between business and technology partners for digitizing data collection. You will also need to communicate findings and recommendations to senior management, stay updated with the latest trends in analytics, ensure compliance with data governance policies, and set up a governance operating framework to enable operationalization of data domains. To excel in this role, you should have at least 8 years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like Python, PySpark, Tableau, MicroStrategy, and SQL. Strong understanding of Data Transformation, Data Strategy, Data Architecture, Data Tracing & Lineage, and Database Management & Optimization will be essential. Additionally, experience in AI solutions, banking operations, and regulatory requirements related to data privacy and security will be beneficial. A Bachelor's/University degree in STEM is required for this position, with a Master's degree being preferred. Your ability to work as a senior member in a team of data engineering professionals and effectively manage end-to-end conceptualization & implementation of data strategies will be critical for success in this role. If you are excited about the opportunity to lead a dynamic Data/Information Management Team and drive impactful insights through data analytics solutions, we encourage you to apply for this position and be a part of our talented team at AIM, Gurugram.,
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
haryana
On-site
At PwC, our team in audit and assurance focuses on providing independent and objective assessments of financial statements, internal controls, and other assurable information to enhance credibility and reliability for stakeholders. We evaluate compliance with regulations, assess governance and risk management processes, and related controls. As part of the data, analytics, and technology solutions team, you will assist clients in developing solutions that build trust, drive improvement, and detect, monitor, and predict risks. Your work will involve utilizing advanced analytics, data wrangling technology, and automation tools to leverage data and establish the right processes for clients to make efficient decisions based on accurate and trustworthy information. You are expected to be driven by curiosity and be a reliable team member in a fast-paced environment. Working with various clients and team members will present different challenges and scope, providing opportunities for learning and growth. Taking ownership and consistently delivering quality work that adds value for clients and contributes to team success is crucial. Building a personal brand within the firm will open doors to more opportunities for you. As an Associate, your responsibilities include designing and developing ways to automate and reimagine audits, implementing innovative technologies such as Alteryx, SQL, Python, Power BI, and PowerApps. You will develop a strong understanding of the role of data and analytics in modern audits and work on technical assignments to enhance skills in data analytics and visualization. Client engagements, data management, analytics and reporting, advanced analytics, and building relationships with engagement teams and clients are key aspects of your day-to-day responsibilities. Preferred qualifications for this role include a Bachelor's or Master's degree in Computer Science, Data Analytics, or Accounting with a minimum of 1 year of relevant experience. Candidates with Big 4 or equivalent experience are preferred. Essential skills required include market credentials in data & analytics, stakeholder management, project management, analytical and problem-solving capabilities, and a long-term career ambition at PwC. Desirable skills include finance process knowledge, audit experience, use of technology in data & analytics, and experience working in financial reporting, financial accounting, regulatory compliance, or internal audit. Technical skills needed for this role encompass data transformation and modeling, data storage and querying, data visualization, understanding data quality issues, data cleansing, robotics, finance/accounting understanding, and knowledge of current data science software platforms.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as an Analyst- Finance Transformation at Barclays, where you will be involved in functional design, data, end-to-end-process and controls, delivery, and functional testing. Spearheading the evolution of Barclays' digital landscape, you will drive innovation and excellence, harnessing cutting-edge technology to revolutionize digital offerings for unparalleled customer experiences.

To be successful in this role, you should have the ability to support the development of data transformation workflows leveraging Alteryx, Teradata, or any SQL database. You must also be adept at supporting development in business intelligence tools like Tableau, SAC, etc. Providing design solutions for internal reporting problem statements and business requirements with quick delivery using tactical solutions, and connecting with the strategic roadmap, is essential. Acting as a Business Analyst, you will support the function from a strategic viewpoint, delivering MI views that enable analytics and support quick decision-making. Supporting the business on an agile basis in delivering critical requirements in a dev ops model is a key aspect. Building innovative dashboards on a sprint basis with a key focus on controls and governance structure, as well as visually enhancing an analytical view from the legacy excel/PPT model, are crucial responsibilities. Adherence to all IR Controls and developing and implementing robust controls mechanisms in all managed processes is a key requirement.

Highly valued skills may include knowledge in data transformation tools like Alteryx, SQL databases, and business intelligence platforms like Tableau with data management experience. Additionally, experience in designing MI dashboards and insights, along with broad business and industry knowledge and experience, are beneficial. The role will be based out of Chennai.

Purpose of the role:
To develop business capabilities for Finance through key stages of functional design, data, end-to-end-process, controls, delivery, and functional testing.

Accountabilities:
- Functional Design: Leveraging best practice concepts, supporting options analysis, and recommendations in collaboration with Line SMEs.
- Data Analysis/Modelling/Governance: Designing a conceptual data model, governance requirements, and aligning with GDMS standards and principles.
- End-to-End Process & Controls: Developing target process, controls design/documentation, and aligning with organizational and role/service model design definitions.
- Delivery/Implementation Support: Updating design/functional requirements, resolving RAIDS, and project management for change programs.
- Functional Testing: Developing scripts and data for testing alignment to requirement definitions.

Analyst Expectations:
- Performing activities in a timely and high standard consistently, driving continuous improvement.
- Requires in-depth technical knowledge and experience in the assigned area of expertise.
- Leading and supervising a team, guiding professional development, and coordinating resources. If the position has leadership responsibilities, demonstrating leadership behaviours to create an environment for colleagues to thrive. OR for an individual contributor, developing technical expertise and acting as an advisor.
- Partnering with other functions and business areas, taking responsibility for end results of operational processing and activities.
- Escalating breaches of policies/procedures, advising and influencing decision-making.
- Taking ownership for managing risk, strengthening controls, and delivering work in line with rules, regulations, and codes of conduct.
- Demonstrating understanding of how own sub-function integrates with function and organization's products, services, and processes.
- Resolving problems, guiding team members, and communicating complex/sensitive information.
- Acting as a contact point for stakeholders, building a network of contacts outside the team and external to the organization.

All colleagues are expected to demonstrate Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship and the Barclays Mindset of Empower, Challenge, and Drive.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
As a Data Warehouse Developer, you will play a crucial role in designing, building, and enhancing our client's online platform. Your responsibilities will include researching, suggesting, and implementing new technology solutions in line with best practices and standards. You will also be accountable for ensuring the resiliency and availability of various products while actively contributing to the team's productivity.

Your expertise should encompass over 7 years of practical experience in designing and constructing functions and stored procedures using Oracle Data Integrator (ODI). You will be tasked with creating data warehouse schemas, including fact and dimension tables, and documenting them comprehensively. Collaborating with DBAs, you will develop and execute table creation scripts, analyze diverse data sources to establish data relationships as per Business Requirements Documents (BRDs), and possess a deep understanding of data quality, ETL/ELT processes, and common transformation patterns.

Furthermore, your role will involve designing and implementing ELT workflows to load data from source systems into staging environments, and subsequently into target models leveraging ODI. Conducting data validation using advanced SQL queries and data profiling techniques will be a key aspect of your responsibilities. Demonstrating a solid grasp of data governance concepts, tools, and best practices, you will be adept at data quality analysis, including assessing accuracy, completeness, and consistency through query composition.

Your skill set should encompass strong analytical capabilities, effective research skills, and adept problem-solving abilities. Excellent written and verbal communication skills are essential, along with the flexibility to work in rotational shifts. In return, you can look forward to working in a challenging and innovative environment that offers ample opportunities for learning and growth.
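Since the role emphasises validating loads with SQL and data profiling, the following Python sketch profiles a dimension table for completeness and duplicate surrogate keys. The python-oracledb driver, credentials, and table are assumptions; in practice the same checks are often written directly in SQL or inside ODI.

```python
import oracledb   # assumes the python-oracledb driver; credentials below are placeholders
import pandas as pd

QUERY = """
SELECT customer_key, customer_name, country_code
FROM   dw.dim_customer
"""

def profile_dimension() -> None:
    with oracledb.connect(user="dw_reader", password="***", dsn="dwhost/dwsvc") as conn:
        df = pd.read_sql(QUERY, conn)   # Oracle returns upper-case column names

    # Completeness and cardinality per column -- quick accuracy/consistency checks.
    print(pd.DataFrame({
        "non_null_pct": (df.notna().mean() * 100).round(2),
        "distinct_values": df.nunique(),
    }))

    # Surrogate keys should be unique; report any duplicates.
    print("duplicate keys:", int(df["CUSTOMER_KEY"].duplicated().sum()))

if __name__ == "__main__":
    profile_dimension()
```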
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
ahmedabad, gujarat
On-site
You will play a key role as a SAS Visual Analytics (VA) Developer within our team, where you will be tasked with crafting interactive dashboards, generating reports, and harnessing the power of SAS Visual Analytics to deliver impactful data visualization and actionable insights. Your primary focus will be on collaborating with stakeholders to produce user-friendly reports and dashboards that drive data-informed decision-making. Your responsibilities will include designing and implementing dynamic SAS VA reports and dashboards that align with business needs, creating engaging data visualizations that empower users to delve into insights effortlessly, utilizing SAS programming for data manipulation, analysis, and customization of reporting solutions, fostering collaboration with business analysts and cross-functional teams to ensure the deliverables meet organizational requirements, optimizing performance of reports and dashboards for scalability and ease of use, and providing continuous training and support to end-users for efficient utilization of reports. To excel in this role, you must possess a strong background in SAS Visual Analytics for developing reports and dashboards, proficiency in SAS programming for data transformation and crafting custom reports, a deep understanding of data visualization techniques and industry best practices, as well as exceptional communication and teamwork skills.,
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are not the person who settles for just any role. Neither are we. Because we are out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference. Here, you will bring your professional expertise, talent, and drive to building and managing our portfolio of iconic, ground-breaking brands. In this role, you will help deliver better care for billions of people around the world. It all starts with YOU.

As a Data Transformation Senior Consultant, your primary responsibility will be the transformation of Global People data from various sources into the format required by downstream systems and our Workday HRIS. You will oversee the Workday EIB (ETL) process.

In this role, you will:
- Extract data from multiple sources such as databases, applications, and flat files.
- Map data to define how information from different sources should be matched and transformed to align with the target structure.
- Apply various manipulation techniques like aggregation, filtering, calculation, and normalization to structure the data appropriately.
- Validate data to ensure quality and accuracy post transformations.
- Integrate data from multiple sources into a unified dataset.
- Standardize data by applying consistent formatting and data types across the dataset.
- Document transformation processes meticulously to maintain records of transformation logic, mappings, and data lineage for traceability and troubleshooting.
- Develop transformation workflows using ETL (Extract, Transform, Load) tools and principles, including Workday EIBs.
- Identify and resolve performance bottlenecks within data transformation processes to optimize performance.

Huggies. Kleenex. Cottonelle. Scott. Kotex. Poise. Depend. Kimberly-Clark Professional. You are likely familiar with our legendary brands, as millions of people use Kimberly-Clark products every day. At Kimberly-Clark, you will be part of a team committed to driving innovation, growth, and impact. The company has a rich history of over 150 years of market leadership and is constantly seeking new and improved ways to excel. This is your opportunity at Kimberly-Clark.

You perform at the highest possible level and thrive in a performance culture driven by genuine care. You want to be part of a company that is actively committed to sustainability, inclusion, wellbeing, and career development. You find purpose in your work, especially when it makes a difference. Kimberly-Clark is dedicated to exploring new ideas on achieving results effectively. With Flex That Works, you will have flexible work arrangements that empower you to have meaningful time in the office while collaborating with your leader to make flexibility work for both you and the business. In technical roles at Kimberly-Clark, the focus is on winning with consumers and the market while prioritizing safety, mutual respect, and human dignity.

To excel in this role, the following qualifications are required:
- 5+ years of experience in hands-on data transformations using ETL tools, specifically Workday EIBs.
- Demonstrated problem-solving skills in identifying and resolving data transformation challenges.
- Effective communication of data mapping, transformation, structure, critical requirements, and findings to stakeholders.
- Proven data modeling ability by developing data structures and relationships.
- Knowledge of Power BI, Power Query, and SQL with the ability to create data visualizations.
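As an illustration of the generic transformation steps listed above (mapping, standardising, validating, de-duplicating) ahead of an EIB load, here is a small pandas sketch. The legacy-to-target field mapping and required columns are invented and do not reflect Workday's actual EIB templates.

```python
import pandas as pd

# Invented legacy->target field mapping; a real EIB template defines the actual columns.
FIELD_MAP = {
    "EMP_ID": "Employee_ID",
    "FIRSTNAME": "Legal_First_Name",
    "LASTNAME": "Legal_Last_Name",
    "HIREDATE": "Hire_Date",
    "CTRY": "Country",
}
REQUIRED = ["Employee_ID", "Legal_First_Name", "Legal_Last_Name", "Hire_Date"]

def transform(legacy: pd.DataFrame) -> pd.DataFrame:
    # Map legacy column names to the target structure and keep only mapped fields.
    out = legacy.rename(columns=FIELD_MAP)[list(FIELD_MAP.values())]

    # Standardise formats across sources.
    out["Hire_Date"] = pd.to_datetime(out["Hire_Date"], errors="coerce").dt.strftime("%Y-%m-%d")
    out["Country"] = out["Country"].str.strip().str.upper()

    # Validate: every required field populated; fail loudly so issues are fixed upstream.
    missing = out[REQUIRED].isna().any(axis=1)
    if missing.any():
        raise ValueError(f"{int(missing.sum())} rows are missing required values")

    # De-duplicate on the business key before handing off to the load step.
    return out.drop_duplicates(subset="Employee_ID")

if __name__ == "__main__":
    sample = pd.DataFrame([
        {"EMP_ID": "1001", "FIRSTNAME": "Asha", "LASTNAME": "Rao",
         "HIREDATE": "2021-03-15", "CTRY": "in"},
    ])
    print(transform(sample))
```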
To be considered for this role, click the Apply button to complete the online application process. Our recruiting team will review your application and follow up if you are a great fit for the position. Please visit the careers website for more information.

*The statements above describe the general nature and level of work performed by employees in this classification. They are not intended to be an exhaustive list of all duties, responsibilities, and skills required for this position.*

Employment is subject to verification of pre-screening tests, including drug screening, background check, and DMV check.

**Primary Location:** No KC Work Site - India
**Additional Locations:** Not specified
**Worker Type:** Employee
**Worker Sub-Type:** Regular
**Time Type:** Full time
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a SnapLogic Professional at YASH Technologies, you will be a key member of our team, utilizing your 6-8 years of experience to bring real positive changes in an increasingly virtual world. Your primary responsibilities will revolve around SnapLogic Pipeline Development, with a focus on data analysis, ETL job migration, platform moderation, and cloud exposure on AWS. To excel in this role, you must have a minimum of 3 years of hands-on experience in SnapLogic Pipeline Development, along with strong debugging skills. It is essential to be proficient in SQL, PL/SQL, and RDBMS, as well as have a background in ETL Tools like DataStage and Informatica, with a focus on data quality. Additionally, experience with SnapLogic developer certification and Snowflake is highly desirable. Your day-to-day tasks will involve configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. You will be responsible for designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Furthermore, you will collaborate with business partners to provide long-lasting solutions and stay updated with the latest SnapLogic features and best practices. At YASH Technologies, we offer you an inclusive team environment where you are empowered to create a career that aligns with your goals. Our Hyperlearning workplace is built on the principles of flexible work arrangements, free spirit, emotional positivity, agile self-determination, trust, transparency, open collaboration, and all the support needed for the realization of business goals. Join us for stable employment in a great atmosphere with an ethical corporate culture.,
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As the GIS Tech Lead, you will be responsible for leading our FME and GIS analyst team. You will provide technical guidance on complex projects, design FME workbenches and workflows, and ensure data quality. Your role involves overseeing the design, development, and maintenance of FME workbenches for efficient data processing, and leading the development and review of Python scripts that extend FME workflows and functionality.

In this position, you will mentor and guide analysts in effectively using ArcGIS tools and techniques for data management, analysis, and visualization. You will review FME errors and ArcGIS model schema issues, identifying and correcting critical errors to ensure data quality and integrity. Implementing robust QA/QC procedures will be crucial to ensuring all data products meet the highest standards of accuracy and completeness.

Your responsibilities will also include managing and prioritizing project tasks, assigning workloads to analysts based on skill sets and project requirements, tracking project progress, and ensuring timely delivery of data products within established deadlines. Collaboration with internal stakeholders to understand data needs and develop optimal solutions is a key aspect of this role. To stay current with the latest advancements in FME, ArcGIS, and geospatial technologies, you will need to continuously learn and adapt, and you will document best practices, workflows, and procedures for knowledge sharing and future reference.

The ideal candidate will have at least 8 years of hands-on experience performing spatial analysis and delivering solutions using ArcGIS and FME, along with proven experience leading and mentoring a team of GIS and FME analysts. In-depth knowledge of FME workbenches, data transformation techniques, and Python scripting for automation is essential, as is a strong understanding of ArcGIS Desktop applications and geospatial data models. Experience designing and implementing robust QA/QC procedures for geospatial data, excellent project management skills, solid problem-solving and analytical skills, effective communication and collaboration skills, and experience with Agile development methodologies are all valuable assets in this role.
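As a rough illustration of the QA/QC and Python-automation aspects of this role, the sketch below flags records with missing geometry or mandatory attributes. It deliberately avoids the FME (fmeobjects) and ArcGIS (arcpy) APIs and uses hypothetical attribute names; an equivalent check would normally run inside an FME workspace or a geoprocessing script.

```python
# Pure-Python QA/QC sketch (no fmeobjects or arcpy): flag features with missing
# geometry or mandatory attributes. Attribute names are hypothetical.
from typing import Iterable, List

MANDATORY_ATTRIBUTES = ("asset_id", "asset_type", "install_date")

def qa_check(features: Iterable[dict]) -> List[str]:
    """Return human-readable QA errors for a batch of feature records."""
    errors = []
    for i, feature in enumerate(features):
        if not feature.get("geometry"):
            errors.append(f"feature {i}: missing geometry")
        attributes = feature.get("attributes", {})
        for attr in MANDATORY_ATTRIBUTES:
            if attributes.get(attr) in (None, ""):
                errors.append(f"feature {i}: missing attribute '{attr}'")
    return errors

sample = [{"geometry": None, "attributes": {"asset_id": "A-1", "asset_type": "", "install_date": "2024-01-15"}}]
print(qa_check(sample))  # ['feature 0: missing geometry', "feature 0: missing attribute 'asset_type'"]
```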
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions.

The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements and meet data analysis needs. The position requires good collaboration skills, providing guidance to the team on the analytics aspects of various activities. The role calls for experience in Qlik Sense architecture design and proficiency in load script implementation and best practices; hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting techniques; skill in data integration through ETL processes from various sources and in data transformation, including the creation of QVD files and set analysis; and the ability to model data using dimensional modeling, star schema, and snowflake schema.

The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies.

Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or have equivalent work experience, and effective verbal and written communication skills are essential. A minimum of 3-5 years of experience implementing end-to-end business intelligence using Qlik Sense is required, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. Understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial; technical expertise in Qlik Sense, SQL Server, and data modeling, along with experience with clinical trial data and SDTM standards, is beneficial.

This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
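One common validation task implied above is reconciling a star-schema model against dashboard figures. The pandas sketch below is a hypothetical example of such a check, with made-up table and column names; in practice the same checks are often written directly in SQL against the warehouse.

```python
# Hypothetical star-schema check: every fact row should reference an existing
# dimension key, and aggregates should reconcile with dashboard totals.
import pandas as pd

dim_site = pd.DataFrame({"site_key": [1, 2, 3], "site_name": ["Boston", "Pune", "Tokyo"]})
fact_visits = pd.DataFrame({"site_key": [1, 2, 2, 4], "visit_count": [10, 5, 7, 3]})

# Referential integrity: fact rows whose key is missing from the dimension.
orphans = fact_visits[~fact_visits["site_key"].isin(dim_site["site_key"])]
print(f"orphan fact rows: {len(orphans)}")

# Aggregate reconciliation: totals per site, to compare against the dashboard.
totals = (fact_visits.merge(dim_site, on="site_key", how="inner")
          .groupby("site_name")["visit_count"].sum())
print(totals)
```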
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
At Capgemini Engineering, the global leader in engineering services, a team of engineers, scientists, and architects collaborates to support the world's most innovative companies in realizing their full potential. From cutting-edge technologies such as autonomous vehicles to life-saving robotics, our digital and software technology experts showcase creativity by offering unique R&D and engineering services across diverse industries. Join us for a rewarding career filled with endless opportunities where your contributions can truly make a difference, and each day brings fresh challenges.

In this role, you will design, develop, and maintain ETL processes using Pentaho Data Integration (Kettle) to extract data from various sources such as databases, flat files, APIs, and cloud platforms. You will transform and cleanse data to align with business and technical requirements before loading it into data warehouses, data lakes, or other designated systems, and you will monitor and optimize ETL performance while troubleshooting any issues that arise. You will collaborate closely with data architects, analysts, and business stakeholders to understand data requirements, and uphold data quality, integrity, and security throughout the ETL lifecycle while documenting ETL processes, data flows, and technical specifications.

Within the Industrial Operations Engineering focus, you will develop expertise in your designated area, share knowledge and provide guidance to peers, and interpret clients' needs effectively. You will execute assigned tasks independently or with minimal supervision, identify and resolve problems in straightforward scenarios, contribute actively to teamwork, and engage with customers to deliver value.

Capgemini is a prominent global partner in business and technology transformation, helping organizations accelerate their journey toward a digital and sustainable future. With a diverse and responsible workforce of 340,000 members across 50+ countries, Capgemini leverages its 55+ years of experience to help clients harness the full value of technology. Offering end-to-end services and solutions spanning strategy, design, and engineering, Capgemini excels in AI, generative AI, cloud, and data capabilities, supported by deep industry knowledge and a robust partner ecosystem.
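Pentaho Data Integration transformations are designed graphically in Spoon rather than written as code, but the extract-transform-load pass they perform is conceptually similar to the plain-Python sketch below. The file, table, and column names are hypothetical, and SQLite stands in for the target warehouse.

```python
# Plain-Python equivalent of one extract-transform-load pass; the file, table,
# and column names are hypothetical, and SQLite stands in for the warehouse.
import csv
import sqlite3

def run_etl(csv_path: str, db_path: str) -> int:
    # Extract: read the flat file into dictionaries keyed by column name.
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform/cleanse: drop rows without an ID, normalize casing, cast amounts.
    cleaned = [
        (r["order_id"].strip(), r["country"].strip().upper(), float(r["amount"]))
        for r in rows
        if (r.get("order_id") or "").strip()
    ]

    # Load: write the cleansed rows into the target table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, country TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
    con.commit()
    con.close()
    return len(cleaned)
```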
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
As an Analyst in the JLL Business Services (JBS) Workforce Management (WFM) program, you will play a crucial role in supporting WFM activities and system maintenance. Your primary responsibilities will include conducting data analysis, forecasting, and capacity planning across JBS. You will collaborate with the technical product owner and vendors to maintain and execute change requests for the workforce management platform. Reporting to the Senior Director overseeing the WFM program and Performance Coaching professionals, you will help ensure the right number of skilled resources are available to handle accurately forecasted workloads and deliver quality outcomes.

Your day-to-day tasks will involve obtaining and validating historical data for forecasting, updating and maintaining capacity planners, providing analysis for staffing efficiencies, and developing clear reports and data visualizations for operations. Additionally, you will act as the system administrator for the WFM platform, maintain comprehensive documentation, monitor program adherence, provide training and coaching, and serve as a Subject Matter Expert in WFM for Operations.

Ideal candidates will have 4-7 years of relevant work experience in workforce management or data analytics, along with a Bachelor's degree in a related field. Proficiency in Excel, MS SQL, and business intelligence platforms such as Tableau and Power BI is required, as are strong analytical skills, problem-solving abilities, written and verbal communication skills, attention to detail, and the ability to manage multiple projects simultaneously. Experience in a global company working across cultures is preferred.

The estimated compensation for this position will be based on the market range for the role and location. JLL offers a supportive culture and a comprehensive benefits package prioritizing mental, physical, and emotional health. The hybrid work model under the JBS Flex program allows for 2-4 days in the office, with more during the onboarding period. If this job description resonates with you and you meet most of the requirements, we encourage you to apply. JLL is dedicated to creating a diverse and inclusive culture where all individuals feel welcomed, valued, and empowered to achieve their full potential.
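One widely used way to turn a contact-volume forecast into required headcount, of the kind this role supports, is the Erlang C model. The sketch below is an illustrative implementation under simplified assumptions (steady-state arrivals, no abandonment) and is not necessarily the method used at JLL.

```python
# Illustrative Erlang C staffing calculation (not necessarily JLL's method).
import math

def erlang_c_wait_probability(agents: int, traffic_erlangs: float) -> float:
    """Probability an arriving contact must wait, given n agents and offered load."""
    if agents <= traffic_erlangs:
        return 1.0  # overloaded: every contact waits
    top = (traffic_erlangs ** agents / math.factorial(agents)) * (agents / (agents - traffic_erlangs))
    bottom = sum(traffic_erlangs ** k / math.factorial(k) for k in range(agents)) + top
    return top / bottom

def required_agents(calls_per_hour: float, aht_seconds: float,
                    target_sl: float, threshold_seconds: float) -> int:
    """Smallest agent count meeting the service-level target (e.g. 80% answered in 20s)."""
    traffic = calls_per_hour * aht_seconds / 3600.0  # offered load in erlangs
    agents = max(1, math.ceil(traffic))
    while True:
        p_wait = erlang_c_wait_probability(agents, traffic)
        service_level = 1 - p_wait * math.exp(-(agents - traffic) * threshold_seconds / aht_seconds)
        if service_level >= target_sl:
            return agents
        agents += 1

print(required_agents(calls_per_hour=120, aht_seconds=300, target_sl=0.80, threshold_seconds=20))
```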
Posted 1 week ago
India has seen a significant rise in the demand for data transformation professionals in recent years. With the increasing importance of data in business decision-making, companies across various industries are actively seeking skilled individuals who can transform raw data into valuable insights. If you are considering a career in data transformation in India, here is a comprehensive guide to help you navigate the job market.
India's major technology hubs are known for their thriving tech industries and have a high demand for data transformation professionals.
The average salary range for data transformation professionals in India varies based on experience levels. Entry-level positions typically start at INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
A typical career path in data transformation may include roles such as Data Analyst, Data Engineer, Data Scientist, and Data Architect. As professionals gain experience and expertise, they may progress to roles like Senior Data Scientist, Lead Data Engineer, and Chief Data Officer.
In addition to data transformation skills, professionals in this field are often expected to have knowledge of programming languages (such as Python, R, or SQL), data visualization tools (like Tableau or Power BI), statistical analysis, and machine learning techniques.
As the demand for data transformation professionals continues to rise in India, now is a great time to explore opportunities in this field. By honing your skills, gaining relevant experience, and preparing for interviews, you can position yourself for a successful career in data transformation. Good luck!
Accenture
40005 Jobs | Dublin
Wipro
19416 Jobs | Bengaluru
Accenture in India
16187 Jobs | Dublin 2
EY
15356 Jobs | London
Uplers
11435 Jobs | Ahmedabad
Amazon
10613 Jobs | Seattle, WA
Oracle
9462 Jobs | Redwood City
IBM
9313 Jobs | Armonk
Accenture services Pvt Ltd
8087 Jobs
Capgemini
7830 Jobs | Paris, France