3.0 - 8.0 years
8 - 11 Lacs
Pune
Work from Office
Job purpose
The Service Contract Support Specialist owns a designated service contract base and is responsible for driving the renewal process for those contracts with business stakeholders. They also own any changes occurring within the contract lifecycle: parts updates, change orders, cancellations, and data-quality monitoring. The main objectives are:
- Create pricing and documentation for the assigned contract base, accurately and on time, so that proposals are sent to business stakeholders on schedule, following GSM processes
- Drive contract renewals correctly in CRM with SOLs, with all data and required documentation, so that no delays or errors are incurred during booking
- Own any changes occurring within the contract lifecycle (parts updates, change orders, cancellations, data-quality monitoring), including the first year of newly booked contracts handed over from the Contract Proposal Team

Primary responsibilities
- Develop a clear understanding of the assigned contract base
- Develop and maintain good working relationships with key personnel within relevant Honeywell teams, including Service Operation Leaders, Field Service Managers and other supporting functions (GCC, ISLC, CPT)
- Proactively drive assigned renewals with self and other stakeholders, adhering to renewal task due dates and the RNOC-to-SOL SLAs
- Maintain accurate and timely information in CRM for renewals, including attaching documentation for all stages of the renewal process
- Update opportunity Next Step comments weekly for all renewals in progress and against a CSS renewal milestone
- Provide accurate updates on each contract renewal and any issues during the weekly MOS call with the Service Contract Support Pole Lead
- Escalate issues that may delay the renewal process to the Service Contract Support Pole Lead in a timely manner; do not wait for the next MOS call
- Maintain good knowledge of the renewal process SOP and Work Instructions
- Ensure a renewal opportunity exists and is linked to any renewal case/PSC in progress, and is also linked to the service contract in CRM
- Identify scope for renewal of the designated service contract base and work with the Service Operations Leader to validate that scope during the weekly MOS with the SOL
- Ensure a renewal case is created for each active renewal entitlement in CRM
- Price scope accurately and obtain proposals from other departments (Cyber, HCP, Third Party) when needed for inclusion in the pricing tool
- Ensure pricing matches between the pricing tool and the PSC
- Obtain financial approval for all renewals before issuing the proposal to the Service Operations Leader
- Create accurate proposals and/or other documentation for the Service Operations Leader
- When a customer PO is received, check details on the PO against the pricing tools and proposal, including sold-to party, payment terms and invoicing frequency
- Create an accurate and complete booking package to hand over the renewal for financial booking in CRM and ERP and follow-on activities (critical spare parts setup, third-vendor purchase orders, SOFs and any other special instructions)
- Continuously learn the renewal process, pricing tools and CRM to identify possible improvement areas within the renewal process/tools
- Create and issue the Welcome Packet to the SOL within 7 days of contract booking (excluding exceptions)
- Take part in tools development and UAT when needed, to support enhancements and continuously learn new functionality
- Cover absences for CSS colleagues as and when needed, to keep renewals moving forward
- Ensure in-progress work is handed over to a CSS backup when taking planned leave
- Be involved in the training of new employees, including a buddy system for support with live renewals
- Agree deadlines for tasks/actions required by other stakeholders and track those actions/deadlines/owners via CRM or RAIL
- Continually develop own knowledge and skills to support the current role and career path
- Communicate any changes made to the VRW asset list during booking back to the Asset Support Team, to ensure correct data alignment
- Contact the Service Contract Pole Lead as the first point of contact on any issues or questions
- Proactively drive own IDP, goals and KPIs to meet targets
- Hold quarterly meetings with the Direct Manager to drive own Individual Development Plan
- Use the dashboards available in SF and Power BI to drive renewal tasks to on-time completion
- Drive CSS pricing with the SOL, so that local pricing is not used, excluding agreed countries
- Support standardization in the contract renewal process by developing reusable standard documents such as Standard Operating Procedures (SOPs), Self Learning Packages (SLPs), checklists and guidelines
- Provide technical guidance to other team members on different contract renewal entitlements and steps
- Collect overall contract renewal data, prepare status/progress reports and present them to the GBE team

3. Principal Networks & Contact Links
Internal: Service Contract Pole Operations Manager; Service Contract Support Pole Lead (Matrix Manager and first point of escalation); Service Operation Leaders; Regional Service Operations Managers; Field Service Managers; Global Customer Care; A360 Performance Managers; ISA Managers; Asset Support Team; Contract Proposal Team; ISLC
External: None

4. Supervisory Responsibilities
None

5. Geographic Scope & Travel Requirements
- Located in a central location (Hadapsar, Pune, India), with adherence to the local office working policy
- Typically assigned to a particular pole, handling # service contracts within the pole
- Working hours: afternoon-to-midnight shift (2 PM to 6 PM from the office and 8 PM to 12:00 AM from home); this can change based on organization policy and the pole in which the candidate is working
- Travel not required for primary tasks; on an exception basis for secondary tasks (e.g. training/workshops)

6. Key Performance Measures
- RNOC given to SOL as per the current SLA
- Zero renewal cases without a renewal opportunity
- 100% welcome packets issued where needed, excluding exceptions
- 100% renewal cases for active renewal entitlements
- CPQ adoption as per plan
- PSC rejections due to CSS error: corrective actions
- Weekly update of Next Step comments

1. Education Required
Bachelor's Degree, administrative or technical; OR 3-4 years of Honeywell Process Solutions / LSS experience in similar positions

2. Work Experience Required
- 7-8 years of experience in a process controls/pricing-proposal environment
- 3-4 years of experience in the Honeywell LSS organization (preferred, not required)
- Excellent working knowledge of SFDC, CPQ, SAP, MS Word and MS Excel

3. Technical Skills & Specific Knowledge Required
- Strong math skills, including basic commercial awareness (booking margins, cash flow)
- Basic knowledge of the pricing of service agreements

4. Behavioural Competencies Required
- Able to forge strong internal business relationships and deliver on commitments
- Demonstrates strong commercial awareness
- Excellent interpersonal skills as well as good verbal, written and presentation skills
- Ability to multi-task and prioritise work
- Self-motivated and able to work with minimum supervision
- Demonstrates a high level of planning and organisation skills daily
- Highly customer-focused approach, demonstrating success through a Voice of the Customer approach daily
- Highly self-aware, recognising the impact of approach and behaviours on peers, direct reports, customers and other internal and external contacts
- Ability to work within a remote team and support each other when needed
- Daily demonstration of the Honeywell Behaviours

5. Language Requirements
Fluent in English
Posted 2 weeks ago
7.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Remote
Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.; see the sketch after this listing).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
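To make the SQL expectations in the Must-Have Skills concrete (window functions, deduplication against Snowflake), here is a minimal, hedged sketch using the official Snowflake Python connector. The account details and the raw_orders table and columns are hypothetical placeholders, not taken from the posting.

```python
# Minimal sketch: deduplicate a hypothetical raw_orders table in Snowflake
# with a window function. All identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="my_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)

DEDUP_SQL = """
SELECT order_id, customer_id, order_ts, amount
FROM (
    SELECT o.*,
           ROW_NUMBER() OVER (
               PARTITION BY order_id      -- one row per business key
               ORDER BY order_ts DESC     -- keep the most recent record
           ) AS rn
    FROM raw_orders AS o
) AS dedup
WHERE rn = 1
"""

cur = conn.cursor()
try:
    cur.execute(DEDUP_SQL)
    for row in cur.fetchmany(10):
        print(row)
finally:
    cur.close()
    conn.close()
```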
Posted 2 weeks ago
8.0 - 10.0 years
7 Lacs
Pune
Work from Office
About Tarento: Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Job Details:

Experience: 8+ years of experience, predominantly in data-related disciplines such as data governance, data quality and data cleansing, in the oil and gas or financial services domain

Roles & Responsibilities:
- Experience of working on data management tools such as Alation and MDG
- Demonstrate a deep understanding of the data governance framework and play a key SME role supporting the Data Governance Manager in designing processes for consistent implementation
- Good understanding of data visualization platforms such as Power BI, Tableau or QlikView
- Exposure to data analytics, machine learning and artificial intelligence
- In-depth understanding of procurement, finance and customer business processes
- Solid knowledge of data governance concepts around data definition and catalog, data ownership, data lineage, data policies and controls, data monitoring and data governance forums
- Partner with the business and program teams to document the business data glossary for the assigned domain by capturing data definitions, data standards, data lineage, data quality rules and KPIs; ensure the data glossary always remains up to date by following stringent change governance
- Ensure smooth onboarding for data owners and data stewards by providing them the necessary training to carry out their roles effectively; engage with them on a regular basis to provide progress updates and to seek support to eliminate impediments, if any
- Extensive knowledge of Customer master and Material master data, including integration with upstream and downstream legacy systems
- Ensure adherence to policies related to data privacy, data lifecycle management and data quality management for the assigned data asset
- Build a rapport with business stakeholders, the technology team, the program team and the wider digital solution and transformation team to identify opportunities and areas to make a difference through the implementation of the data governance framework
- Deep knowledge of SAP ERP and associated data structures
- Must have been part of large, multi-year transformational change across multiple geographies and multiple data domains
- Comfortable interacting with senior stakeholders and chairing meetings/trainings related to data governance

Soft Skills: Active listening, communication and collaboration, presentation, problem solving, stakeholder management, project management.
Domain knowledge (Procurement, Finance, Customer), business acumen, critical thinking, storytelling. Awareness of best practices and emerging technologies in the data management and data analytics space.
Posted 2 weeks ago
9.0 - 11.0 years
12 - 16 Lacs
Pune
Work from Office
About Tarento: Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions. We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you'll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.

Job Details:

Experience: 9+ years of experience, predominantly in data-related disciplines such as data governance, SAP master data and data quality, in the oil and gas or financial services domain

Technology: Deep knowledge of SAP ERP and associated data structures

Job Description:
- Coordinate with Data Stewards/Data Owners to enable identification of critical data elements for SAP master data - Supplier/Finance/Bank master
- Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage for the relevant SAP master data
- Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for the relevant SAP master data (Finance, Supplier and Customer master data)
- Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices
- Define and implement data models that align with business needs, and gather requirements for master data structures
- Design scalable and maintainable data models, ensuring data creation through a single source of truth
- Conduct data quality assessments and implement corrective actions to address data quality issues
- Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes
- Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment
- Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements
- Collaborate with the Data Governance Manager to advance the data governance agenda
- Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities
- Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG
- Monitor data governance activities, measure progress, and report on key metrics to senior management
- Conduct training sessions and create awareness programs to promote data governance within the organization
- Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, GL Accounts, etc.

Primary Skills:
- Experience of implementing data governance in an SAP environment, covering both transactional and master data
- Expert knowledge of data governance concepts around data definition and catalog, data ownership, data lineage, data policies and controls, data monitoring and data governance forums
- Ability to influence senior stakeholders and key business contacts to gather and review the requirements for MDG
- Proven experience in driving master data solutioning for an S/4HANA greenfield implementation
- Strong knowledge of SAP peripheral systems and a good understanding of the upstream and downstream impact of master data
- Strong understanding of master data attributes and their impact
- Strong analytical and problem-solving abilities

Soft Skills: Active listening, communication and collaboration, presentation, problem solving, team management, stakeholder management, project management.
Domain knowledge (Procurement, Finance, Customer), business acumen, critical thinking, storytelling. Stay updated with industry trends, best practices and emerging technologies in the data management and data analytics space.
Posted 2 weeks ago
2.0 - 5.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be. HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Cross Controls Business Analyst.

Principal responsibilities
- The role holder will report to their local Functional Manager within the Data and Analytics Office (DAO). He/she will support the design, refinement, implementation, and operation of controls, with a particular focus on deliverables that span the data control suite. This may include areas such as end-user computing, third-party-related data risk management, and automation.
- The role holder will support the Global Work Stream while operating from India, engaging stakeholders and SMEs from Businesses and Functions to understand and assess regulatory requirements and HSBC's Data Management policy requirements.
- The role holder will be responsible for analysis and documentation of data management best practices, including data standards, lineage, quality and process-level controls, and will support key control design decisions. The role holder is expected to apply and uphold HSBC's standards and guidelines at all times.
- Support the delivery of key cross data control initiatives, including supporting operating model changes to achieve the team objectives.
- Develop and maintain key control documentation, and support the execution of end-user computing controls and governance, including working with businesses and functions to ensure policy adherence.
- Run and contribute to delivery pods and internal/external staff supporting control design and operation.
- Drive business and data conversations to close process gaps through business and technology improvement; support the definition of requirements and propose optimizations to controls and processes.

Requirements
- 8+ years of relevant experience.
- Bachelor's or Master's degree from a reputed university with specialization in a numerical discipline and concentration in computer science, information systems or other engineering specializations.
- Detailed knowledge of the data management framework: data governance, data privacy, business architecture and data quality.
- Strong analytical skills with business analysis aptitude. Ability to comprehend an intricate and diverse range of business problems, analyze them with limited or complex data, and provide a feasible solution framework.
- Experience working on Data/Information Management projects of varying complexities.
- Knowledge and understanding of financial services/banking operations in a global bank.
- Understanding of accounting principles, data flows and the regulatory landscape: BCBS 239, B3R, CCAR, IFRS 9, etc.

You'll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Posted 2 weeks ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems
- Implement data quality and validation processes within Ab Initio
- Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance
- Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions; implement best practices for reusable Ab Initio components

Preferred technical and professional experience:
- Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization; conduct performance tuning and troubleshooting as needed
- Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes; participate in design reviews and provide technical expertise to enhance overall solution quality
- Documentation
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion - as part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You will be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional: Technology - Data Management - Data Integration Administration - Informatica Administration
Preferred Skills: Technology - Data Management - Data Integration Administration - Informatica Administration
Posted 2 weeks ago
3.0 - 4.0 years
5 - 6 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Description: Circle K (part of Alimentation Couche-Tard Inc. (ACT)) is a global Fortune 200 company. A leader in the convenience store and fuel space, it has a footprint across 31 countries and territories. The Circle K India Data & Analytics team is an integral part of ACT's Global Data & Analytics Team, and the Data Scientist will be a key player on this team that will help grow analytics globally at ACT. This is a unique opportunity to be a part of an experienced team of data scientists and analysts within a large organization. The Data Scientist is responsible for delivering advanced analytics and insights that drive business results and operational excellence for our dynamic and forward-thinking Merchandise team in Europe. The ideal candidate should possess both technical capability and commercial savviness, and should be able to drive independent analysis as well as work effectively in a group.

About the role: We are looking for an individual who is a proven problem solver with exceptional critical thinking ability. The candidate should have a high sense of curiosity and be comfortable with ambiguity when faced with a difficult challenge. Additionally, the candidate should possess excellent communication skills, the ability to collaborate with others, and the ability to simply and effectively communicate complex concepts to a non-technical audience.

Roles & Responsibilities

Analytics (Data & Insights)
- Evaluate performance of categories and activities, using proven and advanced analytical methods
- Support stakeholders with actionable insights based on transactional, financial or customer data on an ongoing basis
- Oversee the design and measurement of experiments and pilots
- Initiate and conduct advanced analytics projects such as clustering, forecasting and causal impact analysis
- Build highly impactful and intuitive dashboards that bring the underlying data to life through insights

Operational Excellence
- Improve data quality by using and improving tools to automatically detect issues
- Develop analytical solutions or dashboards using user-centric design techniques in alignment with ACT's protocol
- Study industry/organization benchmarks and design/develop analytical solutions to monitor or improve business performance across retail, marketing, and other business areas

Stakeholder Management
- Work with peers, functional consultants, data engineers, and cross-functional teams to lead/support the complete lifecycle of analytical applications, from development of mock-ups and storyboards to complete production-ready applications
- Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business
- Create compelling documentation or artefacts that connect the business to the solutions
- Coordinate internally to share key learnings with other teams and lead to accelerated business performance
- Be an advocate for a data-driven culture among the stakeholders

Job Requirements

Education: A higher degree in an analytical discipline like Finance, Mathematics, Statistics, Engineering, or similar

Relevant Experience
- 3-4 years for Data Scientist
- Relevant working experience in a quantitative/applied analytics role
- Experience with programming, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Spark / SQL / Python
- Excellent communication skills in English, both verbal and written

Behavioural Skills: Delivery excellence, business disposition, social intelligence, innovation and agility

Knowledge
- Functional analytics (retail analytics, supply chain analytics, marketing analytics, customer analytics, etc.)
- Working understanding of statistical modelling and time series analysis using analytical tools (Python, PySpark, R, etc.)
- Knowledge of statistics and experimental design (A/B testing, hypothesis testing, causal inference; a worked sketch follows this listing)
- Practical experience building scalable ML models, feature engineering, model evaluation metrics, and statistical inference
- Enterprise reporting systems and relational (MySQL, Microsoft SQL Server, etc.) database management systems
- Business intelligence & reporting (Power BI)
- Cloud computing services in Azure/AWS/GCP for analytics

#LI-DS1
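The experimentation bullet above (A/B testing, hypothesis testing) usually reduces to comparisons like the following. This is a minimal sketch with SciPy; the conversion counts are invented for illustration.

```python
# Minimal A/B test sketch: compare conversion rates of a control and a
# variant with a chi-squared test of independence. Counts are invented.
from scipy.stats import chi2_contingency

# Rows: [converted, not converted] for control (A) and variant (B)
table = [
    [420, 9580],   # A: 4.20% conversion
    [495, 9505],   # B: 4.95% conversion
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: conversion rates differ at the 5% level")
else:
    print("No significant difference detected")
```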
Posted 2 weeks ago
7.0 - 13.0 years
9 - 15 Lacs
Bengaluru
Work from Office
Required Skills
- Technology | Infrastructure Monitoring Tool - Splunk
- Domain | IT in Banking | Customer Support
- Behavioral | Aptitude | Communication

Education Qualification: Any Graduate (Engineering / Science)
Certification Mandatory / Desirable: Technology | SESC/SE

As a Level 3 Splunk Administrator, you will be responsible for advanced configuration, optimization, and management of Splunk environments for data analytics, log management, and security monitoring. You will lead the development of strategies, provide expert support, and ensure the effectiveness of our Splunk solutions.

Key Responsibilities:
1. Splunk Environment Design and Optimization:
- Lead the design, architecture, and advanced optimization of Splunk Enterprise, Universal Forwarders, and Splunk apps.
- Customize Splunk settings, indexes, and data sources for maximum performance, scalability, and reliability.
2. Data Ingestion and Indexing:
- Design and implement advanced data ingestion strategies from various sources into Splunk, ensuring data quality and reliability.
- Oversee data indexing and categorization for efficient search, analysis, and correlation.
3. Advanced Searches and Alerts:
- Perform complex searches, queries, and correlations in Splunk to retrieve and analyze data (a sketch using Splunk's Python SDK follows this listing).
- Configure advanced alerts, notifications, and incident response workflows for comprehensive security and performance monitoring.
4. Data Analysis and Reporting:
- Utilize advanced data analysis techniques, statistical analysis, and machine learning to derive actionable insights from Splunk data.
- Create advanced reports, dashboards, and predictive analytics for improved data analysis and incident management.
5. Automation and Scripting:
- Develop and maintain advanced automation scripts and apps using Splunk SPL, the REST API, and other relevant technologies to streamline data collection and incident response.
- Implement automation for proactive issue resolution and resource provisioning.
6. Documentation and Knowledge Sharing:
- Maintain comprehensive documentation of Splunk configurations, changes, and best practices.
- Mentor and train junior administrators, sharing expertise and best practices and providing advanced training.
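For readers unfamiliar with the search-and-automation work described under items 3 and 5, here is a rough sketch using Splunk's Python SDK (splunklib). The host, credentials, index, and search string are placeholders, and details vary by SDK version.

```python
# Rough sketch: run a blocking search via the Splunk Python SDK and print
# the results. Host, credentials, index and search string are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com",  # hypothetical search head
    port=8089,
    username="admin",
    password="***",
)

# Count errors per sourcetype over the last hour
job = service.jobs.create(
    "search index=main error earliest=-1h | stats count by sourcetype",
    exec_mode="blocking",       # wait until the search completes
)

for result in results.JSONResultsReader(job.results(output_mode="json")):
    if isinstance(result, dict):  # skip diagnostic messages
        print(result)
```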
Posted 2 weeks ago
7.0 - 9.0 years
9 - 11 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Description: Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries, serving more than 9 million customers each day. At Circle K, we are building a best-in-class global data engineering practice to support intelligent business decision-making and drive value across our retail ecosystem. As we scale our engineering capabilities, we're seeking a Lead Data Engineer to serve as both a technical leader and people coach for our India-based Data Enablement pod. This role will oversee the design, delivery, and maintenance of critical cross-functional datasets and reusable data assets while also managing a group of talented engineers in India. This position plays a dual role: contributing hands-on to engineering execution while mentoring and developing engineers in their technical careers.

About the role: The ideal candidate combines deep technical acumen, stakeholder awareness, and a people-first leadership mindset. You'll collaborate with global tech leads, managers, platform teams, and business analysts to build trusted, performant data pipelines that serve use cases beyond traditional data domains.

Responsibilities
- Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
- Lead the technical execution of non-domain-specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
- Architect data models and reusable layers consumed by multiple downstream pods
- Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks (see the sketch after this listing)
- Mentor and coach the team
- Partner with product and platform leaders to ensure engineering consistency and delivery excellence
- Act as an L3 escalation point for operational data issues impacting foundational pipelines
- Own engineering best practices, sprint planning, and quality across the Enablement pod
- Contribute to platform discussions and architectural decisions across regions

Job Requirements

Education: Bachelor's or master's degree in computer science, engineering, or a related field

Relevant Experience: 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark; experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse

Knowledge and Preferred Skills
- Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
- Solid grasp of data governance, metadata tagging, and role-based access control
- Proven ability to mentor and grow engineers in a matrixed or global environment
- Strong verbal and written communication skills, with the ability to operate cross-functionally
- Certifications in Azure, Databricks, or Snowflake are a plus
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
- Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM) and data quality tools
- Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
- Hands-on experience with databases (Azure SQL DB, Snowflake, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
- ADF, Databricks and Azure certification is a plus

Technologies we use: Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI

#LI-DS1
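As one small illustration of the "auditability frameworks" responsibility above, enablement pipelines commonly stamp every ingested row with lineage columns. A hedged PySpark sketch, with hypothetical paths and names:

```python
# Sketch: add audit/lineage columns to an ingested dataset so downstream
# pods can trace rows to their source. Paths and names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("enablement-audit-sketch").getOrCreate()

raw = spark.read.json("/landing/tlog/2024-06-01/")  # hypothetical landing zone

audited = (
    raw
    .withColumn("_ingested_at", F.current_timestamp())       # load time
    .withColumn("_source_file", F.input_file_name())         # file lineage
    .withColumn("_pipeline", F.lit("tlog_standardization"))  # producing job
)

audited.write.mode("append").partitionBy("_pipeline").parquet("/curated/tlog/")
```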
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
About Atos: Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. €10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is an SE (Societas Europaea) listed on Euronext Paris.

Data Streaming Engineer - Required Skills and Competencies:
- Experience: 3+ years
- Expertise in the Python language is a MUST
- SQL (should be able to write complex SQL queries) is a MUST
- Hands-on experience in Apache Flink Streaming or Spark Streaming is a MUST
- Hands-on experience with Apache Kafka is a MUST (a streaming sketch follows this listing)
- Data lake development experience
- Orchestration (Apache Airflow is preferred)
- Spark and Hive: optimization of Spark/PySpark and Hive apps
- Trino / AWS Athena (good to have)
- Snowflake (good to have)
- Data quality (good to have)
- File storage (S3 is good to have)

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance - integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture

Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all.
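To make the Kafka-plus-streaming requirement concrete, here is a minimal Spark Structured Streaming sketch in Python that consumes a hypothetical topic. The broker address, topic, and sink paths are assumptions, and the job needs the spark-sql-kafka connector package on its classpath.

```python
# Minimal sketch: consume a Kafka topic with Spark Structured Streaming and
# land the raw events in a data lake path. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "orders")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://datalake/raw/orders/")        # hypothetical sink
    .option("checkpointLocation", "s3a://datalake/_chk/orders/")
    .start()
)
query.awaitTermination()
```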
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
About Atos: Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. €10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is an SE (Societas Europaea) listed on Euronext Paris. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large, to live, work and develop sustainably in a safe and secure information space.

Data Streaming Engineer - Required Skills and Competencies:
- Experience: 3+ years
- Expertise in the Python language is a MUST
- SQL (should be able to write complex SQL queries) is a MUST
- Hands-on experience in Apache Flink Streaming or Spark Streaming is a MUST
- Hands-on experience with Apache Kafka is a MUST
- Data lake development experience
- Orchestration (Apache Airflow is preferred)
- Spark and Hive: optimization of Spark/PySpark and Hive apps
- Trino / AWS Athena (good to have)
- Snowflake (good to have)
- Data quality (good to have)
- File storage (S3 is good to have)

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance - integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture

Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environmental, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
Posted 2 weeks ago
1.0 - 6.0 years
3 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Description: Circle K (part of the Alimentation Couche-Tard group) is a global leader in the convenience store and fuel space, with a footprint across 31 countries and territories. At the Circle K Business Centre in India, we are #OneTeam using the power of data analytics to drive our decisions and strengthen Circle K's global capabilities. We make it easy for our customers all over the world - we partner with the business to empower the right decisions and deliver effectively, while rapidly unlocking value for our customers across the enterprise. Our team in India is an integral part of our talent ecosystem that helps advance us on our journey to becoming a data-centric company. The future of data analytics at Circle K is bright - and we're only just getting started.

About the role: The India Data & Analytics Global Capability Centre is an integral part of ACT's Global Data & Analytics Team, and the Associate Data Analyst will be a key player on this team that will help grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units, and will be responsible for deploying analytics algorithms and tools on the chosen tech stack for efficient and effective delivery. Responsibilities include delivering insights and targeted action plans, addressing specific areas of risk and opportunity, working cross-functionally with business and technology teams, and leveraging the support of global teams for analysis and data.

Roles & Responsibilities

Analytics (Data & Insights)
- Clean and organize large datasets for analysis and visualization using statistical methods; verify and ensure accuracy, integrity, and consistency of data (see the sketch after this listing)
- Identify trends and patterns in data and use this information to drive business decisions
- Create requirement artefacts, e.g. functional specification documents, use cases, requirement traceability matrices, business test cases, process mapping documents, and user stories for analytics projects
- Build highly impactful and intuitive dashboards that bring the underlying data to life through insights
- Generate ad-hoc analysis for leadership to deliver relevant, action-oriented, and innovative recommendations

Operational Excellence
- Improve data quality by using and improving tools to automatically detect issues
- Develop analytical solutions or dashboards using user-centric design techniques in alignment with ACT's protocol
- Study industry/organization benchmarks and design/develop analytical solutions to monitor or improve business performance across retail, marketing, and other business areas

Stakeholder Management
- Work with high-performing functional consultants, data engineers, and cross-functional teams to lead/support the complete lifecycle of visual analytical applications, from development of mock-ups and storyboards to complete production-ready applications
- Provide regular updates to stakeholders to simplify and clarify complex concepts, and communicate the output of work to the business
- Create compelling documentation or artefacts that connect the business to the solutions
- Coordinate internally to share key learnings with other teams and lead to accelerated business performance

Behavioral Skills: Delivery excellence, business disposition, social intelligence, innovation and agility

Knowledge
- Functional analytics (retail analytics, supply chain analytics, marketing analytics, customer analytics, etc.)
- Working understanding of statistical modelling using analytical tools (Python, PySpark, R, etc.)
- Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems
- Business intelligence & reporting (Power BI, Tableau, Alteryx, etc.)
- Cloud computing services in Azure/AWS/GCP for analytics

Education: Bachelor's degree in computer science, information management or related technical fields

Experience
- 1+ years for Associate Data Analyst
- Relevant working experience in a quantitative/applied analytics role
- Experience with programming and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Spark / SQL / Python

#LI-DS1
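The "clean and organize large datasets" responsibility above typically starts with steps like these. A small pandas sketch; the file name, columns, and validity rules are invented for illustration.

```python
# Small sketch of routine data cleaning with pandas. File name, columns and
# validity rules are invented.
import pandas as pd

df = pd.read_csv("transactions.csv")  # hypothetical extract

# Normalize types and remove exact duplicates
df["store_id"] = df["store_id"].astype("string").str.strip()
df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce")
df = df.drop_duplicates()

# Flag rows that violate simple consistency rules
bad_amount = df["amount"] <= 0
bad_date = df["txn_date"].isna()
print(f"{int((bad_amount | bad_date).sum())} of {len(df)} rows failed validation")

clean = df[~(bad_amount | bad_date)]
clean.to_parquet("transactions_clean.parquet", index=False)
```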
Posted 2 weeks ago
3.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering
Service Line: Infosys Quality Engineering

Responsibilities: A day in the life of an Infoscion - as part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Preferred Skills: Technology - ETL & Data Quality - ETL - Others
Posted 2 weeks ago
3.0 - 12.0 years
5 - 14 Lacs
Gurugram
Work from Office
Responsibilities
- Serve as a core member of an agile team that drives user story analysis and elaboration, and designs and develops responsive web applications using the best engineering practices
- Perform hands-on software development, typically spending most of the time actually writing code and unit tests, doing proofs of concept, conducting code reviews, and testing in ongoing sprints
- Perform ongoing refactoring of code and deliver continuous improvement
- Develop a deep understanding of integrations with other systems and platforms within the supported domains
- Manage your own time, and work well both independently and as part of a team
- Bring a culture of innovation, ideas, and continuous improvement
- Challenge the status quo, demonstrate risk taking, and implement creative ideas
- Work closely with product managers, back-end and other front-end engineers to implement versatile solutions to tricky web development problems
- Embrace emerging standards while promoting best practices and consistent framework usage

Qualifications:
- BS or MS degree in computer science, computer engineering, or other technical discipline
- Total experience: 3-12 years
- 2+ years of experience working in Java and able to demonstrate good Java knowledge; Java 7 and Java 8 preferred
- Able to demonstrate good web fundamentals and HTTP protocol knowledge
- Good attitude, communication, and willingness to learn and collaborate
- 2+ years of experience developing Java applications in an enterprise setting
- 2+ years of experience developing Java applications in frameworks such as Spring and Spring Boot; Dropwizard is a plus
- 2+ years of experience with Test Driven Development (TDD) / Behavior Driven Development (BDD) practices, unit testing, functional testing, system integration testing, regression testing, GUI testing, web service testing, and browser compatibility testing, including frameworks such as Selenium, WebdriverIO, Cucumber, JUnit, Mockito
- Experience with a continuous integration and continuous delivery environment
- 2+ years working in an Agile or SAFe development environment is a plus

Data Engineer: Responsibilities
- Design, develop and maintain data pipelines
- Serve as a core member of an agile team that drives user story analysis and elaboration, and designs and develops responsive web applications using the best engineering practices
- Work closely with data scientists, analysts and other partners to ensure the flawless flow of data
- Build and optimize reports for analytical and business purposes
- Monitor and solve data pipeline issues to ensure smooth operation
- Implement data quality checks and validation processes to ensure the accuracy, completeness and consistency of data
- Implement data governance policies, access controls, and security measures to protect critical data and ensure compliance
- Develop a deep understanding of integrations with other systems and platforms within the supported domains
- Bring a culture of innovation, ideas, and continuous improvement
- Challenge the status quo, demonstrate risk taking, and implement creative ideas
- Lead your own time, and work well both independently and as part of a team
- Adopt emerging standards while promoting best practices and consistent framework usage
- Work with Product Owners to define requirements for new features and plan increments of work

Qualifications
- BS or MS degree in computer science, computer engineering, or another technical subject area, or equivalent
- 3+ years of work experience (3-12 years overall)
- At least 5 years of hands-on experience with SQL, including schema design, query optimization and performance tuning (a sketch follows this listing)
- Experience with distributed computing frameworks like Hadoop, Hive, and Spark for processing large-scale data sets
- Proficiency in a programming language such as Python or PySpark for building data pipelines and automation scripts
- Understanding of cloud computing and exposure to a cloud platform: GCP, AWS or Azure
- Knowledge of CI/CD, Git commands and the deployment process
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize data processing workflows
- Excellent communication and collaboration skills

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for your and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally.
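Since the data engineer qualifications above stress SQL schema design and query tuning, here is a self-contained sketch using Python's built-in sqlite3 module to show how an index changes a query plan. The table and data are invented; production engines (Hive, Spark, or the cloud warehouses named above) expose analogous EXPLAIN facilities.

```python
# Self-contained sketch: demonstrate index-aware query planning with the
# standard-library sqlite3 module. Table and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount      REAL NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 1000, float(i)) for i in range(100_000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = 42"

# Without an index the engine scans the whole table...
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# ...with an index on the filter column it can seek directly.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()
```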
Posted 2 weeks ago
5.0 - 12.0 years
7 - 14 Lacs
Hyderabad
Work from Office
The SAP HANA DB Modeler/Developer is responsible for designing, developing, and maintaining data models within the SAP HANA database, utilizing advanced modeling techniques to optimize data access and analysis for reporting and applications, often collaborating with business analysts to translate requirements into efficient database structures while ensuring data integrity and performance within the HANA platform.

Must-have skills include: HANA views (analytical, calculation, etc.), SAP HANA XS JavaScript and XS OData services, advanced DB modelling for HANA, and SAP HANA Data Services ETL-based replication, amongst others. Minimum 2-3 end-to-end implementations.

Key responsibilities may include:

Data Modeling:
- Designing and creating complex data models in SAP HANA Studio using analytical views, attribute views, calculation views, and hierarchies to represent business data effectively
- Implementing data transformations, calculations, and data cleansing logic within the HANA model
- Optimizing data structures for fast query performance and efficient data access

Development:
- Writing SQL scripts and stored procedures to manipulate and retrieve data from the HANA database (a sketch follows this listing)
- Developing custom HANA functions (CE functions) for advanced data processing and calculations
- Implementing data loading and ETL processes to populate the HANA database

Performance Tuning:
- Analyzing query performance and identifying bottlenecks to optimize data access and query execution
- Implementing indexing strategies and data partitioning for improved query performance

Collaboration:
- Working closely with business analysts to understand data requirements and translate them into technical data models
- Collaborating with application developers to integrate HANA data models into applications

Security and Governance:
- Implementing data security measures within the HANA database, defining user roles and permissions
- Maintaining data quality and consistency by defining data validation rules

Required Skills:

Technical skills:
- Strong understanding of relational database concepts and data modeling principles
- Expertise in SAP HANA modeling tools and features (HANA Studio, calculation views, analytical views)
- Proficiency in SQL and SQL optimization techniques
- Knowledge of data warehousing concepts and best practices

Soft skills:
- Excellent analytical and problem-solving abilities
- Strong communication skills to collaborate with business users and technical teams
- Ability to work independently and as part of a team
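As a rough illustration of the stored-procedure work described above, here is a SQLScript procedure created and called from Python through hdbcli, SAP's Python driver for HANA. The connection details, schema, and procedure body are hypothetical, and syntax support (e.g. CREATE OR REPLACE) varies by HANA version.

```python
# Rough sketch: create and call a simple SQLScript procedure via hdbcli.
# All identifiers and connection details are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical host
    port=30015,
    user="MODELER",
    password="***",
)
cur = conn.cursor()

cur.execute("""
CREATE OR REPLACE PROCEDURE sales.top_customers (IN min_total DECIMAL(15,2))
LANGUAGE SQLSCRIPT READS SQL DATA AS
BEGIN
    -- Unassigned SELECTs in SQLScript are returned to the caller
    SELECT customer_id, SUM(amount) AS total
    FROM sales.orders
    GROUP BY customer_id
    HAVING SUM(amount) >= :min_total
    ORDER BY total DESC;
END
""")

cur.execute("CALL sales.top_customers(10000.00)")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```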
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Mumbai
Work from Office
- Operational excellence with seamless, cost-effective and efficient lean operations aligned to customer needs and organization strategy, by synergizing services of the partners with quality
- Operational excellence with strong interlock with Service Providers (SPs) such as OPT, Westcon, Octopian, OEMs and internal stakeholders
- Ensure KPIs and SLAs are met, SC spend stays within budget, and customer deliverables are maintained by service providers
- Translate customer needs into operational requirements for effective execution by SPs and/or OEMs
- Govern the quality of services delivered by service providers against standards, with a robust mechanism
- Drive transformation and CI projects for cost reduction, optimization and state-of-the-art lean operations
- Evolution and change management: transition of transformation projects into the operational environment
- SME for organization-wide or functional transformation projects, including digital and data
- Align vendors' processes and execution policies to the OBS organizational strategies
- Escalation and exception management with OPT and internal stakeholders
- Identification and execution of repair and reuse opportunities, contributing to the circular economy and green action
- Anticipate the changes and skills-enhancement needs for the team and drive such programs
- Stakeholder collaboration and alignment; team leadership, development and evolution
- Ensure supply chain data quality and a business-analytics-driven culture

Key focus areas: customer satisfaction; spend control, cost reduction and avoidance; lean operations; transformation and continuous improvement initiatives; SC risk management; lean methodologies; governance through data analytics; customer orientation; global delivery and operations.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Mumbai
Work from Office
The SME - Global Procurement Operations is responsible for overseeing outsourced procurement activities, ensuring operational excellence, managing User Acceptance Testing (UAT), handling critical escalations, and supporting process improvements. The role acts as a key point of contact for outsourced vendors and internal stakeholders, ensuring seamless execution of procurement processes.

- Operational excellence with seamless, cost-effective and efficient lean operations aligned to customer needs and organization strategy, by synergizing services of the partners with quality
- Operational excellence with strong interlock with Service Providers (SPs) such as OPT, Westcon, Octopian, OEMs and internal stakeholders
- Ensure KPIs and SLAs are met, SC spend stays within budget, and customer deliverables are maintained by service providers
- Translate customer needs into operational requirements for effective execution by SPs and/or OEMs
- Govern the quality of services delivered by service providers against standards, with a robust mechanism
- Drive transformation and CI projects for cost reduction, optimization and state-of-the-art lean operations
- Evolution and change management: transition of transformation projects into the operational environment
- SME for organization-wide or functional transformation projects, including digital and data
- Align vendors' processes and execution policies to the OBS organizational strategies
- Escalation and exception management with OPT and internal stakeholders
- Identification and execution of repair and reuse opportunities, contributing to the circular economy and green action
- Anticipate the changes and skills-enhancement needs for the team and drive such programs
- Stakeholder collaboration and alignment; team leadership, development and evolution
- Ensure supply chain data quality and a business-analytics-driven culture

Key focus areas: customer satisfaction; spend control, cost reduction and avoidance; lean operations; transformation and continuous improvement initiatives; SC risk management; lean methodologies; governance through data analytics; customer orientation; global delivery and operations.
Posted 2 weeks ago
6.0 - 10.0 years
6 - 10 Lacs
Greater Noida
Work from Office
SQL Developer:
- Design and implement relational database structures optimized for performance and scalability.
- Develop and maintain complex SQL queries, stored procedures, triggers, and functions.
- Optimize database performance through indexing, query tuning, and regular maintenance.
- Ensure data integrity, consistency, and security across multiple environments.
- Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools.
- Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation.
- Monitor and troubleshoot database performance issues.
- Automate routine database tasks using scripts and tools.
- Document database architecture, processes, and procedures for future reference.
- Stay updated with the latest SQL best practices and database technologies.
Core skills:
- Data Retrieval: query large and complex databases to extract relevant data for analysis or reporting.
- Data Transformation: clean, join, and reshape data using SQL to prepare it for downstream processes like analytics or machine learning.
- Performance Optimization: write queries that run efficiently, especially when dealing with big data or real-time systems.
- Understanding of Database Schemas: know how tables relate and how to navigate normalized or denormalized structures.
QE (Quality Engineering):
- Design, develop, and execute test plans and test cases for data pipelines, ETL processes, and data platforms.
- Validate data quality, integrity, and consistency across various data sources and destinations.
- Automate data validation and testing using tools such as PyTest, Great Expectations, or custom Python/SQL scripts.
- Collaborate with data engineers, analysts, and product managers to understand data requirements and ensure test coverage.
- Monitor data pipelines and proactively identify data quality issues or anomalies.
- Contribute to the development of data quality frameworks and best practices.
- Participate in code reviews and provide feedback on data quality and testability.
Requirements: strong SQL skills and experience with large-scale data sets; proficiency in Python or another scripting language for test automation; experience with data testing tools; familiarity with cloud platforms and data warehousing solutions (a minimal validation sketch follows this listing).
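To make the QE responsibilities above concrete, here is a minimal, illustrative sketch of automated data validation in the PyTest style the posting mentions. It uses an in-memory SQLite database so it runs self-contained; the table and column names are hypothetical stand-ins for a real ETL source and target.

```python
# Illustrative only: a minimal pytest-style data validation, using an in-memory
# SQLite database so the sketch is self-contained. In practice the same checks
# would target the production RDBMS via its driver.
import sqlite3
import pytest

@pytest.fixture
def conn():
    # Hypothetical source and target tables standing in for an ETL output.
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
    """)
    yield c
    c.close()

def test_row_counts_match(conn):
    # Reconciliation check: the target should contain every loaded source row.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt

def test_no_null_keys(conn):
    # Integrity check: the business key must never be NULL after the load.
    nulls = conn.execute(
        "SELECT COUNT(*) FROM tgt_orders WHERE id IS NULL"
    ).fetchone()[0]
    assert nulls == 0
```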
Posted 2 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Bengaluru
Work from Office
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Data Engineer designs data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost-effective.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
- Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
- Collaborate with data scientists, data analysts, data architects, and other stakeholders to understand data requirements and deliver high-quality data solutions.
- Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
- Ensure data quality and integrity through data validation techniques and frameworks.
- Develop and maintain documentation for data processes, configurations, and best practices.
- Monitor and troubleshoot data pipeline issues to ensure timely resolution.
- Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
- Manage the CI/CD process for deploying and maintaining data solutions.
A minimal sketch of such a pipeline step appears below.
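As an illustration of the kind of pipeline step described above, the following is a minimal sketch of a PySpark transformation that an Azure Data Factory pipeline might trigger on Azure Databricks. The storage account, container paths, and column names are hypothetical, and the Delta output assumes a Databricks runtime where Delta Lake is available by default.

```python
# Illustrative only: curate raw CSV landed in ADLS Gen2 and publish Delta.
# Paths and columns are hypothetical; on Databricks, `spark` already exists,
# but we create it here so the sketch is self-contained.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_curation").getOrCreate()

raw = (
    spark.read.format("csv")
    .option("header", "true")
    .load("abfss://raw@examplelake.dfs.core.windows.net/sales/")  # ADLS Gen2
)

# Basic validation and cleanup before publishing to the curated zone.
curated = (
    raw.filter(F.col("order_id").isNotNull())          # drop rows missing the key
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Delta format gives ACID writes and easy downstream consumption.
(curated.write.format("delta")
        .mode("overwrite")
        .save("abfss://curated@examplelake.dfs.core.windows.net/sales/"))
```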
Posted 2 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore/Hyderabad/Pune. Experience level: 7+ years.
About the Role: We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.
Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance (see the orchestration sketch below).
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 8+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have exposure to working in Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
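Since the posting calls out Airflow exposure alongside Snowflake loading, here is a minimal, hypothetical sketch of an Airflow DAG that runs a single Snowflake COPY step. The connection id, stage, and table names are assumptions; it presumes Airflow 2.4+ and the apache-airflow-providers-snowflake package.

```python
# Illustrative only: a minimal Airflow DAG that runs one Snowflake load step.
# Connection id, warehouse objects, and SQL are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="load_sales_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",     # one run per day (Airflow 2.4+ keyword)
    catchup=False,
) as dag:
    load_stage = SnowflakeOperator(
        task_id="copy_into_raw_sales",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection
        sql="""
            COPY INTO raw.sales
            FROM @raw.sales_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
        """,
    )
```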
Posted 2 weeks ago
2.0 - 5.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Cigna TTK Health Insurance Company Limited is looking for a Data Measurement & Reporting Analyst to join our dynamic team and embark on a rewarding career journey.
- Data Collection: collect and extract data from various sources, such as databases, spreadsheets, and software applications.
- Data Analysis: analyze data to identify trends, patterns, and anomalies, using statistical and data analysis techniques.
- Report Development: create, design, and develop reports and dashboards using reporting and data visualization tools, such as Excel, Tableau, Power BI, or custom-built solutions.
- Data Cleansing: ensure data accuracy and consistency by cleaning and validating data, addressing missing or incomplete information.
- Data Interpretation: translate data findings into actionable insights and recommendations for management or stakeholders.
- KPI Monitoring: track key performance indicators (KPIs) and metrics, and report on performance against goals and targets.
- Trend Analysis: monitor and report on long-term trends and make predictions based on historical data.
- Ad Hoc Reporting: generate ad hoc reports and analyses in response to specific business questions or requests.
- Data Automation: develop and implement automated reporting processes to streamline and improve reporting efficiency (a minimal sketch follows this listing).
- Data Visualization: create visually appealing charts, graphs, and presentations to make data more understandable and accessible to non-technical stakeholders.
- Data Governance: ensure data quality and compliance with data governance and security policies.
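As a small illustration of the automated KPI reporting this role describes, here is a hedged pandas sketch that computes monthly metrics from a toy extract. The data, metric names, and the 90% target are all hypothetical.

```python
# Illustrative only: automated monthly KPI computation with pandas. The frame
# stands in for an extract from a database or spreadsheet; values are made up.
import pandas as pd

claims = pd.DataFrame({
    "month":          ["2024-01", "2024-01", "2024-02", "2024-02"],
    "settled":        [True, False, True, True],
    "days_to_settle": [12, None, 9, 15],
})

# KPIs: settlement rate and average turnaround per month.
kpis = claims.groupby("month").agg(
    settlement_rate=("settled", "mean"),
    avg_days=("days_to_settle", "mean"),
)

# Flag months missing a (hypothetical) 90% settlement-rate target.
kpis["meets_target"] = kpis["settlement_rate"] >= 0.90
print(kpis)
```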
Posted 2 weeks ago
9.0 - 10.0 years
11 - 12 Lacs
Bengaluru
Work from Office
Job Title: Senior Data Engineer (Hybrid)
Job Description: For more than 80 years, Kaplan has been a trailblazer in education and professional advancement. We are a global company at the intersection of education and technology, focused on collaboration, innovation, and creativity to deliver a best-in-class educational experience and make Kaplan a great place to work. Our offices in India opened in Bengaluru in 2018. Since then, our team has fueled growth and innovation across the organization, impacting students worldwide. We are eager to grow and expand with skilled professionals like you who use their talent to build solutions, enable effective learning, and improve students' lives. The future of education is here, and we are eager to work alongside those who want to make a positive impact and inspire change in the world around them.
The Senior Data Engineer at Kaplan North America (KNA) within the Analytics division will work with world-class psychometricians, data scientists, and business analysts to forever change the face of education. This role is a hands-on technical expert who will own the design and implementation of an Enterprise Data Warehouse powered by AWS RA3 as a key feature of our Lake House architecture. The perfect candidate is an expert in data warehousing technical components (e.g. data modeling, ETL, reporting). You should have a deep understanding of the architecture for enterprise-level data warehouse solutions using multiple platforms (RDBMS, columnar, cloud). You should be able to work with business customers in a fast-paced environment, understanding the business requirements and implementing data and reporting solutions. Above all, you should be passionate about working with big data and love bringing datasets together to answer business questions and drive change.
Primary Responsibilities:
- Hands-on technical leader who continually raises the bar for the data engineering function.
- Leads the design, implementation, and successful delivery of large-scale, critical, or difficult data solutions. These efforts can be either a new data solution or a refactor of an existing solution, and include writing a significant portion of the critical-path code.
- Sets an example through their code, designs, and decisions. Provides insightful code reviews and takes ownership of the outcome. (You ship it, you own it.)
- Proactively works to improve data quality and consistency by considering the architecture, not just the code, for their solutions.
- Makes insightful contributions to team priorities and the overall data approach, influencing the team's technical and business strategy.
- Takes the lead in identifying and solving ambiguous problems, architecture deficiencies, or areas where their team bottlenecks the innovations of other teams.
- Makes data solutions simpler. Leads design reviews for their team and actively participates in design reviews of related development projects.
- Communicates ideas effectively to achieve the right outcome for their team and customer. Harmonizes discordant views and leads the resolution of contentious issues.
- Demonstrates technical influence over 1-2 teams, either via a collaborative development effort or by increasing their productivity and effectiveness by driving data engineering best practices (e.g. code quality, data quality, logical and physical data modeling, operational excellence, security).
- Actively participates in the hiring process and mentors others, improving their skills, their knowledge, and their ability to get things done.
A minimal sketch of the kind of warehouse bulk-load step this role owns appears below.
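To illustrate the warehouse loading central to this role, here is a minimal sketch of a bulk load into Amazon Redshift via COPY from S3, using the redshift_connector driver. The cluster endpoint, table, bucket, and IAM role are hypothetical placeholders, not Kaplan resources.

```python
# Illustrative only: bulk-loading a Redshift (RA3) table from S3 with COPY,
# the standard ingest path in a Lake House design. All identifiers below are
# hypothetical; requires `pip install redshift-connector`.
import redshift_connector

conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    database="analytics",
    user="etl_user",
    password="...",  # in practice, fetched from a secrets manager
)

cur = conn.cursor()
# COPY runs in parallel across slices, far faster than row-by-row INSERTs.
cur.execute("""
    COPY staging.enrollments
    FROM 's3://example-bucket/enrollments/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
    FORMAT AS PARQUET;
""")
conn.commit()
cur.close()
conn.close()
```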
Hybrid Schedule: 3 days remote / 2 days in office. 30-day notification period preferred.
Education & Experience:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience as a Data Engineer or in a similar role.
- In-depth knowledge of the AWS stack (Redshift, Lambda, Glue, SNS, pySpark, Airflow).
- Expertise in data modeling, ETL development, and data warehousing.
- 3+ years of experience with Python, pySpark, or Java/Scala.
- Effective troubleshooting and problem-solving skills.
- Strong customer focus, ownership, urgency, and drive.
- Excellent verbal and written communication skills and the ability to work well in a team.
Preferred Qualifications:
- 3+ years of experience with AWS services including S3, RA3, and AWS CloudFormation.
- Ability to distill ambiguous customer requirements into a technical design.
- Experience providing technical leadership and educating other engineers on data engineering best practices.
- Familiarity with Tableau and SSRS.
Location: Bangalore, KA, India. Employee Type: Employee. Job Functional Area: Systems Administration/Engineering. Business Unit: 00072 Kaplan Test Prep.
At Kaplan, we recognize the importance of attracting and retaining top talent to drive our success in a competitive market. Our salary structure and compensation philosophy reflect the value we place on the experience, education, and skills that our employees bring to the organization, taking into consideration labor market trends and total rewards. All positions with Kaplan are paid at least $15 per hour or $31,200 per year for full-time positions. Additionally, certain positions are bonus- or commission-eligible. And we have a comprehensive benefits package; learn more about our benefits here.
Diversity & Inclusion Statement: Kaplan is committed to cultivating an inclusive workplace that values diversity, promotes equity, and integrates inclusivity into all aspects of our operations. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment regardless of age, race, creed, color, national origin, ancestry, marital status, sexual orientation, gender identity or expression, disability, veteran status, nationality, or sex. We believe that diversity strengthens our organization, fuels innovation, and improves our ability to serve our students, customers, and communities. Learn more about our culture here.
Kaplan considers qualified applicants for employment even if applicants have an arrest or conviction in their background check records. Kaplan complies with related background check regulations, including but not limited to, the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. There are various positions where certain convictions may disqualify applicants, such as those positions requiring interaction with minors, financial records, or other sensitive and/or confidential information. Kaplan is a drug-free workplace and complies with applicable laws.
Posted 2 weeks ago
0.0 - 5.0 years
25 - 30 Lacs
Pune
Work from Office
About KPI Partners: KPI Partners is a leader in data and analytics consulting, helping enterprises drive decision-making through powerful data solutions. As part of our growing data engineering team, we are looking for professionals who are passionate about building scalable data pipelines and transforming raw data into actionable insights.
Key Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines using PySpark/Python, Azure Data Factory, and SQL.
- Integrate data from various sources including APIs, databases, cloud platforms, and third-party applications (a minimal extraction sketch follows this listing).
- Work closely with business analysts, data scientists, and BI teams to understand data requirements and implement solutions.
- Implement best practices in data modeling, ETL/ELT development, and data quality validation.
- Develop and maintain documentation of data flows, transformations, and processes.
- Optimize performance of data pipelines and ensure efficient processing of large data volumes.
- Collaborate in an Agile environment, actively participating in sprint planning and delivery.
Required Skills & Experience:
- Strong programming experience in Python for data processing and automation.
- Hands-on experience with Azure Data Factory (ADF) for orchestration and transformation.
- Proficiency in writing complex SQL queries and working with relational databases (e.g., SQL Server, PostgreSQL).
- Understanding of data warehousing concepts and ETL/ELT design principles.
- Experience working in cloud environments (preferably Microsoft Azure).
- Familiarity with source control systems like Git.
- Good problem-solving and debugging skills.
Why Join KPI Partners?
- Work with a team of industry experts on cutting-edge analytics projects.
- Competitive salary and benefits.
- Opportunity to grow in a dynamic and collaborative environment.
- Continuous learning through internal and external training programs.
Apply now if you're ready to be part of a data-driven transformation journey with KPI Partners.
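As a brief illustration of the pipeline work described above, here is a hedged sketch of an incremental extract step in Python with SQLAlchemy and pandas. The connection string, table, and watermark column are assumptions for illustration only.

```python
# Illustrative only: an incremental extract of the kind these pipelines chain
# together. DSN, table, and watermark column are hypothetical.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "postgresql+psycopg2://etl_user:...@example-host:5432/sales"  # assumed DSN
)

def extract_since(watermark: str) -> pd.DataFrame:
    # Parameterized query: pull only rows changed since the last run,
    # which keeps large-volume loads efficient.
    query = text("SELECT * FROM orders WHERE updated_at > :wm")
    return pd.read_sql(query, engine, params={"wm": watermark})

changed = extract_since("2024-06-01 00:00:00")
print(f"{len(changed)} rows to load")
```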
Posted 2 weeks ago
2.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Design and develop end-to-end data pipelines using Azure Data Factory and Azure Databricks.
- Integrate data from various sources including SQL Server, Azure Data Lake, Blob Storage, and third-party APIs.
- Build and optimize ETL/ELT workflows using PySpark/Scala in Azure Databricks.
- Develop and maintain data models and datasets for analytics and reporting.
- Work with data analysts and business stakeholders to understand requirements and translate them into technical solutions.
- Monitor pipeline performance and implement logging, alerting, and recovery strategies (see the sketch below).
- Ensure data quality, security, and governance across all workflows.
- Participate in code reviews, technical documentation, and agile ceremonies.
- Collaborate with cloud architects and DevOps teams to automate deployments (CI/CD).
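To make the monitoring and data quality responsibilities concrete, here is a minimal, illustrative PySpark quality gate with logging, of the kind an orchestrator such as ADF could alert on. The table name is hypothetical, and reading it assumes a Spark metastore (on Databricks, `spark` is predefined).

```python
# Illustrative only: a data quality gate with logging before publishing.
# The table is a hypothetical Delta output from an earlier pipeline step.
import logging

from pyspark.sql import SparkSession, functions as F

log = logging.getLogger("pipeline.quality")
logging.basicConfig(level=logging.INFO)

spark = SparkSession.builder.appName("quality_gate").getOrCreate()
df = spark.table("curated.orders")  # assumed table in the metastore

# Rule: every order must carry a customer id and a non-negative amount.
bad = df.filter(F.col("customer_id").isNull() | (F.col("amount") < 0)).count()
total = df.count()

if bad > 0:
    # Fail fast so the orchestrator (e.g. ADF) can alert and retry.
    log.error("quality gate failed: %d of %d rows invalid", bad, total)
    raise ValueError(f"{bad} invalid rows in curated.orders")
log.info("quality gate passed: %d rows validated", total)
```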
Posted 2 weeks ago