4.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will be responsible for designing efficient data architecture diagrams, developing and maintaining data models, and ensuring data integrity, quality, and security. You will also work on data processing, data integration, and building data pipelines to support various business needs. Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, and mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities.

We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focusing on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics are required. A strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential. Excellent problem-solving, collaboration, and communication skills are also important for this role.

As part of our team, you will have the opportunity to work on exciting projects across industries such as high-tech, communication, media, healthcare, retail, and telecom. We offer a collaborative environment where you can expand your skills by working with a diverse team of talented individuals.
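The "data integrity, quality, and security" and pipeline-building responsibilities above can be sketched tool-neutrally. This is an illustrative example only: the table, columns, and quality rule are invented, and SQLite stands in for an Azure SQL Database target (Azure SQL would typically use MERGE for the upsert).

```python
import sqlite3

# Illustrative stand-in for a pipeline target; schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, updated TEXT)")

staging = [
    (1, "a@example.com", "2024-01-01"),
    (2, None, "2024-01-02"),                 # fails the data-quality rule below
    (1, "a+new@example.com", "2024-01-03"),  # later version of id 1
]

# Data-quality gate: reject rows with missing required fields before loading.
valid = [row for row in staging if row[1] is not None]

# Idempotent upsert so re-running the pipeline cannot duplicate rows
# (SQLite's INSERT OR REPLACE; Azure SQL would express this with MERGE).
conn.executemany(
    "INSERT OR REPLACE INTO customers (id, email, updated) VALUES (?, ?, ?)",
    valid,
)

rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
print(rows)  # [(1, 'a+new@example.com')] - rejected row absent, id 1 at latest version
```

The idempotency matters because orchestrators like Azure Data Factory retry failed activities; a naive INSERT would double-load on retry.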
GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees. Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
Hi, greetings from Savantis! We are hiring for one of our clients. Please go through the details below and let us know if you are interested in this opportunity.

Job Title: EPM Analyst
Experience: 3-6 years
Job Location: Hybrid/Remote

Job Description:
- EPM software (development/support of a financial data model).
- Helping users get the most out of their EPM and BI applications.
- Maintaining user administration.
- Maintaining and further developing data models and reports.
- Back-end BI work such as connecting different data sources and setting up integrations; modeling data into usable form.
- Front-end BI tools such as Tableau or Power BI: retrieving business requirements and setting up reports and dashboards based on data models.
- Adjusting schedules and reports and making sure these adjustments go live without problems.
- Preparing the system so users can start financial processes like month-end closing or forecasting.
- Helping users by analyzing and solving functional issues, and advising them on performance issues.

Experience with one or more of the following:
- Financial reporting systems, such as Anaplan, Fluence, Vena, OneStream, SAP BPC or Group Reporting, CCH Tagetik, or Oracle EPM.
- Microsoft stack (Excel, SQL, SSAS, Power BI).
- SAP BI/BW and SAC.
- Agile ways of working.
- Frameworks like ITIL, ASL, BISL.

Mandatory Skills:
- EPM software: experience with financial reporting systems like Anaplan, Fluence, Vena, OneStream, SAP BPC, CCH Tagetik, Oracle EPM, or Group Reporting.
- BI & data modeling: Power BI, Tableau, SAP BI/BW, SAC.
- Microsoft stack: Excel, SQL, SSAS, Power BI.
- Financial data modeling: development and support of financial models.
- Data integration: connecting data sources, setting up integrations.
- System maintenance & support: user administration, troubleshooting, and advising on performance issues.
- Agile & IT frameworks: ITIL, ASL, BISL methodologies.

If you are interested in this opportunity, please share your updated CV.
Thanks & regards,
Lakshmi Tulasi Kanna
HR - Recruiter
M: lakshmitulasi.kanna@savantis.com
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Location: Pune/Nagpur. Immediate joiners only.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies.
- Work with large datasets, ensuring their availability, quality, and performance across systems.
- Implement data models and optimize storage and query performance on Snowflake.
- Write complex SQL queries for data extraction, transformation, and reporting purposes.
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features.
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency.
- Perform data migration, integration, and processing tasks across cloud platforms.
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies.

Required Skills:
- Snowflake: expertise in building, optimizing, and managing data warehousing solutions on Snowflake.
- SQL: strong knowledge of SQL for querying and managing relational databases, writing complex queries and stored procedures, and performance tuning.
- Python: proficiency in Python for scripting, automation, and integration within data pipelines.
- Experience developing and managing ETL processes and ensuring data accuracy and performance.
- Hands-on experience with data migration and integration processes across cloud platforms.
- Familiarity with data security and governance best practices.
- Strong problem-solving skills with the ability to troubleshoot and resolve data-related issues.

(ref:hirist.tech)
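The "complex SQL queries" requirement in this posting typically means patterns like keeping only the latest version of each record via a window function. A minimal, illustrative sketch with invented data follows; SQLite stands in for Snowflake here, since `ROW_NUMBER() OVER (...)` is shared ANSI SQL.

```python
import sqlite3

# Deduplicate reloaded records: keep the latest row per order_id.
# Table and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INT, customer TEXT, amount REAL, loaded_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        (1, "acme", 100.0, "2024-01-01"),
        (1, "acme", 120.0, "2024-02-01"),  # corrected reload of order 1
        (2, "globex", 75.0, "2024-01-15"),
    ],
)

# ROW_NUMBER() partitions by key and ranks by recency; rn = 1 is the latest.
latest = conn.execute(
    """
    SELECT order_id, amount FROM (
        SELECT order_id, amount,
               ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY loaded_at DESC) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY order_id
    """
).fetchall()
print(latest)  # [(1, 120.0), (2, 75.0)]
```

On Snowflake the same query would usually feed a transformed table or view; the window-function pattern is identical.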
Posted 1 month ago
10.0 - 17.0 years
0 Lacs
Hyderabad, Telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI/ML-driven SaaS solution product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust data access layer that provides consistent data access over the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders.
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. The role also calls for staying updated on emerging trends in data technology, particularly in AI/ML applications for finance.

Industry: IT Services and IT Consulting
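The "data access layer over heterogeneous storage" idea this posting describes can be illustrated with a small, hypothetical sketch: one interface in front of multiple stores, so callers never depend on which backend holds a dataset. All class and method names here are invented, and in-memory stores stand in for the AWS backends the posting mentions.

```python
from abc import ABC, abstractmethod

class DataStore(ABC):
    """One interface for every backend in the heterogeneous storage layer."""
    @abstractmethod
    def get(self, key):
        """Return the record for `key`, or None if absent."""

class InMemoryStore(DataStore):
    """Stand-in for a concrete backend adapter (e.g. Redshift or DocumentDB)."""
    def __init__(self, rows):
        self._rows = rows
    def get(self, key):
        return self._rows.get(key)

class DataAccessLayer:
    """Routes each logical dataset to its backing store; callers see one API."""
    def __init__(self, routes):
        self._routes = routes
    def fetch(self, dataset, key):
        return self._routes[dataset].get(key)

dal = DataAccessLayer({
    "customers": InMemoryStore({"c1": {"name": "Acme"}}),
    "invoices": InMemoryStore({"i9": {"total": 42}}),
})
print(dal.fetch("customers", "c1"))  # {'name': 'Acme'}
```

The payoff of the pattern is that swapping a backend (say, MongoDB for Redshift) changes one adapter, not every caller.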
Posted 1 month ago
9.0 - 12.0 years
20 - 35 Lacs
Gurugram
Work from Office
Experience: 9+ years in solution and technical architecture with a strong software development background.
Cloud Expertise: Minimum 7 years of experience in cloud application migration, especially AWS or Azure; hands-on experience with hybrid cloud design, AWS serverless, and containers.
Integration & APIs: Strong understanding of application/data integration, including MuleSoft, Apigee, API Connect, and Informatica; familiarity with SOA, REST APIs, microservices, ESB, and BPM.
Technology Stack: Exposure to Node.js, Java frameworks, databases, queues, and event processing.
DevOps & Automation: Working knowledge of CI/CD pipelines, testing, and automation best practices.
Domain Knowledge: BFSI experience and understanding of Open Architecture is preferred.
Nice to have: Experience in AI and Data Analytics, and presenting architecture to senior stakeholders.
Please share profiles of candidates who meet the above requirements and have delivered 2-3 hybrid projects in a technology organization.
Posted 1 month ago
3.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must have: strong experience in SnapLogic.
- Good to have: experience with other ETL tools such as Informatica, Talend, or DataStage.
- Experience designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience analyzing business requirements and developing solutions to meet them.
- Experience troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.
Posted 1 month ago
10.0 - 15.0 years
12 - 18 Lacs
Bengaluru
Work from Office
Mode: Contract

As an Azure Data Architect, you will:
- Lead architectural design and migration strategies, especially from Oracle to Azure Data Lake
- Architect and build end-to-end data pipelines leveraging Databricks, Spark, and Delta Lake
- Design secure, scalable data solutions integrating ADF, SQL Data Warehouse, and on-prem/cloud systems
- Optimize cloud resource usage and pipeline performance
- Set up CI/CD pipelines with Azure DevOps
- Mentor team members and align architecture with business needs

Qualifications:
- 10-15 years in Data Engineering/Architecture roles
- Extensive hands-on experience with Databricks, Azure Data Factory, and Azure SQL Data Warehouse
- Data integration, migration, cluster configuration, and performance tuning
- Azure DevOps and cloud monitoring tools
- Excellent interpersonal and stakeholder management skills
Posted 1 month ago
5.0 - 10.0 years
6 - 10 Lacs
Kolkata
Work from Office
As a consultant, you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions to solve the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include:
- Build, automate and release solutions based on clients' priorities and requirements
- Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table
- Make sure that all integration solutions meet the client specifications and are delivered on time

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 5+ years of experience in the IT industry
- Minimum 4+ years of experience in Oracle Applications and Oracle Cloud in the technical domain
- 2 end-to-end implementations in Oracle Supply Chain Management Cloud as a functional consultant
- Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, ADFDI
- Minimum 4+ years of experience in BIP reporting

Preferred technical and professional experience:
- You'll have access to all the technical and management training courses you need to become the expert you want to be
- Minimum 3 or more years of relevant experience in Oracle Cloud Technical (Oracle Fusion) 12c development and implementation
- Good knowledge of integrating with Web Services, XML (Extensible Markup Language) and other APIs (Application Programming Interface) to transfer data between source and target, in addition to databases
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, big data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 1 month ago
2.0 - 5.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Big data development: Hadoop, Hive, Spark, PySpark, strong SQL
- Ability to incorporate a variety of statistical and machine learning techniques
- Basic understanding of cloud platforms (AWS, Azure, etc.)
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use extract, transform, and load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred technical and professional experience:
- Basic understanding of or experience with predictive/prescriptive modeling
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter
- Knowledge of cloud, Power BI, and data migration on cloud
- Experience in Unix shell scripting and Python
- Experience with relational SQL, big data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure cloud
- Experience in Informatica PowerCenter
- Experience in Unix shell scripting and Python
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our client's business requirements
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems
- Implement data quality and validation processes within Ab Initio
- Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance
- Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions; implement best practices for reusable Ab Initio components

Preferred technical and professional experience:
- Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization; conduct performance tuning and troubleshooting as needed
- Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes; participate in design reviews and provide technical expertise to enhance overall solution quality
- Documentation
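For readers unfamiliar with Ab Initio, its Rollup and Join components correspond to keyed aggregation and keyed joins. A tool-neutral sketch with invented records (this is plain Python, not Ab Initio code):

```python
from collections import defaultdict

# Invented input records and targets, purely for illustration.
sales = [
    {"region": "north", "amount": 10},
    {"region": "south", "amount": 5},
    {"region": "north", "amount": 7},
]
targets = {"north": 15, "south": 8}

# Rollup: aggregate amounts per key (what an Ab Initio Rollup component does).
rollup = defaultdict(int)
for rec in sales:
    rollup[rec["region"]] += rec["amount"]

# Join: attach the target for each key and flag shortfalls (a keyed Join).
joined = {
    region: {"actual": total, "target": targets[region], "met": total >= targets[region]}
    for region, total in rollup.items()
}
print(joined["north"])  # {'actual': 17, 'target': 15, 'met': True}
```

In a real Ab Initio graph the same two steps would be wired as components with key parameters rather than written as loops.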
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Technology-Data Management - Data Integration-Ab Initio
Preferred Skills: Technology-Data Management - Data Integration-Ab Initio
Posted 1 month ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements: Primary skills: Technology-Data Management - Data Integration-Talend
Preferred Skills: Technology-Data Management - Data Integration-Talend
Posted 1 month ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective Design, Development, Validation and Support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Technology-Data Management - Data Integration Administration-Informatica Administration
Preferred Skills: Technology-Data Management - Data Integration Administration-Informatica Administration
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities: Minimum 3-5 years of work experience in SAS EG and SAS CI. Hands-on experience transferring data from different sources to a SAS database. Expertise in DATA step and PROC step programming, including merge statements, PROC SQL, macros, and SAS functions. Experience in automation and SAS reporting. Good communication skills are a must. The candidate should work independently, deliver project work, and deal with clients.

Location: Any Infosys DC in India
Preferred Skills: Technology-ETL & Data Quality-ETL & Data Quality - SAS-SAS - SAS Data Integration Studio
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering, BCA, BTech, MTech, MCA, MBA
Service Line: Application Development and Maintenance

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Healthcare Data Analyst, PL/SQL, SQL, data mapping, STTM creation, data profiling, reports
Preferred Skills: Domain-Healthcare-Healthcare - ALL
Posted 1 month ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Overview
A Data Engineer will design, develop, program, and implement Machine Learning solutions; implement Artificial/Augmented Intelligence systems, Agentic Workflows, and Data Engineering workflows; and perform statistical modelling and measurement by applying data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse "big data" sources of machine-acquired data, in order to generate actionable insights and foresights for real-life business problem solutions and for product feature development and enhancement. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools such as Azure Data Factory (ADF) is required to succeed in this role.

Responsibilities
Integrates state-of-the-art machine learning algorithms and develops new methods.
Develops tools to support analysis and visualization of large datasets.
Develops and codes software programs; implements industry-standard AutoML models (speech, computer vision, text data, LLM), statistical models, relevant ML models for device/machine-acquired data, and AI models and algorithms.
Identifies meaningful foresights from predictive ML models built on large data and metadata sources; interprets and communicates foresights, insights, and findings from experiments to product managers, service managers, business partners, and business managers.
Uses rapid development tools (business intelligence tools, graphics libraries, data modelling tools) to communicate research findings, data models, ML model features, and feature engineering/transformations to relevant stakeholders through visual graphics.
Analyzes, reviews, and tracks trends and tools in the Data Science, Machine Learning, Artificial Intelligence, and IoT space.
Interacts with cross-functional teams to identify questions and issues for data engineering and ML model feature engineering.
Evaluates and recommends improvements to data collection mechanisms to improve the efficacy of ML model predictions.
Meets with customers, partners, product managers, and business leaders to present findings, predictions, and foresights; gathers customer-specific requirements for business problems/processes; identifies data collection constraints and alternatives for model implementation.
Working knowledge of MLOps, LLMs, and Agentic AI/Workflows.
Programming Skills: Proficiency in Python and experience with ML frameworks such as TensorFlow and PyTorch.
LLM Expertise: Hands-on experience in training, fine-tuning, and deploying LLMs.
Foundational Model Knowledge: Strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning techniques, hyperparameter optimization, and model distillation.
Data Pipeline Development: Strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow.
Cloud & MLOps: Experience deploying ML models in cloud environments such as AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes.
Designs and implements predictive and optimization models incorporating diverse data types; strong in SQL and Azure Data Factory (ADF).

Qualifications
• Minimum Education:
o Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.
• Minimum Work Experience (years):
o 1+ years of experience programming with at least one of the following languages: Python, Scala, Go.
o 1+ years of experience in SQL and data transformation.
o 1+ years of experience developing distributed systems using open-source technologies such as Spark and Dask.
o 1+ years of experience with relational or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).
• Key Skills and Competencies:
o Experience working with AWS / Azure / GCP environments is highly desired.
o Experience with data models in the Retail and Consumer Products industry is desired.
o Experience working on agile projects and understanding of agile concepts is desired.
o Demonstrated ability to learn new technologies quickly and independently.
o Excellent verbal and written communication skills, especially in technical communications.
o Ability to work toward and achieve stretch goals in a very innovative and fast-paced environment.
o Ability to work collaboratively in a diverse team environment.
o Ability to telework.
o Expected travel: not expected.
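The statistical-methods and feature-engineering side of the role can be illustrated with a minimal, pure-Python sketch; the sensor readings and the 2-sigma anomaly threshold below are hypothetical, not part of the posting:

```python
from statistics import mean, stdev

def zscore_features(readings):
    """Standardize raw sensor readings to z-scores, a common
    feature-engineering step before ML modelling."""
    mu, sigma = mean(readings), stdev(readings)
    return [(x - mu) / sigma for x in readings]

def flag_anomalies(readings, threshold=2.0):
    """Flag readings whose |z-score| exceeds the threshold: a simple
    statistical screen for machine-acquired data."""
    return [abs(z) > threshold for z in zscore_features(readings)]

# Hypothetical temperature stream from a plant-floor sensor
temps = [70.1, 69.8, 70.3, 70.0, 84.9, 70.2]
print(flag_anomalies(temps))  # the 84.9 spike is flagged
```

In practice this kind of screen would feed a feature store or an Airflow-orchestrated pipeline rather than run standalone.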
Posted 1 month ago
2.0 - 6.0 years
5 - 7 Lacs
Ahmedabad, Aurangabad
Work from Office
Job Title: Alteryx Engineer
Location: Bangalore/Mumbai/Ahmedabad/Aurangabad
Experience Required: 2-5 years
Domain: Manufacturing

Job Description:
We are seeking a highly skilled Alteryx Engineer with 2-5 years of experience, specifically within the manufacturing domain, to join our dynamic team. The ideal candidate will have a strong background in data preparation, blending, and advanced analytics, coupled with practical experience in the manufacturing industry. This role involves designing, developing, and deploying robust Alteryx workflows to automate data processes, generate insights, and support strategic business decision-making.

Key Responsibilities:
Workflow Development: Design, develop, and maintain efficient and scalable Alteryx workflows to extract, transform, and load (ETL) data from various sources, ensuring data quality and integrity.
Data Blending & Transformation: Perform complex data blending, cleansing, and transformation operations using Alteryx Designer to prepare data for analysis and reporting.
Automation: Implement and manage automated data pipelines and analytical processes using Alteryx Server to streamline data delivery and reduce manual effort.
Data Analysis: Analyze complex datasets within Alteryx to identify trends, patterns, and insights that drive strategic decisions and operational improvements.
Data Integration: Work with diverse data sources, including SAP, flat files (Excel, CSV), APIs, and other enterprise systems, to ensure accurate and timely data availability within Alteryx workflows.
Collaboration: Collaborate closely with business stakeholders, including production, supply chain, and quality assurance teams, to gather requirements, understand their data needs, and translate them into effective Alteryx solutions.
Reporting & Output: Configure Alteryx workflows to generate various outputs, including data extracts for reporting tools, analytical datasets, and automated reports.
Troubleshooting: Diagnose, resolve, and optimize issues related to Alteryx workflows, data connections, and performance promptly.

Required Skills:
Experience: 2-5 years of hands-on experience in Alteryx workflow development, data preparation, and automation, with a strong focus on the manufacturing domain.
Technical Proficiency: Strong proficiency in Alteryx Designer for building complex analytical workflows. Experience with Alteryx Server for deploying, scheduling, and managing workflows is highly desirable.
Data Management: Hands-on experience with SQL and relational databases for querying, data extraction, and understanding database structures. Experience extracting and integrating data from SAP systems using Alteryx connectors or other relevant methods is crucial.
Analytical Skills: Strong analytical and problem-solving skills with the ability to interpret complex data, identify root causes, and provide actionable insights.
Communication: Excellent communication skills with the ability to present complex technical information clearly to both technical and non-technical audiences.
Problem-Solving: Proven ability to troubleshoot issues, optimize workflow performance, and resolve data-related challenges effectively in a fast-paced environment.
Domain Knowledge: Familiarity with manufacturing processes, operational metrics, supply chain data, and Key Performance Indicators (KPIs) is highly desirable.

Preferred Skills:
Alteryx Certification (e.g., Alteryx Designer Core, Advanced, or Expert) is a significant plus.
Knowledge of other BI tools (e.g., Tableau, Power BI) or data analysis techniques and programming languages (e.g., Python, R) for advanced analytics is advantageous.
Experience with data governance and best practices in Alteryx development.
Direct experience with SAP modules relevant to manufacturing (e.g., FICO, Production Planning, Materials Management, Sales and Distribution) is a strong asset.
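Alteryx workflows are built visually rather than in code, but the extract-cleanse-load logic this role describes can be sketched in plain Python for illustration; the tables, column names, and sample rows below are invented for the example, with sqlite3 standing in for the source and target systems:

```python
import sqlite3

# Hypothetical in-memory source standing in for a SAP extract;
# schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id TEXT, qty TEXT, plant TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("A1", "10", "pune"), ("A2", "", "PUNE"), ("A3", "5", " Pune ")],
)

# Transform: drop records with missing quantity, cast types, and
# normalize casing/whitespace -- the kind of cleansing an Alteryx
# workflow would do with Filter/Formula/Select tools.
rows = conn.execute("SELECT order_id, qty, plant FROM raw_orders").fetchall()
clean = [
    (oid, int(qty), plant.strip().title())
    for oid, qty, plant in rows
    if qty.strip()
]

# Load into a curated table for downstream reporting.
conn.execute("CREATE TABLE orders (order_id TEXT, qty INTEGER, plant TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
print(conn.execute("SELECT COUNT(*), SUM(qty) FROM orders").fetchone())
```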
Posted 1 month ago
3.0 - 7.0 years
5 - 8 Lacs
Ahmedabad, Aurangabad
Work from Office
Job Description:
We are seeking a skilled Tableau Analyst/Developer with 2-5 years of experience, specifically within the manufacturing domain, to join our dynamic team. The ideal candidate will have a strong background in data visualization and analysis, coupled with practical experience in the manufacturing industry. This role involves designing and developing interactive dashboards and reports to support business decision-making processes.

Key Responsibilities:
Dashboard Development: Design, develop, and maintain interactive dashboards and reports using Tableau to meet business requirements.
Data Analysis: Analyze complex data sets to identify trends, patterns, and insights that drive strategic decisions.
Data Integration: Work with data sources, including SQL databases, Excel, and other data management tools, to ensure accurate and timely data availability.
Collaboration: Collaborate with business stakeholders, including production, supply chain, and quality assurance teams, to gather requirements and understand their data needs.
Reporting: Generate ad-hoc reports and visualizations as requested by stakeholders to support ongoing projects and initiatives.
Troubleshooting: Diagnose and resolve issues related to Tableau dashboards and data connections promptly.

Required Skills:
Experience: 2-5 years of experience in Tableau development and data visualization, with a focus on the manufacturing domain.
Technical Proficiency: Strong proficiency in Tableau Desktop and Tableau Server, with experience in creating complex calculations, parameters, and visualizations.
Data Management: Hands-on experience with SQL and relational databases for querying and data extraction.
Analytical Skills: Strong analytical skills with the ability to interpret data and provide actionable insights.
Communication: Excellent communication skills with the ability to present complex information clearly to both technical and non-technical audiences.
Problem-Solving: Ability to troubleshoot issues and resolve them effectively in a fast-paced environment.
Domain Knowledge: Familiarity with manufacturing processes, metrics, and KPIs is highly desirable.

Preferred Skills:
Tableau Certification (Desktop Specialist, Desktop Certified Associate, or higher) is a plus.
Additional Skills: Knowledge of other BI tools and data analysis techniques, such as Python or R, is advantageous.
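As a rough illustration of the "complex calculations" behind a manufacturing dashboard, here is a sketch of Overall Equipment Effectiveness (OEE), a standard manufacturing KPI; the shift figures are hypothetical, and in Tableau this logic would live in calculated fields rather than Python:

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness = Availability x Performance x
    Quality -- a typical KPI surfaced on a manufacturing dashboard."""
    return availability * performance * quality

# Hypothetical figures for one production shift
planned_min, downtime_min = 480, 48        # planned vs lost minutes
ideal_rate = 60                            # ideal units per minute
actual_count, good_count = 24000, 23040    # produced vs defect-free units

availability = (planned_min - downtime_min) / planned_min
performance = actual_count / (ideal_rate * (planned_min - downtime_min))
quality = good_count / actual_count
print(round(oee(availability, performance, quality), 3))  # -> 0.8
```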
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are seeking a skilled Lead Data Engineer with extensive experience in Snowflake, ADF, SQL, and other relevant data technologies to join our team. As a key member of our data engineering team, you will play an instrumental role in designing, developing, and managing data pipelines, working closely with cross-functional teams to drive the success of our data initiatives.

Key Responsibilities:
Design, implement, and maintain data solutions using Snowflake, ADF, and SQL Server to ensure data integrity, scalability, and high performance.
Lead and contribute to the development of data pipelines, ETL processes, and data integration solutions, ensuring the smooth extraction, transformation, and loading of data from diverse sources.
Work with MSBI, SSIS, and Azure Data Lake Storage to optimize data flows and storage solutions.
Collaborate with business and technical teams to identify project needs, estimate tasks, and set intermediate milestones to achieve final outcomes.
Implement industry best practices related to Business Intelligence and Data Management, ensuring adherence to usability, design, and development standards.
Perform in-depth data analysis to resolve data issues and improve overall data quality.
Mentor and guide junior data engineers, providing technical expertise and supporting the development of their skills.
Collaborate effectively with geographically distributed teams to ensure project goals are met in a timely manner.

Required Technical Skills:
T-SQL, SQL Server, MSBI (SQL Server Integration Services, Reporting Services), Snowflake, Azure Data Factory (ADF), SSIS, Azure Data Lake Storage.
Proficiency in designing and developing data pipelines, data integration, and data management workflows.
Strong understanding of cloud data solutions, with a focus on Azure-based tools and technologies.

Nice to Have:
Experience with Power BI for data visualization and reporting.
Familiarity with Azure Databricks for data processing and advanced analytics.
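The incremental-load pattern at the heart of such ETL pipelines can be sketched as a watermark-based copy. This is a minimal stdlib sketch, not ADF or Snowflake code: sqlite3 stands in for the source and target systems, and all table and column names are illustrative:

```python
import sqlite3

# In-memory stand-ins for a source database, a target warehouse table,
# and the watermark an ADF pipeline would keep between runs.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE source_sales (id INTEGER, modified_at TEXT)")
db.execute("CREATE TABLE target_sales (id INTEGER, modified_at TEXT)")
db.execute("CREATE TABLE watermark (last_loaded TEXT)")
db.execute("INSERT INTO watermark VALUES ('2024-01-01')")
db.executemany("INSERT INTO source_sales VALUES (?, ?)",
               [(1, "2023-12-31"), (2, "2024-01-05"), (3, "2024-01-07")])

# Copy only rows modified since the last run, then advance the watermark.
last = db.execute("SELECT last_loaded FROM watermark").fetchone()[0]
new_rows = db.execute(
    "SELECT id, modified_at FROM source_sales WHERE modified_at > ?", (last,)
).fetchall()
db.executemany("INSERT INTO target_sales VALUES (?, ?)", new_rows)
db.execute("UPDATE watermark SET last_loaded = "
           "(SELECT MAX(modified_at) FROM source_sales)")

print(len(new_rows))  # only the 2 rows newer than the watermark move
```

The same shape appears in ADF as a lookup of the old watermark, a filtered copy activity, and a stored-procedure or update step to advance it.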
Posted 1 month ago
5.0 - 7.0 years
25 - 35 Lacs
Bengaluru
Remote
This position is responsible for data management and architecture of entities such as product data, customer data, and e-commerce transactional data, as well as platform development and integration. The position works closely with all internal departments (Marketing, Sales, Finance, Product Data Team) and outside vendors to ensure that data integrations with external partners and enterprise integrations work as expected. This team member will play a key role in data strategies for building a high-functioning data warehouse system.

Responsibilities:
Contribute your expertise and technical knowledge to develop and maintain the architectural roadmap for digital automation and data services, ensuring alignment with business strategies and standards.
Develop data models and AI/ML algorithms to apply to data sets.
Provide leadership and guidance on the design and management of data for data applications; formulate best practices and organize processes for data management, validation, and evolution.
Have a comprehensive understanding of Master Data Management concepts as applied to customer data, including but not limited to data collection, unification, transformation, segmentation, and storage.
Generate electronic product information data syndications to major LWTA customers.
Implement solutions that enable a stable architecture for collecting robust data sets.
Maintain, improve, clean, and manipulate data in the data platform and analytics databases.
Build processes and tools to maintain high data availability, quality, and maintainability.
Articulate technology solutions and explain the competitive advantages of various technology alternatives.
Design data flows, interfaces, CRUD operations, etc.
Work on conceptual, logical, and physical data modeling.
Work with product development/marketing and IT/Digital to ensure a smooth data interface is built.

Minimum Requirements:
Bachelor's degree in mathematics, computer science, computer engineering, information management, or a related field, or equivalent relevant experience.
5 years of data architecture experience.
At least 5 to 7 years' experience running projects using an Agile project methodology and developing high-performing applications.
Strong experience with data modeling, especially conceptual, logical, and physical data models.
Excellent written and oral communication skills and the ability to clearly define projects, objectives, goals, schedules, and assignments.
Ability to work effectively with business personnel at all levels.
Good knowledge of programming languages, software tools, and analytical methods.
Experience with SQL or an equivalent database querying language.
Minimum 3 years' prior experience focused on ETL design, development, review, and testing, with additional experience in two or more of the following areas: database development, data modeling, data architecture, data warehouse development, business intelligence, data profiling, database performance optimization.
Critical problem solver who can successfully identify and solve problems.
Ability to multitask and troubleshoot issues quickly.
Self-starter and self-motivated; able to work independently, prioritize effectively, and perform multiple tasks under minimal supervision.
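The customer-data unification step in the Master Data Management flow described above can be sketched as a merge-by-key with a "prefer non-empty values" survivorship rule; the records, field names, and the email-based match key are all invented for the example:

```python
# Hypothetical duplicate customer records from two source systems.
records = [
    {"id": 1, "email": "Ana@Example.com ", "city": "Pune"},
    {"id": 2, "email": "ana@example.com",  "city": ""},
    {"id": 3, "email": "raj@example.com",  "city": "Delhi"},
]

# Unify into one "golden record" per normalized email address.
golden = {}
for rec in records:
    key = rec["email"].strip().lower()   # unification key
    if key not in golden:
        golden[key] = dict(rec)
    else:
        # Survivorship: keep the first value seen, but fill gaps
        # from later duplicates when the existing field is empty.
        for field, value in rec.items():
            if value and not golden[key].get(field):
                golden[key][field] = value

print(len(golden))  # two distinct customers survive
```

Real MDM platforms add fuzzy matching, source-system trust scores, and audit trails on top of this basic merge.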
Posted 1 month ago
6.0 - 8.0 years
12 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Job Title: Oracle Fusion Functional Consultant – Cash Management & Lease Accounting
Location: Hyderabad / Bangalore
Experience: 6-8 Years
Department: Oracle ERP – Finance

Job Summary
We are seeking an experienced Oracle Fusion Functional Consultant specializing in Cash Management and Lease Accounting, with strong functional knowledge and hands-on expertise in Fusion Financials. The ideal candidate should have 2–3 end-to-end implementation and/or support cycles, with a preference for candidates who have previously held Financial Functional Lead roles. Familiarity with Oracle Cloud tools, workflow processes, and prior EBS experience is highly desirable.

Key Responsibilities
Lead or support implementations of Oracle Fusion Cash Management and Lease Accounting modules.
Collaborate with business stakeholders to gather requirements and translate them into functional specifications.
Write functional design documents (MD50) and test scripts, and support OTBI reports & Fusion analytics.
Work on Oracle workflow processes and assist technical teams with integration and reporting needs.
Leverage FSM (Functional Setup Manager) and ADF (Application Development Framework) for configurations and issue resolution.
Use Data Integration (DI) tools for mass data uploads and validations.
Engage in testing, data migration, UAT, and post-go-live support.
Ensure compliance with Oracle Cloud best practices and security standards.

Required Skills & Experience
2–3 implementations or support projects in Oracle Fusion Cash Management & Lease Accounting.
Strong hands-on knowledge of Oracle Fusion Financials.
Experience writing functional specs and working with OTBI (Oracle Transactional Business Intelligence) and Fusion Analytics.
Solid understanding of workflow processes and how to configure them in Oracle Cloud.
Familiarity with Oracle FSM, ADF tools, and Data Integration (DI) tools.
Prior experience in Oracle EBS (Financials).
Proven ability to work with cross-functional teams and technical counterparts.
Strong communication, documentation, and stakeholder management skills.

Preferred Qualifications
Experience in a Financial Functional Lead role on past projects.
Oracle Financials Cloud Certification preferred (e.g., General Ledger, Payables, Receivables).
Exposure to multi-currency, intercompany, and bank reconciliation processes.
Familiarity with Agile/Hybrid project methodologies.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore/Hyderabad/Pune
Experience level: 7+ Years

About the Role
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
8+ years of experience in data architecture, data engineering, or a related field.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
Hands-on exposure to Airflow is a must.
Proven track record of contributing to data projects and working in complex environments.
Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
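Airflow's core job is dependency-ordered task execution. The idea can be sketched with the standard library's topological sorter; this is not the Airflow API itself, and the task names below are illustrative:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on --
# the same shape as "extract >> transform >> load" in an Airflow DAG.
dag = {
    "load_to_snowflake": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}

# A valid run order: every task appears after all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In Airflow the scheduler does this ordering (and adds retries, backfills, and parallelism); the DAG structure itself is the same dependency graph.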
Posted 1 month ago