3.0 - 5.0 years
15 - 25 Lacs
Noida
Work from Office
We are looking for an experienced Data Engineer with strong expertise in Databricks and Azure Data Factory (ADF) to design, build, and manage scalable data pipelines and integration solutions. The ideal candidate will have a solid background in big data technologies, cloud platforms, and data processing frameworks to support enterprise-level data transformation and analytics initiatives.

Roles and Responsibilities:
- Design, develop, and maintain robust data pipelines using Azure Data Factory and Databricks.
- Build and optimize data flows and transformations for structured and unstructured data.
- Develop scalable ETL/ELT processes to extract data from various sources including SQL, APIs, and flat files.
- Implement data quality checks, error handling, and performance tuning of data pipelines.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Work with Azure services such as Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Azure SQL.
- Participate in code reviews, version control, and CI/CD processes.
- Ensure data security, privacy, and compliance with governance standards.

Skills Required:
- Strong hands-on experience with Azure Data Factory and Azure Databricks (Spark-based development).
- Proficiency in Python, SQL, and PySpark for data manipulation.
- Experience with Delta Lake, data versioning, and streaming/batch data processing.
- Working knowledge of Azure services such as ADLS, Azure Blob Storage, and Azure Key Vault.
- Familiarity with DevOps, Git, and CI/CD pipelines in data engineering workflows.
- Strong understanding of data modeling, data warehousing, and performance tuning.
- Excellent analytical, communication, and problem-solving skills.
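For a concrete flavor of the stack this role describes, here is a minimal PySpark sketch of a batch ETL job writing to Delta Lake. It is an illustration only: the ADLS paths, table layout, and column names (order_id, order_ts, amount) are assumptions, not details from the posting.

```python
# Minimal Databricks-style batch ETL sketch (PySpark + Delta Lake).
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV files landed in an ADLS container
raw = (spark.read.option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Transform: type casting, a derived partition column, and a simple
# data quality filter
clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("order_id").isNotNull()))

# Load: write a partitioned Delta table for downstream analytics
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders_delta/"))
```

In practice a job like this would be scheduled from an ADF pipeline or a Databricks job rather than run ad hoc.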
Posted 1 month ago
7.0 - 12.0 years
11 - 16 Lacs
Gurugram, Bengaluru
Work from Office
This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming.
- Architect and design data infrastructure on cloud using Infrastructure-as-Code tools.
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools.
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches.
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities.
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations.
- Lead a team of engineers to deliver impactful results at scale.
- Execute projects with an Agile mindset.
- Build software frameworks to solve data problems at scale.

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using dbt and Power BI will be a plus.
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads.
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and Key Vault.
- Strong programming/scripting experience using SQL, Python, and Spark.
- Strong grasp of data modeling and data lakehouse concepts.
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket.
- Experience with Agile development methods in data-oriented projects.

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles.
- Track record of success working through technical challenges within enterprise organizations.
- Ability to prioritize deals, training, and initiatives through highly effective time management.
- Excellent problem-solving, analytical, presentation, and whiteboarding skills.
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems.
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations.
- Certifications in Azure Data Engineering and related technologies.
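Since the role separates batch from real-time streaming pipelines, a minimal Structured Streaming sketch is shown below. The schema, ADLS paths, and checkpoint location are assumptions for illustration, not taken from the posting.

```python
# Hedged sketch: continuously ingest JSON events from ADLS into a
# Delta table with PySpark Structured Streaming. Paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Explicit schema, since streaming reads cannot infer one reliably
schema = (StructType()
          .add("event_id", StringType())
          .add("value", DoubleType())
          .add("event_ts", TimestampType()))

events = (spark.readStream.schema(schema)
          .json("abfss://raw@examplelake.dfs.core.windows.net/events/"))

# Checkpointing gives the stream fault tolerance on restart
query = (events.writeStream.format("delta")
         .option("checkpointLocation",
                 "abfss://chk@examplelake.dfs.core.windows.net/events/")
         .outputMode("append")
         .start("abfss://curated@examplelake.dfs.core.windows.net/events_delta/"))

query.awaitTermination()
```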
Posted 1 month ago
4.0 - 8.0 years
12 - 20 Lacs
Gurugram
Work from Office
IA - Consultant - Data Engineer (SSIS, ADF): Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and meritocracy-based culture that prioritizes continuous learning and skill development, work-life balance, and equal opportunity for all.

About Insights & Advisory: We are a global professional services provider offering research, analytics, and business process support services enabled by our innovative 'mind + machine' approach. We work with over 300 Fortune 1000 companies. Our TMT team caters to 4 of the top 5 global Telecom & Networking Infrastructure companies as well as the biggest public cloud providers.

About this role: We are looking for a skilled and motivated Data Engineer/Analyst with 4-5 years of experience in data engineering, particularly in migrating on-premises systems to cloud-based environments. This role requires strong expertise in SQL Server, SSIS, Azure Data Factory (ADF), Power BI, and Microsoft Fabric. The ideal candidate will have hands-on experience designing, developing, and deploying scalable data solutions in Azure, ensuring seamless data integration and high performance.

What you will be doing at Evalueserve:
- Lead and execute the migration of on-premises SQL Server databases to Azure SQL.
- Migrate and modernize legacy SSIS packages from the file system to Azure Data Factory pipelines.
- Manage end-to-end Microsoft Fabric migration projects, including planning, execution, and post-migration validation.
- Design and develop stored procedures, SSIS packages, and ADF pipelines to support business data needs.
- Collaborate with cross-functional teams to understand requirements and deliver scalable, production-ready data solutions.
- Ensure data quality, workflow optimization, and performance tuning across all stages of data processing.

What we are looking for:
- 4-5 years of hands-on experience in data engineering.
- Proven expertise in SQL Server (on-premises and Azure SQL).
- Strong experience in SSIS package development and migration.
- Proficiency in Azure Data Factory (ADF) and cloud-based data integration.
- Experience with Microsoft Fabric migration and implementation.
- Proficiency with Power BI and semantic data models, measures, and views.
- Solid knowledge of T-SQL, stored procedures, and query optimization.

Preferred Qualifications:
- Relevant Microsoft certifications (e.g., Azure Data Engineer Associate) are a plus.
- Experience with DevOps practices for data pipelines.
- Strong communication and collaboration skills.

Follow us on https://www.linkedin.com/company/evalueserve/. Learn more about what our leaders are saying about our achievements: an AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.

Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
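As a flavor of the T-SQL and stored-procedure work this role lists, here is a hedged Python sketch calling a stored procedure on Azure SQL via pyodbc. The server, database, credentials, and procedure name are hypothetical placeholders.

```python
# Hedged sketch: executing a parameterized T-SQL stored procedure on
# Azure SQL from Python. All connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=exampledb;UID=etl_user;PWD=<secret>"
)
cursor = conn.cursor()

# Run the procedure with a bound parameter and read its result set
cursor.execute("EXEC dbo.usp_load_daily_sales @run_date = ?", "2024-01-31")
for row in cursor.fetchall():
    print(row)

conn.commit()
conn.close()
```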
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
andhra pradesh
On-site
You have an opportunity to join a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. The company is headquartered in Bengaluru with global operations in over 60 countries, serving clients across various industries such as financial services, healthcare, manufacturing, retail, and telecommunications. Listed on NASDAQ, the company has a strong workforce of 234,054 employees and a gross revenue of 222.1 billion.

Position: PaaS Developer
Experience: 6-8 years
Location: Vizag / Visakhapatnam
Salary: As per market standards
Notice Period: 0-15 days / serving
Mode of Hire: Contract

In this role, you will be responsible for integrating and extending Oracle Fusion ERP applications using PaaS services. Your key responsibilities will include:
- Developing custom user interfaces and integrating with existing web services within Oracle Fusion applications
- Building custom business objects (tables) and designing PaaS extensions for data synchronization with external systems
- Creating custom Java applications using PaaS services like Oracle Java Cloud Service (JCS) and Oracle Database Cloud Service (DBCS)
- Developing PaaS extensions for custom business object creation in DBaaS or ATP
- Designing PaaS extensions using VBCS to create custom applications and UI elements integrating with Oracle Fusion
- Customizing existing pages and objects using Application Composer and Page Composer
- Implementing appropriate security measures in PaaS extensions, such as JWT UserToken
- Building solutions in an environment with deployed API Gateway, OIC, SOACS, and WebServices for data interfacing
- Leading onshore and offshore resources, conducting discovery sessions, solution workshops, presentations, and POCs

The ideal candidate will have:
- 6-8 years of Oracle Cloud technical experience
- Knowledge of reports using BI Publisher (RTF, XSL templates), OTBI analyses, and dashboards
- Experience in resolving technical issues related to Oracle Fusion applications and integrations
- Proficiency in performance tuning and optimizing system performance
- Basic Oracle SCM and Finance related functional knowledge

Join this dynamic team and contribute to the growth and success of a leading IT services organization.
Posted 1 month ago
4.0 - 7.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Skills Required:
- Power BI modelling, performance tuning, and optimization
- Power BI Embedded APIs and the concept of embed tokens
- Custom UI integration
- Row-level security concepts
- Hands-on knowledge of incremental refresh
- Power Query (M) scripting
- Enhanced XMLA scripting
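For context, embed tokens are minted server-side through the Power BI REST API and handed to a custom UI. The hedged Python sketch below shows the general shape of that call; the workspace and report IDs are placeholders, and Azure AD access-token acquisition is elided.

```python
# Hedged sketch: generating a Power BI embed token for "app owns data"
# embedding. IDs and the AAD token below are hypothetical placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"
REPORT_ID = "<report-guid>"
AAD_TOKEN = "<azure-ad-access-token>"  # obtained via MSAL/service principal

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
       f"/reports/{REPORT_ID}/GenerateToken")

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {AAD_TOKEN}"},
    json={"accessLevel": "View"},  # read-only embed for the custom UI
)
resp.raise_for_status()
embed_token = resp.json()["token"]  # passed to the client-side SDK
```

Row-level security is typically enforced by including an effective identity (username and roles) in the token request body.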
Posted 1 month ago
4.0 - 7.0 years
9 - 14 Lacs
Bengaluru
Work from Office
- Work against the triple constraints of the project (time, cost, and scope) and manage them efficiently in coordination with the Project Manager
- Perform software testing
- Contribute to the positive outcome of the business requirement deliverables and help solve complex technical problems
- Be concerned about the quality of the deliverables and be open to applying new ideas to improve the process in a continuous-improvement mode
- Prepare and share KPI trackers and delivery reports with management
- Exposure to agile methodologies
- Working exposure with customers, with the ability to translate business requirements into technical artifacts
- Gain knowledge of the application/process quickly and contribute to the project/program roadmap in a short duration
- Be a good team player, a good communicator, and collaborate well within as well as outside the team

Profile required:
- At least 4-6 years of experience in Oracle SQL, PL/SQL, Oracle Forms & Reports, and shell scripting
- Excellent problem-solving and analytical skills
- Strong written and verbal communication skills
- Ability to work effectively in a team environment

Desirable:
- BS degree in computer science or related subjects
- Experience in end-to-end PL/SQL activities
- Good understanding of the SDLC and proven experience in applying those practices
Posted 1 month ago
1.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
ETRM Data Engineer: Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETRM systems.
- Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
- Collaborate with cross-functional teams to integrate data from ETRM trading systems like Allegro, RightAngle, and Endur.
- Optimize and manage data storage solutions in Data Lake and Snowflake.
- Develop and maintain ETL processes using Azure Data Factory and Databricks.
- Write efficient and maintainable code in Python for data processing and analysis.
- Ensure data quality and integrity across various data sources and platforms.
- Ensure data accuracy, integrity, and availability across various trading systems.
- Collaborate with traders, analysts, and IT teams to understand data requirements and deliver robust solutions.
- Optimize and enhance data architecture for performance and scalability.

Mandatory Skills:
- Python/PySpark
- FastAPI
- Pydantic
- SQLAlchemy
- Snowflake or SQL
- Data Lake
- Azure Data Factory (ADF)
- CI/CD, Azure fundamentals, Git
- Integration of data solutions with ETRM trading systems (Allegro, RightAngle, Endur)

Good to have:
- Databricks
- Streamlit
- Kafka
- Power BI
- Kubernetes
- FastStream
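To ground the mandatory stack, here is a hedged FastAPI + Pydantic sketch of a small trade-data service. The Trade model, field names, and in-memory store are illustrative assumptions; a real service would persist to Snowflake via SQLAlchemy rather than a dict.

```python
# Hedged sketch: a minimal FastAPI service with Pydantic validation,
# in the spirit of the stack above. The Trade model is hypothetical.
from datetime import date
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="etrm-trades")

class Trade(BaseModel):
    trade_id: str
    commodity: str
    volume_mwh: float
    trade_date: date

# In-memory store standing in for Snowflake/SQLAlchemy persistence
TRADES: dict[str, Trade] = {}

@app.post("/trades", response_model=Trade)
def create_trade(trade: Trade) -> Trade:
    TRADES[trade.trade_id] = trade  # payload already validated by Pydantic
    return trade

@app.get("/trades/{trade_id}", response_model=Trade)
def get_trade(trade_id: str) -> Trade:
    if trade_id not in TRADES:
        raise HTTPException(status_code=404, detail="trade not found")
    return TRADES[trade_id]
```

Run locally with `uvicorn module_name:app --reload` during development.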
Posted 1 month ago
9.0 - 12.0 years
27 - 35 Lacs
Chennai, Bengaluru
Work from Office
Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their Business Intelligence dashboards and reports.
- Design, develop, and maintain Power BI reports and dashboards for business users.
- Translate business requirements into effective visualizations using various data sources.
- Create data models, DAX calculations, and custom measures to support business analytics needs.
- Optimize performance and ensure data accuracy in Power BI reports.
- Troubleshoot and resolve issues related to transformations and visualizations.
- Train end-users on using Power BI for self-service analytics.

Skills Required:
- Proficiency in Power BI Desktop and Power BI Service.
- Good understanding of Power BI Copilot.
- Strong understanding of data modelling concepts and the DAX language.
- Strong understanding of semantic data modelling concepts.
- Experience with data visualization best practices.
- Experience working with streaming data as well as batch data.
- Knowledge of ADF would be an added advantage.
- Knowledge of SAS would be an added advantage.
Posted 1 month ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Key Responsibilities:

Client Engagement & Requirements Analysis:
- Lead discovery sessions with clients to gather detailed business requirements for financial planning, budgeting, forecasting, consolidation, and reporting processes.
- Analyze existing financial processes and identify opportunities for optimization and automation using Oracle EPM capabilities.
- Translate complex business requirements into clear, concise functional and technical design specifications.

Solution Design & Architecture:
- Design and architect robust, scalable, and efficient Oracle EPM solutions (e.g., Planning & Budgeting Cloud Service - PBCS/EPBCS, Financial Consolidation and Close - FCCS, Narrative Reporting, Account Reconciliation - ARCS, Profitability and Cost Management - PCMCS).
- Develop technical specifications for integrations, data loads, and reporting requirements.

Implementation & Configuration:
- Configure Oracle EPM applications based on approved design specifications, including dimensions, hierarchies, forms, business rules, calculations, security, and reports.
- Develop and optimize financial reports and dashboards (e.g., using Financial Reports, Smart View, Narrative Reporting).
- Implement data integrations using Data Management, Data Integration, EPM Automate, or other relevant tools.

Testing & Quality Assurance:
- Develop and execute comprehensive test plans (unit, system, integration, UAT) to ensure the solution meets business requirements and performs optimally.
- Identify, track, and resolve defects and issues throughout the project lifecycle.

Training & Support:
- Provide user training and documentation to client teams, ensuring a smooth transition and user adoption.
- Offer post-go-live support and troubleshooting, addressing client queries and issues promptly.

Project Management & Leadership:
- Manage project tasks, timelines, and deliverables for assigned modules or workstreams.
- Provide technical leadership and guidance to junior consultants.
- Communicate project status, risks, and issues effectively to clients and internal stakeholders.
- Contribute to pre-sales activities, including solution demonstrations and effort estimations.

Knowledge & Mentorship:
- Stay current with the latest Oracle EPM product releases, features, and best practices.
- Share knowledge and mentor junior team members.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
andhra pradesh
On-site
The company is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has a gross revenue of 222.1 billion with a global workforce of 234,054. The company is listed on NASDAQ and operates in over 60 countries, serving clients across various industries such as financial services, healthcare, manufacturing, retail, and telecommunications. The company has major delivery centers in India, including cities like Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida. As a PaaS Developer with 6-8 years of experience, the key responsibilities include integrating and extending Oracle Fusion ERP applications using PaaS services. This involves creating custom user interfaces, integrating with existing web services, and building custom business objects within Oracle Fusion applications. Additionally, the role requires designing and developing PaaS extensions for data synchronization between Oracle Fusion and external systems. The ideal candidate should have hands-on experience in building custom Java applications using PaaS services like Oracle Java Cloud Service (JCS) and Oracle Database Cloud Service (DBCS). They should also be proficient in developing PaaS extensions for creating custom business objects in DBaaS or ATP. Moreover, expertise in developing PaaS extensions using VBCS to create custom applications and UI elements that integrate with Oracle Fusion is essential. Furthermore, the candidate should possess knowledge and experience in securing PaaS extensions with appropriate measures, such as using JWT UserToken. Experience in solution building in an environment where API Gateway, OIC, SOACS, and WebServices are deployed for data interfacing is required. Multiple implementation experiences using SOA, Web Services, and J2EE/JSF/Oracle ADF (Fusion Middleware) in EBS on-premises projects are a plus. The role also involves providing leadership to onshore and offshore resources in the project team and conducting customer-facing activities, including discovery sessions, solution workshops, presentations, and POCs. The candidate should have 6-8 years of Oracle Cloud technical experience and knowledge of reports using BI Publisher (RTF, XSL templates), OTBI analyses, and dashboards. Identifying and resolving technical issues related to Oracle Fusion applications and integrations is a key aspect of the role, as is performance tuning and optimizing system performance. Additionally, the ideal candidate should have basic knowledge of Oracle SCM and Finance-related functional areas to effectively fulfill the responsibilities of the role.
Posted 1 month ago
0.0 - 5.0 years
15 - 20 Lacs
Chennai
Work from Office
Job Title: Tech Lead/Cloud Architect
Experience: 0-5 Years
Location: Remote

A NASDAQ-listed company that has effectively maintained its position as the front-runner in the food and beverage sector is looking to onboard a Tech Lead to guide and manage the development team on various projects. The Tech Lead will be responsible for overseeing the technical direction of the projects, ensuring the development of high-quality, scalable, and maintainable code. The talent will be interacting with other talents as well as an internal cross-functional team.

Required Skills:
- Cloud architecture using microservices design
- Data modelling/design
- API design and API contracts
- React, Java, Azure, ADO
- RESTful API, GraphQL, SQL/NoSQL DB
- Experience with ADF, Databricks
- CI/CD, SonarQube, Snyk, Prometheus, Grafana

Responsibilities:
- Collaborate with Product and Data teams.
- Ensure a clear understanding of requirements.
- Architect and design microservices-based enterprise web applications.
- Build data-intensive, UI-rich, microservices-based enterprise applications that are scalable, performant, and secure, using cloud best practices in Azure.

Offer Details:
- Full-time dedication (40 hours/week)
- Required: 3-hour overlap with CST (Central Standard Time)

Interview Process: 2-step interview - initial screening and technical interview.
Posted 1 month ago
9.0 - 12.0 years
15 - 20 Lacs
Chennai
Work from Office
Job Title: Data Engineer Lead / Architect (ADF)
Experience: 9-12 Years
Location: Remote / Hybrid

Role and Responsibilities:
- Talk to client stakeholders and understand the requirements for building their data warehouse / data lake / data lakehouse.
- Design, develop, and maintain data pipelines in Azure Data Factory (ADF) for ETL from on-premises and cloud-based sources.
- Design, develop, and maintain data warehouses and data lakes in Azure.
- Run large data platform and other related programs to provide business intelligence support.
- Design and develop data models to support business intelligence solutions.
- Implement best practices in data modelling and data warehousing.
- Troubleshoot and resolve issues related to ETL and data connections.

Skills Required:
- Excellent written and verbal communication skills.
- Excellent knowledge of and experience in ADF.
- Well versed with ADLS Gen 2.
- Knowledge of SQL for data extraction and transformation.
- Ability to work with various data sources (Excel, SQL databases, APIs, etc.).
- Knowledge of SAS would be an added advantage.
- Knowledge of Power BI would be an added advantage.
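Day-to-day ADF work of this kind is often automated from Python. Below is a hedged sketch of triggering and polling a pipeline run with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are hypothetical placeholders.

```python
# Hedged sketch: trigger an ADF pipeline run and poll it to completion.
# All resource names are hypothetical placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-guid>"
RG, FACTORY, PIPELINE = "rg-data", "adf-example", "pl_daily_load"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline with a runtime parameter
run = client.pipelines.create_run(
    RG, FACTORY, PIPELINE, parameters={"load_date": "2024-01-31"})

# Poll until the run reaches a terminal state
status = "InProgress"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = client.pipeline_runs.get(RG, FACTORY, run.run_id).status

print(f"Pipeline finished with status: {status}")
```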
Posted 1 month ago
3.0 - 5.0 years
5 - 8 Lacs
Gurugram
Work from Office
• Design, develop, and maintain web-based applications using Oracle ADF.
• Build robust, scalable, and secure ADF applications that interact with backend systems.
• Customize Oracle ADF components such as JSF, EJBs, and ADF BC (Business Components).

Required Candidate profile:
• Experience: Minimum of 3-4 years of professional experience in Oracle ADF development.
• Oracle ADF: Strong expertise in developing and maintaining applications using the Oracle Application Development Framework (ADF).
Posted 1 month ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Hybrid
Job Role: Sr ADF Developer
Location: Hyderabad only
Work Mode: Hybrid
Experience: 5 to 9 Years
Full-time with Sutherland

Job Description:
- Interpret requirements and design solutions that satisfy customer needs as well as the need for standardization, usability, and maintainability of the different agent-facing applications.
- Break the solution down into components that can be assigned and tracked in pursuit of a fast and cost-effective delivery.
- Must have experience in SQL/PL-SQL.
- Implement ADF Business Components / Web Services / Oracle Object calls that provide the data access layer for the agent application processes.
- Implement the ADF View Controller components, including task flows, beans, and JSF pages, that allow successful interaction between the agents and the applications.
- Supervise tasks performed by other members of the solution team to ensure a cohesive approach that satisfies the external (customer) requirements and internal (technology) requirements.
- Design and develop enhancements using an exit-point-based architecture within the loan origination/servicing system.
- Design and implement custom business applications using the Oracle Application Development Framework.
- Produce, present, and validate AS-IS and TO-BE landscape models for technical approval, showing the migration path.
- Produce and validate solution design documentation for projects, ensuring a consistent quality approach is maintained.
- Provide input into the Technical Environment Plans (TEP) for projects.
- Work directly with business users in all aspects of design and development.
- Produce code that meets quality, coding, and performance standards according to local IT guidelines and policies, including security, auditing, and SOX requirements.
- Ensure technical and supporting documentation is written to company standards.
- Undertake unit testing, making full use of the automation tools available, and work with the system test team to ensure tests are integrated as part of the overall test plan.
Posted 1 month ago
5.0 - 10.0 years
9 - 14 Lacs
Gurugram
Work from Office
Overview

We are PepsiCo. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with Purpose. For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

The Data Science Team works on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Machine Learning Services and Pipelines.

PepsiCo Data Analytics & AI Overview: With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization that Data Scientists and ML Engineers report to within the broader D+A organization. DS will also lead, facilitate, and collaborate with the larger DS community in PepsiCo, provide the talent for the development and support of DS components and their life cycle within DA&AI products, and support pre-engagement activities as requested and validated by the prioritization framework of DA&AI.

Data Scientist - Hyderabad and Gurugram: You will be part of a collaborative interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users. This will provide you the correct visibility and understanding of the criticality of your developments.

Responsibilities
- Delivery of key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope.
- Active contributor to code and development in projects and services.
- Partner with data engineers to ensure data access for discovery and that proper data is prepared for model consumption.
- Partner with ML engineers working on industrialization.
- Communicate with business stakeholders in the process of service design, training, and knowledge transfer.
- Support large-scale experimentation and build data-driven models.
- Refine requirements into modelling problems.
- Influence product teams through data-based recommendations.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create reusable packages or libraries.
- Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards.
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time).
- Implement the end-to-end ML lifecycle with Azure Machine Learning and Azure Pipelines.
- Automate ML model deployments.

Qualifications
- BE/B.Tech in Computer Science, Maths, or related technical fields.
- Overall 5+ years of experience working as a Data Scientist.
- 4+ years' experience building solutions in the commercial or supply chain space.
- 4+ years working in a team to deliver production-level analytic solutions.
- Fluent in Git (version control); understanding of Jenkins and Docker is a plus.
- Fluent in SQL syntax.
- 4+ years' experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
- 4+ years' experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development.

Skills, Abilities, Knowledge
- Data Science: Hands-on experience and strong knowledge of building supervised and unsupervised machine learning models. Knowledge of time series/demand forecast models is a plus.
- Programming Skills: Hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics: Good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure): Experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics product creation.
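As a small illustration of the supervised-learning work described above, here is a hedged scikit-learn sketch. The synthetic dataset stands in for prepared commercial or supply-chain features; nothing here reflects actual PepsiCo models or data.

```python
# Hedged sketch: a minimal supervised classification workflow.
# Synthetic data stands in for real, prepared model-consumption data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(f"Holdout F1: {f1_score(y_test, model.predict(X_test)):.3f}")
```

In the MLOps setup this role describes, a run like this would be tracked and deployed through Azure Machine Learning pipelines rather than executed by hand.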
Posted 1 month ago
5.0 - 10.0 years
19 - 25 Lacs
Hyderabad
Work from Office
Overview

Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
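As one concrete example of the data observability theme above, a DataOps pipeline might end with a simple quality gate so that the orchestrator marks the run failed when checks break. This is a hedged sketch; the table name and thresholds are hypothetical placeholders.

```python
# Hedged sketch: a post-ingestion data quality gate in PySpark.
# Table name and thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.table("curated.orders")  # assumed Delta table

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Fail fast so the orchestrator (ADF/Databricks Jobs) flags the run
if null_ids > 0 or dupes / max(total, 1) > 0.01:
    raise ValueError(
        f"DQ gate failed: {null_ids} null ids, {dupes} duplicates of {total}")

print(f"DQ gate passed: {total} rows checked")
```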
Posted 1 month ago
6.0 - 8.0 years
9 - 13 Lacs
Chennai
Work from Office
Job Title: Data Engineering Lead
Experience: 6-8 Years
Location: Chennai

Skills:
- ADF (Azure Data Factory)
- Azure Databricks
- Azure Synapse
- Strong ETL experience
- Power BI
Posted 1 month ago
6.0 - 9.0 years
8 - 11 Lacs
Noida, India
Work from Office
Job Summary: We are seeking a detail-oriented and analytical Business Analyst to join our team. The Business Analyst will be responsible for interacting with business users, identifying business needs, and analyzing and documenting requirements. The ideal candidate has strong problem-solving skills, excellent communication abilities, and a deep understanding of business processes and technology.

Key Responsibilities:
- Work with stakeholders to gather, analyze, and document business requirements.
- Translate business needs into functional specifications for IT teams or external vendors.
- Analyze business processes and identify opportunities for improvement or automation.
- Conduct market, competitor, and trend analysis to support strategic decisions.
- Create detailed reports and presentations to communicate findings and recommendations.
- Facilitate workshops, meetings, and presentations to stakeholders.
- Assist in project planning, monitoring, and execution to ensure alignment with business goals.
- Collaborate with cross-functional teams including IT, finance, operations, and marketing.
- Support the testing and implementation of new systems or enhancements.

Qualifications:

Education & Experience:
- Bachelor's degree in Business Administration, Information Systems, Finance, or a related field.
- 6-9 years of experience as a Business Analyst or in a similar role.
- Experience with healthcare systems and integration platforms.

Skills:
- Strong analytical and critical thinking skills.
- Proficiency in Microsoft Office (Excel, Word, PowerPoint); experience with tools like SQL, Postman, and Azure is a plus.
- Familiarity with Agile, Scrum, or Waterfall methodologies.
- Excellent written and verbal communication skills.
- Ability to work independently and in a team environment.
- Strong organizational skills and attention to detail.

Preferred Qualifications:
- Experience in the healthcare domain.

Mandatory Competencies: BA - Requirement Gathering; BA - SQL; Agile - SCRUM; Behavioural - Communication and collaboration; Enterprise Applications - ERP - Microsoft Dynamics 365; Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight.
Posted 1 month ago
9.0 - 13.0 years
9 - 13 Lacs
Bengaluru, Bangalaore
Work from Office
- Experience in Microsoft SQL Server database development (T-SQL).
- Experience in building SSIS packages.
- Good experience in creating LLDs.
- Experience delivering solutions utilizing the entire Microsoft BI stack (SSAS, SSIS).
- Experience with SQL Server/T-SQL programming in the creation and optimization of stored procedures, triggers, and user-defined functions.
- Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts.
- Must be able to build Business Intelligence solutions in a collaborative, agile development environment.
- Strong understanding of data ingestion, data processing, orchestration, parallelization, transformation, and ETL fundamentals.
- Sound knowledge of data analysis using any SQL tools.
- Experience in ADF, Synapse, and other Azure components.
- Design, develop, automate, and support complex applications to extract, transform, and load data.
- Should have knowledge of error handling and performance tuning for data pipelines.

Skills: SQL, SSIS, ADF, T-SQL, ETL & DW, good communication.
Qualifications: Graduate.
Additional Information: Work from office; no cell phone policy.
Posted 1 month ago
5.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Need a self-driven and politically aware individual with the ability to drive the ask and navigate difficult and noncommittal members, strong analytical skills, and strength in customer engagement and negotiation. The person should have a strong understanding of MS D365, experience in sourcing data from MS D365 for BI reporting, experience in using Synapse Link for data export from D365, and experience in building data warehousing solutions using ADF, MS Azure, and SQL cloud databases, with strong knowledge of various BI architecture principles. Should also have experience in ERP evaluation.

Needs Assessment:
- Collaborating with stakeholders across different departments to identify business requirements, pain points, and desired improvements.

Solution Research:
- Exploring various ERP systems and vendors, understanding their strengths and weaknesses, and assessing their suitability for the organization.

Evaluation Framework:
- Developing a structured approach to evaluate ERP systems based on factors like functionality, cost, implementation complexity, vendor reputation, and long-term scalability.

Vendor Management:
- Engaging with ERP vendors, conducting demos, and negotiating contracts.

Recommendation & Implementation:
- Providing a detailed analysis and recommendation of the best-fit ERP system, and potentially assisting with the initial stages of implementation.

Example of ERP Evaluation Process:
1. Define Requirements: Gather input from all relevant departments to map out business processes and identify areas for improvement.
2. Research and Shortlisting: Explore potential ERP systems based on the defined requirements and industry best practices.
3. RFP Creation: Develop a Request for Proposal (RFP) to solicit detailed information from shortlisted vendors.
4. Vendor Demos and Presentations: Invite vendors to demonstrate their solutions and answer specific questions.
5. Detailed Analysis and Comparison: Compare the shortlisted systems based on functionality, cost, implementation, and other critical factors.
6. Final Selection and Contract Negotiation: Choose the most suitable ERP system and negotiate a favourable contract with the vendor.
7. Implementation Planning: Collaborate with the vendor and internal teams to develop a comprehensive implementation plan.

Mandatory Skills: Microsoft Dynamics 365 Guides.
Experience: 5-8 Years.
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Python (Programming Language), Spark AR Studio
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for delivering high-quality code while adhering to best practices and standards in software development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Python (Programming Language), Spark AR Studio.
- Strong understanding of data analytics and data engineering principles.
- Experience with cloud-based data solutions and architectures.
- Familiarity with agile development methodologies.
- Must have: Databricks, Python, Spark, ADF.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
About The Role

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Conduct code reviews and ensure code quality standards are met.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Data Services, ADF, ADB, PySpark.
- Strong understanding of cloud computing principles.
- Experience with Azure DevOps for continuous integration and deployment.
- Knowledge of Azure SQL Database and Azure Cosmos DB.
- Hands-on experience with Azure Functions and Logic Apps.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Pune office (Kharadi); 3 days WFO mandatory.
- A 15 years full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the highest standards of quality and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language).
- Strong understanding of application development frameworks.
- Experience with database management and integration.
- Familiarity with cloud computing platforms and services.
- Ability to write clean, maintainable, and efficient code.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 month ago
5.0 - 8.0 years
5 - 10 Lacs
Kolkata
Work from Office
About The Role

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation / 12th / PUC / HSC
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or managed service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for?
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
- Expertise in ADF with a deep understanding of its data-related libraries.
- Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
- Proficiency in SQL and experience with SQL database design.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience with ADF pipelines.
- Excellent problem-solving and troubleshooting skills.
- Experience in code review and debugging in a collaborative project setting.
- Excellent verbal and written communication skills.
- Ability to work in a fast-paced, team-oriented environment.
- Strong understanding of the business and a passion for the mission of Service Supply Chain.
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities: Innovate. Collaborate. Build. Create. Solve.
- Support ADF and associated systems, ensuring they meet business requirements and industry practices.
- Integrate new data management technologies and software engineering tools into existing structures.
- Recommend ways to improve data reliability, efficiency, and quality.
- Use large data sets to address business issues.
- Use data to discover tasks that can be automated.
- Fix bugs to ensure a robust and sustainable codebase.
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
- Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
- Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
- Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems.
- Flexible working hours to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises.
- Participate in the Demand Management and Change Management processes.
- Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Qualification: Any Graduation / 12th / PUC / HSC
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Conduct technical analyses of existing data pipelines, ETL processes, and on-premises/cloud systems; identify technical bottlenecks, evaluate migration complexities, and propose optimizations.

Desired Skills and Experience:
- Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
- 7+ years of experience in data and cloud architecture working with client stakeholders
- Strong experience in Synapse Analytics, Databricks, ADF, Azure SQL (DW/DB), SSIS
- Strong experience in advanced PowerShell, batch scripting, C# (.NET 3.0)
- Expertise in orchestration systems with ActiveBatch and Azure orchestration tools
- Strong understanding of data warehousing, data lakes, and lakehouse concepts
- Excellent communication skills, both written and verbal
- Extremely strong organizational and analytical skills with strong attention to detail
- Strong track record of excellent results delivered to internal and external clients
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience with delivering projects within an agile environment
- Experience in project management and team management

Key responsibilities include:
- Understand and review PowerShell (PS), SSIS, batch script, and C# (.NET 3.0) codebases for data processes.
- Assess the complexity of trigger migration across ActiveBatch (AB), Synapse, ADF, and Azure Databricks (ADB).
- Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial.
- Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones).
- Understand requirements for external tables (lakehouse).
- Lead project deliverables, ensuring actionable and strategic outputs.
- Evaluate and ensure quality of deliverables within project timelines.
- Develop a strong understanding of equity market domain knowledge.
- Collaborate with domain experts and business stakeholders to understand business rules/logic.
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders.
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
- Take responsibility for end-to-end delivery of projects, coordination between the client and internal offshore teams, and management of client queries.
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).
Posted 1 month ago