
977 ADF Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: DataOps Engineer. Location: Hyderabad. Experience: 5-10 years. Required Skills: Bachelor's degree in Computer Science, Information Systems, or a related field. 5-10 years of experience in Data Engineering or DataOps roles. Strong hands-on experience with ADF, ADLS, Snowflake, and Azure DevOps or similar CI/CD platforms. Proficient in SQL and scripting languages such as Python. Solid understanding of ETL/ELT concepts and data warehousing. Experience with source control (e.g., Git) and infrastructure as code (e.g., ARM templates, Terraform). Knowledge of data security best practices in cloud environments.
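To make the ELT stack this role describes concrete, here is a minimal sketch of a warehouse-side load step in Python with the snowflake-connector-python package. The account, stage, and table names are illustrative assumptions, not details from the posting.

```python
# Minimal ELT load sketch: stage raw files landed in ADLS (exposed through
# an external stage), then transform in-warehouse. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # assumed account locator
    user="etl_user",
    password="***",         # prefer key-pair auth or a secrets store
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Load raw files from the assumed external stage over ADLS.
    cur.execute("COPY INTO staging.orders_raw FROM @adls_stage/orders/")
    # Transform in-warehouse -- the ELT half of the ETL/ELT skills listed.
    cur.execute("""
        INSERT INTO analytics.orders_clean
        SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date
        FROM staging.orders_raw
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```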

Posted 5 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Description: Bachelor's degree in Computer Science, Software Engineering, or a related field. 5+ years of experience as a Full Stack Developer. Strong proficiency in ReactJS, Next.js, C#, .NET, and Azure cloud services. Understanding of AngularJS and a willingness to learn new technologies. Experience with Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door. Excellent problem-solving, analytical, and debugging skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Roles & Responsibilities: Development: Contribute to the development of robust and scalable applications using our primary tech stack, including ReactJS, Next.js, C#, .NET, Azure Functions, and Azure CosmosDB. Cross-Platform Compatibility: Demonstrate a solid understanding of AngularJS and be open to learning newer technologies as needed. Cloud Expertise: Possess a deep understanding of Azure cloud services, including Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door. Problem-Solving: Identify and resolve technical challenges effectively, leveraging your problem-solving skills and expertise. Collaboration: Work collaboratively with cross-functional teams to deliver high-quality solutions that meet business objectives.
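This posting's stack is C#/.NET; for consistency with the other examples on this page, here is a minimal sketch of the same serverless pattern as an HTTP-triggered Azure Function in the Python v2 programming model. The route and payload are invented for illustration.

```python
# Minimal HTTP-triggered Azure Function (Python programming model v2).
import azure.functions as func

app = func.FunctionApp()

@app.route(route="orders/{order_id}", auth_level=func.AuthLevel.FUNCTION)
def get_order(req: func.HttpRequest) -> func.HttpResponse:
    order_id = req.route_params.get("order_id")
    # A real handler would query Cosmos DB (via an input binding or the
    # azure-cosmos SDK); a stub response keeps the sketch self-contained.
    return func.HttpResponse(
        f'{{"orderId": "{order_id}"}}', mimetype="application/json")
```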

Posted 5 days ago

Apply

8.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

TCS Hiring for Azure FinOps. Experience: 8 to 10 years only. Job Location: Kolkata/Pune. Required Technical Skill Set: As a Cloud FinOps consultant, you are responsible for developing and implementing a robust program for cloud cost management that includes a service-based cost allocation and classification strategy, tracking and management of cloud cost, cloud services rate setting, and defining consumption and showback/chargeback reports from both the provider and consumer perspective. Experienced Cloud FinOps practitioner who can work mainly on Azure, along with AWS platforms, for cloud optimization and cost savings (certification as a FinOps Practitioner is an added advantage). Create a cloud cost optimization framework and governance mechanism. Takes ownership of cost analysis, reviewing recommendations, creating budget alerts, purchasing reservations/savings plans, tagging, anomaly detection, and forecasting spend. Experienced in rightsizing of compute services, identifying unused resources, and AHUB implementation. Define and set up the cloud spend governance process and cloud spend tracking mechanism. Experience in driving deep architectural discussions to help customers ensure they are making the most cost-efficient cloud usage choices. Interacting with vendors on Enterprise Agreements, MCA, discounts, and various other optimization opportunities. Creating dashboards for visualizing monthly and yearly cost with various filters. Drive close working relationships with IT teams, finance teams, architects, and operations. Knowledge of IaaS and PaaS services and cloud technologies, for example Databricks, AKS, ADF, Log Analytics, load balancing, and disaster recovery. Knowledge of app and data architectures and cloud-native patterns for development. Kind Regards, Priyankha M
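As a small illustration of the rightsizing analysis this role describes, here is a hedged sketch run over a cost/usage export (e.g., an Azure Cost Management CSV). The column names and thresholds are assumptions; adjust them to your export schema.

```python
# Flag near-idle, materially expensive resources as rightsizing candidates.
import pandas as pd

usage = pd.read_csv("cost_export.csv")  # assumed columns used below
candidates = usage[
    (usage["AvgCpuPercent"] < 5.0)      # near-idle compute
    & (usage["MonthlyCost"] > 100.0)    # material spend
]
print(candidates.sort_values("MonthlyCost", ascending=False)
                .head(20)[["ResourceId", "MonthlyCost", "AvgCpuPercent"]])
```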

Posted 5 days ago

Apply

0 years

25 Lacs

India

On-site

Source: Glassdoor

Work Location: Kochi/Trivandrum. Experience: 10-15+ years.
▪ Expertise in Azure services including App Services, Functions, DevOps pipelines and ADF
▪ Expert-level knowledge of MuleSoft Anypoint Platform, API lifecycle management, and enterprise integration
▪ Proven skills in Java-based web applications with RDBMS or NoSQL backends
▪ Proven skills in Python and JavaScript for backend and full-stack solutions
▪ Deep understanding of object-oriented programming and design patterns
▪ Solid hands-on experience with RESTful API development and microservices architecture
▪ Familiarity with unit testing frameworks (e.g., TestNG) and integration testing in Java
▪ Experience with code review processes, test coverage validation, and CI/CD pipelines
▪ Proficiency in Git, SVN, and other version control systems
▪ Comfortable working with static code analysis tools and agile methodologies
▪ Good knowledge of JIRA, Confluence, and project collaboration tools
▪ Strong communication skills and ability to mentor team members
▪ Ability to prepare detailed technical documentation and design specifications
▪ Passion for clean code, best practices, and scalable architecture
▪ Nice to have: experience with identity providers like Auth0, Keycloak, IdentityServer
▪ Take ownership of tasks and user stories; provide accurate estimations
▪ Provide technically sound advice/decisions on how to develop functionality to implement the client's business logic in software systems
▪ Lead sprint activities, including task management and code reviews
▪ Design and implement secure, scalable, and maintainable solutions
▪ Conduct technical discovery and performance analysis of critical systems
▪ Write low-level design and as-built documents
▪ Translate business requirements into technical specifications and working code
▪ Develop and maintain unit, integration, and regression test cases
▪ Ensure adherence to TDD and promote high code coverage
▪ Integrate multiple data sources and systems with dissimilar structures
▪ Contribute to overall system design with a focus on security and resilience
▪ Use static code analysis tools and set up CI/CD pipelines
▪ Participate in Agile ceremonies: grooming, planning, stand-ups, and retrospectives
▪ Collaborate with technical architects on solution design
▪ Mentor junior team members on coding standards, sprint tasks, and technology
▪ Troubleshoot, test, and optimize core software and databases
▪ Stay updated with industry trends and promote best practices within the team
▪ Identify challenges, initiate PoCs, and perform feasibility studies
Participate in the full product development cycle, including brainstorming, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support. Highly effective and thriving in a dynamic environment. Comfortable with proactive outward communication and technical leadership, and positive about accepting challenges. Adhere to ISMS policies and procedures. Job Type: Full-time. Pay: From ₹2,500,000.00 per year. Benefits: Health insurance, Provident Fund. Schedule: Day shift. Work Location: In person

Posted 5 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Greetings from TCS!!! TCS is hiring for Snowflake Tech Architect / Tech Lead. Experience: 10+ years. Location: Chennai/Bangalore/Mumbai. Required Technical Skill Set: 10 years of total experience and at least 3+ years of expertise in cloud data warehouse technologies on Snowflake and AWS, Azure, or GCP. At least one end-to-end Snowflake implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage). Significant experience with data migrations and the design and development of Operational Data Stores, Enterprise Data Warehouses, and Data Marts. Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement. Experience with cloud ETL and ELT in one of the tools like DBT, Glue, ADF, or Matillion, or any other ELT tool, and exposure to the big data ecosystem (Hadoop). Expertise with at least one of the traditional data warehouse solutions on Oracle, Teradata, or Microsoft SQL Server. Excellent communication skills to liaise with business and IT stakeholders. Expertise in planning project execution and effort estimation. Understanding of Data Vault, data mesh, and data fabric architecture patterns. Exposure to Agile ways of working. Must-Have: Experience with cloud services like S3/Blob/GCS, Lambda, Glue/ADF, and Apache Airflow. Experience or understanding of the Banking and Financial Services business domain. Good-to-Have: Experience in coding languages like Python and PySpark would be an added advantage. Experience in DevOps, CI/CD, and GitHub is a big plus. Responsibility of / Expectations from the Role: 1. Provide technical pre-sales enablement on data-on-cloud aspects covering data architecture, data engineering, data modelling, data consumption, and data governance, focusing on Snowflake. 2. Expert-level knowledge of Snowflake data engineering, performance, consumption, security, governance, and admin aspects. 3. Work with cross-functional teams in an onsite/offshore setup and discuss and solve technical problems with various stakeholders, including customer teams. 4. Create technical proposals and respond to large-scale RFPs. 5. Discuss existing solutions, design/optimize solutions, and prepare execution planning for development, deployment, and enabling end users to utilize the data platform. 6. The role demands excellent oral and written communication skills to organize workshops and meetings with account teams, account leadership, and senior client stakeholders, including CXO level. 7. Adept at creating POVs and conducting PoCs. 8. Liaise with technology partners like Snowflake, Matillion, DBT, etc.

Posted 5 days ago

Apply

45.0 - 50.0 years

0 Lacs

India

On-site

Source: Glassdoor

Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel. Age: 45-50 years. Location: Mumbai. Qualification and Experience: Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field. Experience with BI tools such as Tableau, Power BI, or similar is an added advantage. Minimum 10–15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank. Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential. Key Responsibilities: 1. MIS Strategy & Planning: Develop and implement an effective MIS framework to support strategic and operational objectives. Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow. 2. Data Collection, Processing & Reporting: Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments. Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators. Prepare and submit statutory and compliance reports to RBI, NABARD, State Registrar, etc. 3. Regulatory & Compliance Reporting: Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF) are accurate and timely. Track regulatory changes and incorporate them into reporting frameworks. 4. Performance & Operational Dashboards: Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, and branch performance. Provide analytics support to business heads for performance analysis and forecasting. 5. Data Governance & Quality: Maintain high standards of data integrity, consistency, and security across systems. Conduct regular audits and validations of MIS data to identify and correct discrepancies. 6. System Enhancement & Automation: Liaise with the IT department and software vendors to automate recurring reports. Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions. 7. Support to Management: Assist senior management with ad-hoc analysis, strategic reviews, and Board-level presentations. Provide MIS support for product planning, regulatory inspections, audits, and business strategy. Job Type: Full-time. Schedule: Day shift. Ability to commute/relocate: Masjid, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred). Education: Bachelor's (Preferred). Experience: Management Information Systems: 10 years (Preferred). Work Location: In person

Posted 5 days ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

TCS Hiring for Azure FinOps. Experience: 8 to 10 years only. Job Location: Kolkata/Pune. Required Technical Skill Set: As a Cloud FinOps consultant, you are responsible for developing and implementing a robust program for cloud cost management that includes a service-based cost allocation and classification strategy, tracking and management of cloud cost, cloud services rate setting, and defining consumption and showback/chargeback reports from both the provider and consumer perspective. Experienced Cloud FinOps practitioner who can work mainly on Azure, along with AWS platforms, for cloud optimization and cost savings (certification as a FinOps Practitioner is an added advantage). Create a cloud cost optimization framework and governance mechanism. Takes ownership of cost analysis, reviewing recommendations, creating budget alerts, purchasing reservations/savings plans, tagging, anomaly detection, and forecasting spend. Experienced in rightsizing of compute services, identifying unused resources, and AHUB implementation. Define and set up the cloud spend governance process and cloud spend tracking mechanism. Experience in driving deep architectural discussions to help customers ensure they are making the most cost-efficient cloud usage choices. Interacting with vendors on Enterprise Agreements, MCA, discounts, and various other optimization opportunities. Creating dashboards for visualizing monthly and yearly cost with various filters. Drive close working relationships with IT teams, finance teams, architects, and operations. Knowledge of IaaS and PaaS services and cloud technologies, for example Databricks, AKS, ADF, Log Analytics, load balancing, and disaster recovery. Knowledge of app and data architectures and cloud-native patterns for development. Kind Regards, Priyankha M

Posted 5 days ago

Apply

5.0 - 9.0 years

3 - 9 Lacs

Chennai

On-site

Source: Glassdoor

Technical Lead | Chennai | 5-9 Years | INDIA. Job Family: Practice (Digital). Job Description (Posting): To be responsible for managing technology in projects and providing technical guidance/solutions for work completion. (1.) To develop and guide the team members in enhancing their technical capabilities and increasing productivity. (2.) To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations. (3.) To be responsible for providing technical guidance/solutions; define, advocate, and implement best practices and coding standards for the team. (4.) To ensure process compliance in the assigned module and participate in technical discussions/reviews as a technical consultant for feasibility studies (technical alternatives, best packages, supporting architecture best practices, technical risks, breakdown into components, estimations). Qualification: B-Tech. No. of Positions: 1. Skill (Primary): Oracle (APPS) - Oracle E-Business Suite Technical - ADF. Auto req ID: 1541137BR. Skill Level: 3. (Secondary Skill 1): Technical Skills (APPS) - Datawarehouse - Extract Transform Load (ETL) Automation

Posted 5 days ago

Apply

5.0 years

15 - 24 Lacs

Bengaluru

On-site

Source: Glassdoor

Job Title: Senior Data Engineer – Azure | ADF | Databricks | PySpark | AWS. Location: Bangalore, Hyderabad, Chennai (Hybrid Mode). Experience Required: 5+ years. Notice Period: Immediate. Job Description: We are looking for a Senior Data Engineer who is passionate about designing and developing scalable data pipelines, optimizing data architecture, and working with advanced big data tools and cloud platforms. This is a great opportunity to be a key player in transforming data into meaningful insights by leveraging modern data engineering practices on Azure, AWS, and Databricks. You will be working with cross-functional teams including data scientists, analysts, and software engineers to deliver robust data solutions. The ideal candidate will be technically strong in Azure Data Factory, PySpark, Databricks, and AWS services, and will have experience in building end-to-end ETL workflows and driving business impact through data. Key Responsibilities: Design, build, and maintain scalable and reliable data pipelines and ETL workflows. Implement data ingestion and transformation using Azure Data Factory (ADF) and Azure Databricks (PySpark). Work across multiple data platforms including Azure, AWS, Snowflake, and Redshift. Collaborate with data scientists and business teams to understand data needs and deliver solutions. Optimize data storage, processing, and retrieval for performance and cost-effectiveness. Develop data quality checks and monitoring frameworks for pipeline health. Ensure data governance, security, and compliance with industry standards. Lead code reviews, set data engineering standards, and mentor junior team members. Propose and evaluate new tools and technologies for continuous improvement. Must-Have Skills: Strong programming skills in Python, SQL, or Scala. Azure Data Factory, Azure Databricks, Synapse Analytics. Hands-on with PySpark, Spark, Hadoop, Hive. Experience with cloud platforms (Azure preferred; AWS/GCP acceptable). Data warehousing: Snowflake, Redshift, BigQuery. Strong ETL/ELT pipeline development experience. Workflow orchestration tools such as Airflow, Prefect, or Luigi. Excellent problem-solving, debugging, and communication skills. Nice to Have: Experience with real-time streaming tools (Kafka, Flink, Spark Streaming). Exposure to data governance tools and regulations (GDPR, HIPAA). Familiarity with ML model integration into data pipelines. Containerization and CI/CD exposure: Docker, Git, Kubernetes (basic). Experience with vector databases and unstructured data handling. Technical Environment: Programming: Python, Scala, SQL. Big Data Tools: Spark, Hadoop, Hive. Cloud Platforms: Azure (ADF, Databricks, Synapse), AWS (S3, Glue, Lambda), GCP. Data Warehousing: Snowflake, Redshift, BigQuery. Databases: PostgreSQL, MySQL, MongoDB, Cassandra. Orchestration: Apache Airflow, Prefect, Luigi. Tools: Git, Docker, Azure DevOps, CI/CD pipelines. Soft Skills: Strong analytical thinking and problem-solving abilities. Excellent verbal and written communication. Collaborative team player with leadership qualities. Self-motivated, organized, and able to manage multiple projects. Education & Certifications: Bachelor's or Master's Degree in Computer Science, IT, Engineering, or equivalent. Cloud certifications (e.g., Microsoft Azure Data Engineer, AWS Big Data) are a plus. Key Result Areas (KRAs): Timely delivery of high-performance data pipelines. Quality of data integration and governance compliance. Business team satisfaction and data readiness. Proactive optimization of data processing workloads. Key Performance Indicators (KPIs): Pipeline uptime and performance metrics. Reduction in overall data latency. Zero critical issues in production post-release. Stakeholder satisfaction score. Number of successful integrations and migrations. Job Types: Full-time, Permanent. Pay: ₹1,559,694.89 - ₹2,441,151.11 per year. Benefits: Provident Fund. Schedule: Day shift. Supplemental Pay: Performance bonus. Application Question(s): What is your notice period in days? Experience: Azure Data Factory, Azure Databricks, Synapse Analytics: 5 years (Required). Python, SQL, or Scala: 5 years (Required). Work Location: In person
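To make the ingest-transform-write pattern this role centers on concrete, here is a minimal PySpark sketch: read raw files from ADLS Gen2, apply a transformation, and write Delta. The paths and column names are illustrative assumptions.

```python
# Minimal ADLS -> transform -> Delta pipeline step in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Read raw JSON landed in the (assumed) "raw" container of a data lake.
raw = spark.read.json("abfss://raw@mydatalake.dfs.core.windows.net/orders/")

# Basic cleansing: typed dates, drop null keys, deduplicate on the key.
clean = (raw
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("order_id").isNotNull())
         .dropDuplicates(["order_id"]))

# Persist to the curated zone in Delta format.
(clean.write.format("delta")
      .mode("overwrite")
      .save("abfss://curated@mydatalake.dfs.core.windows.net/orders/"))
```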

Posted 5 days ago

Apply

7.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Source: Naukri

TECHNICAL SKILLS AND EXPERIENCE. Most important: 7+ years of professional experience as a data engineer, with at least 4 utilizing cloud technologies. Proven experience building ETL or ELT data pipelines with Databricks, either in Azure or AWS, using PySpark. Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Gen2, ADF, etc.). Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8. Proven experience with unit testing and applying appropriate testing methodologies using libraries such as Pytest, Great Expectations, or similar. Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, PowerShell, and Bash scripting or similar. Strong understanding of data modeling, data warehousing, and OLAP concepts. Excellent technical documentation skills.
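Since this posting emphasizes unit testing pipelines with Pytest, here is a hedged sketch of testing a transformation function against a tiny in-memory DataFrame. The transform and its schema are assumptions invented for the example.

```python
# Unit-testing a PySpark transform with pytest (runs on a local Spark session).
import pytest
from pyspark.sql import SparkSession, functions as F

def dedupe_orders(df):
    """Assumed transform under test: drop null keys, then deduplicate."""
    return df.filter(F.col("order_id").isNotNull()).dropDuplicates(["order_id"])

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").getOrCreate()

def test_dedupe_orders_drops_nulls_and_duplicates(spark):
    df = spark.createDataFrame(
        [("1", "a"), ("1", "a"), (None, "b")], ["order_id", "item"])
    assert dedupe_orders(df).count() == 1
```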

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Role Description. Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected: Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results. Configuration: Define and govern the configuration management plan. Ensure compliance within the team. Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management: Manage the delivery of modules effectively. Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality. Estimation: Create and provide input for effort and size estimation for projects. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management: Execute and monitor the release process to ensure smooth transitions. Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications: Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments: Tech skills: Proficient in Python (including popular Python packages, e.g., Pandas, NumPy) and SQL. Strong background in distributed data processing and storage (e.g., Apache Spark, Hadoop). Large-scale (TBs of data) data engineering skills: model data and create production-ready ETL pipelines. Development experience with at least one cloud (Azure highly preferred; AWS, GCP). Knowledge of data lake and data lakehouse patterns. Knowledge of ETL performance tuning and cost optimization. Knowledge of data structures and algorithms and good software engineering practices. Soft skills: Strong communication skills to articulate complex situations concisely. Comfortable with picking up new technologies independently. Eye for detail, good data intuition, and a passion for data quality. Comfortable working in a rapidly changing environment with ambiguous requirements. Skills: Python, SQL, AWS, Azure
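This role pairs Python (Pandas/NumPy) with SQL windowing functions; as a small illustration, here is the pandas analogue of a running-total window. The frame and column names are invented for the example.

```python
# Pandas equivalent of SUM(revenue) OVER (PARTITION BY region ORDER BY month).
import pandas as pd

sales = pd.DataFrame({
    "region": ["east", "east", "west", "west"],
    "month": [1, 2, 1, 2],
    "revenue": [100.0, 120.0, 90.0, 95.0],
})
# Running total per region, ordered by month.
sales["running_revenue"] = (
    sales.sort_values("month").groupby("region")["revenue"].cumsum()
)
print(sales)
```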

Posted 5 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Location: Hyderabad, Indore and Ahmedabad (India). What You Will Do (high-level responsibilities, including but not limited to the following):
· Develop and maintain data pipelines using Azure Data Factory (ADF)/Databricks for data integration and ETL processes.
· Design, implement, and optimize Power BI/Fabric reports and dashboards to deliver actionable business insights.
· Collaborate with data analysts, business users, and other teams to understand data requirements and deliver solutions using ADF and Power BI.
· Extract, transform, and load (ETL) data from various sources into cloud-based storage systems such as Azure Data Lake or Azure SQL Database.
· Work with large datasets and optimize queries and pipelines for performance and scalability.
· Ensure data quality, integrity, and availability throughout the data lifecycle (see the sketch after this listing).
· Automate repetitive data tasks, ensuring timely and accurate reporting.
· Monitor and troubleshoot data pipelines, addressing any performance or data issues promptly.
· Support data visualization and reporting tools, including Power BI, to enable business stakeholders to make data-driven decisions.
· Write clear, efficient, and maintainable code for data transformations and automation.
Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
· 8+ years of hands-on experience as a Data Engineer, BI Developer, or in a similar role.
· Proficiency with Azure Data Factory (ADF), including the creation of data pipelines and managing data flows.
· Strong experience in Power BI, including report creation, dashboard development, and data modeling.
· Experience with SQL and database management (e.g., Azure SQL Database, SQL Server).
· Knowledge of cloud platforms, especially Microsoft Azure.
· Familiarity with data warehousing concepts and ETL processes.
· Experience working with cloud-based data storage solutions (e.g., Azure Data Lake, Azure Blob Storage).
· Strong programming skills in languages such as Python and SQL.
· Ability to troubleshoot and optimize data pipelines for performance and reliability.
Preferred Qualifications:
· Familiarity with data modeling techniques and practices for Power BI.
· Knowledge of Azure Databricks or other data processing frameworks.
· Knowledge of Microsoft Fabric or other cloud platforms.
What we need: B.Tech in computer science or equivalent. Why join us? Work with a passionate and innovative team in a fast-paced, growth-oriented environment. Gain hands-on experience with exposure to real-world projects. Opportunity to learn from experienced professionals. Contribute to exciting initiatives and make an impact from day one. Competitive pay and potential for growth within the company. Recognized for excellence in data and AI solutions with industry awards and accolades.
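As referenced above, here is a hedged sketch of a simple data-quality gate that a pipeline step could call before publishing curated data. The thresholds, columns, and file path are illustrative assumptions.

```python
# A minimal data-quality gate for a curated dataset.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> None:
    if df.empty:
        raise ValueError("no rows loaded")
    null_rate = df["customer_id"].isna().mean()
    if null_rate >= 0.01:                       # assumed 1% tolerance
        raise ValueError(f"customer_id null rate too high: {null_rate:.2%}")
    if pd.to_datetime(df["order_date"]).max() > pd.Timestamp.today():
        raise ValueError("future-dated rows detected")

validate_orders(pd.read_parquet("curated/orders.parquet"))  # illustrative path
```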

Posted 5 days ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

Role: Azure Integration Engineer. Experience: 5 to 8 years. Location: Indore. Immediate joiners preferred. Must Have: Proficiency in Azure Logic Apps, Azure API Management, Azure Service Bus, Azure Event Grid, ADF, C#/.NET, and Azure Functions. Experience with JSON, XML, and other data formats. Working experience with Azure DevOps and GitHub. Knowledge of integration monitoring and lifecycle management. Roles & Responsibilities: Designing, developing, and deploying integration workflows using Azure Logic Apps. Creating and managing APIs using Azure API Management. Developing event-driven solutions with Azure Event Grid and Azure Service Bus. Building serverless functions with Azure Functions to support integration logic. Developing data transformations and mappings. Implementing integration patterns such as API integration, message queuing, and event-driven architecture. Working with different data formats (e.g., JSON, XML) and protocols (SOAP, REST, etc.). Performing unit testing and helping with integration testing. Supporting UAT.
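To make the event-driven side of this role concrete, here is a hedged sketch of publishing a message to an Azure Service Bus queue with the azure-servicebus SDK. The connection string and queue name are placeholders.

```python
# Publish an event to a Service Bus queue (message-queuing pattern).
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = "Endpoint=sb://..."  # placeholder; read from Key Vault in practice
with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_queue_sender(queue_name="orders") as sender:
        event = {"orderId": "42", "status": "created"}
        sender.send_messages(ServiceBusMessage(json.dumps(event)))
```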

Posted 5 days ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Greetings from TCS!!! TCS is hiring for Azure Data Engineer. Experience: 8-10 years. Location: Kolkata/Pune/Mumbai/Bangalore. Must-Have: Strong experience in Azure Data Factory, ADB (Azure Databricks), Synapse, and PySpark; establishing cloud connectivity between different systems like ADLS, ADF, Synapse, Databricks, etc. A minimum of 7 years' experience with large SQL data marts. Expert relational database experience; the candidate should demonstrate the ability to navigate through massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners. Minimum 7 years of troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL importing of large volumes of data extracted from multiple systems; capacity planning. Experience in T-SQL programming along with the Azure Data Factory framework and Python scripting. Works well independently as well as within a team. Proactive and organized, with excellent analytical and problem-solving skills. Flexible and willing to learn; a can-do attitude is key. Strong verbal and written communication skills. Good-to-Have: Financial institution data mart experience is an asset. Experience in .NET applications is an asset. Experience and expertise in Tableau-driven dashboard design is an asset. Responsibility of / Expectations from the Role: Azure Data Engineer (ADF, ADB). ETL processes using frameworks like Azure Data Factory, Synapse, or Databricks; establishing cloud connectivity between different systems like ADLS, ADF, Synapse, Databricks, etc. T-SQL programming along with the Azure Data Factory framework and Python scripting.
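As one concrete instance of "establishing cloud connectivity between ADLS and Databricks", here is a hedged sketch of service-principal (OAuth) access to ADLS Gen2 from a Databricks notebook, where `spark` and `dbutils` are predefined. The storage account, secret scope, and IDs are placeholders.

```python
# Configure ABFS OAuth access to an (assumed) storage account "mydatalake".
acct = "mydatalake.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{acct}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{acct}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{acct}", "<client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{acct}",
               dbutils.secrets.get("kv-scope", "sp-secret"))  # placeholder scope
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{acct}",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# With connectivity established, read curated data directly from ADLS.
df = spark.read.parquet("abfss://curated@mydatalake.dfs.core.windows.net/sales/")
```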

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Lead Technical Architect (Strategy & Optimization, Data Lake & Analytics)
Responsibilities:
· Manage project delivery: scope, timelines, budget, resource allocation, and risk mitigation.
· Develop and maintain robust data ingestion pipelines (batch, streaming, API).
· Provide architectural inputs during incident escalations and act as the final authority for RCA documentation and closure across ADF, Power BI, and Databricks, including initial data collection for RCA.
· Define and enforce data governance, metadata, and quality standards across zones.
· Monitor performance, optimize data formats (e.g., Parquet), and tune for cost-efficiency. Tune query performance for Databricks and Power BI datasets using optimization techniques (e.g., caching, BI Engine, materialized views).
· Lead and mentor a team of data engineers, fostering skills in Azure services and DevOps. Guide schema designs for new datasets and integrations aligned with Diageo's analytics strategy.
· Coordinate cross-functional stakeholders (security, DevOps, business) for aligned execution.
· Oversee incident and change management with SLA adherence and continuous improvement. Serve as the governance owner for SLA compliance, IAM policies, encryption standards, and data retention strategies.
· Ensure compliance with policies (RBAC, ACLs, encryption) and regulatory audits.
· Report project status, KPIs, and business value to senior leadership. Lead monthly and quarterly reviews, presenting insights, improvements, and roadmap alignment to Diageo stakeholders.
Required Skills:
· Strong architecture-level expertise in the Azure Data Platform (ADLS, ADF, Databricks, Synapse, Power BI).
· Deep understanding of data lake zone structuring, data lineage, metadata governance, and compliance (e.g., GDPR, ISO).
· Expert in Spark, PySpark, SQL, JSON, and automation tooling (ARM, Bicep; Terraform optional).
· Capable of aligning technical designs with business KPIs and change control frameworks.
· Excellent stakeholder communication, team mentoring, and leadership capabilities.

Posted 6 days ago

Apply

5.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: Senior BI Developer (Microsoft BI Stack). Location: Remote. Experience: 5+ years. Employment Type: Full-Time. Job Summary: We are looking for an experienced Senior BI Developer with strong expertise in the Microsoft BI Stack (SSIS, SSRS, SSAS) to join our dynamic team. The ideal candidate will design and develop scalable BI solutions and contribute to strategic decision-making through efficient data modeling, ETL processes, and insightful reporting. Key Responsibilities: Design and develop ETL packages using SSIS for data extraction, transformation, and loading from diverse sources. Create and maintain dashboards and reports using SSRS and Power BI (if applicable). Implement and manage OLAP cubes and data models using SSAS (Multidimensional/Tabular). Develop and optimize complex T-SQL queries, stored procedures, and functions. Work closely with business analysts, data engineers, and stakeholders to gather requirements and translate them into technical solutions. Optimize BI solutions for performance and scalability. Lead BI architecture improvements and ensure efficient data flow. Ensure data quality, integrity, and consistency across systems. Mentor and support junior BI developers as needed. Required Skills & Qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field. 5+ years of hands-on experience with the Microsoft BI Stack: SSIS, SSRS, SSAS. Strong knowledge of SQL Server (2016 or later) and advanced T-SQL. Deep understanding of data warehousing concepts, including star/snowflake schemas and fact/dimension models. Experience with Power BI is a plus. Exposure to Azure Data Services (ADF, Azure SQL, Synapse) is an added advantage. Strong analytical, troubleshooting, and problem-solving skills. Excellent verbal and written communication skills. Why Join Us? Opportunity to work on enterprise-scale BI projects. Supportive work environment with career growth potential. Exposure to modern BI tools and cloud technologies. Skills: Azure, Power BI, SSIS, T-SQL, SSRS, SSAS, SQL Server, Azure Data Services

Posted 6 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Information Technology expert with 5+ years of banking domain knowledge. Banking application development and implementation experience across offshore and onshore models, including but not limited to solution design, development, and support activities. 4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for Lending & Deposit products and Origination Workflow, for online as well as batch solutions and integrations. Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion, and OBP technologies. Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology. 3+ years of experience with automation testing using tools like Selenium. Building framework components and business process patterns. Works closely with the Technical Solution Architect and process Functional Lead. Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices. Key contributions: Solution analysis and redesign of application components to stabilize the OBP platform, mainly for the Oracle BPM, Oracle SOA, and Oracle ADF technology components. Solution design and implementation for application retrofits and migration to new Oracle hardware. Key bug-fixes and solution design and review with Oracle Banking Platform product teams. Automation design in various solution components. Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solutions for the OBP platform. Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements. Multi-environment deployment and testing coordination for live system changes. At least 5+ years of banking domain knowledge. What's in it for you? We are not just a technology company full of people, we're a people company full of technology. It is people like you who make us what we are today. Welcome to our world: Our people, our culture, our voices, and our passions. What's better than building the next big thing? It's doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin. At Infosys, we recognize that everyone has individual requirements. If you are a person with disability, illness or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustment only on Infosys_ta@infosys.com or include your preferred method of communication in email and someone will be in touch. Please note, in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third party vendors. In the absence of a signed agreement any submission will be deemed as non-binding and Infosys explicitly reserves the right to pursue and hire the submitted profile.
All recruitment activity must be coordinated through the Talent Acquisition department.

Posted 6 days ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Candidates with 8+ years of experience in the IT industry and strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Must have (Primary Skills): 8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Skilled in unit testing with XUnit, MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions
• Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring
Nice to have:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps - CI/CD pipelines (Classic / YAML)
Working hours: 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.

Posted 6 days ago

Apply

0.0 - 10.0 years

0 Lacs

Masjid, Mumbai, Maharashtra

On-site

Source: Indeed

Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel. Age: 45-50 years. Location: Mumbai. Qualification and Experience: Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field. Experience with BI tools such as Tableau, Power BI, or similar is an added advantage. Minimum 10–15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank. Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential. Key Responsibilities: 1. MIS Strategy & Planning: Develop and implement an effective MIS framework to support strategic and operational objectives. Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow. 2. Data Collection, Processing & Reporting: Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments. Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators. Prepare and submit statutory and compliance reports to RBI, NABARD, State Registrar, etc. 3. Regulatory & Compliance Reporting: Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF) are accurate and timely. Track regulatory changes and incorporate them into reporting frameworks. 4. Performance & Operational Dashboards: Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, and branch performance. Provide analytics support to business heads for performance analysis and forecasting. 5. Data Governance & Quality: Maintain high standards of data integrity, consistency, and security across systems. Conduct regular audits and validations of MIS data to identify and correct discrepancies. 6. System Enhancement & Automation: Liaise with the IT department and software vendors to automate recurring reports. Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions. 7. Support to Management: Assist senior management with ad-hoc analysis, strategic reviews, and Board-level presentations. Provide MIS support for product planning, regulatory inspections, audits, and business strategy. Job Type: Full-time. Schedule: Day shift. Ability to commute/relocate: Masjid, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred). Education: Bachelor's (Preferred). Experience: Management Information Systems: 10 years (Preferred). Work Location: In person

Posted 6 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Information Technology expert with 5+ years of banking domain knowledge. Banking application development and implementation experience across offshore and onshore models, including but not limited to solution design, development, and support activities. 4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for Lending & Deposit products and Origination Workflow, for online as well as batch solutions and integrations. Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion, and OBP technologies. Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology. 3+ years of experience with automation testing using tools like Selenium. Building framework components and business process patterns. Works closely with the Technical Solution Architect and process Functional Lead. Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices. Key contributions: Solution analysis and redesign of application components to stabilize the OBP platform, mainly for the Oracle BPM, Oracle SOA, and Oracle ADF technology components. Solution design and implementation for application retrofits and migration to new Oracle hardware. Key bug-fixes and solution design and review with Oracle Banking Platform product teams. Automation design in various solution components. Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solutions for the OBP platform. Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements. Multi-environment deployment and testing coordination for live system changes. At least 5+ years of banking domain knowledge. What's in it for you? We are not just a technology company full of people, we're a people company full of technology. It is people like you who make us what we are today. Welcome to our world: Our people, our culture, our voices, and our passions. What's better than building the next big thing? It's doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin. At Infosys, we recognize that everyone has individual requirements. If you are a person with disability, illness or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustment only on Infosys_ta@infosys.com or include your preferred method of communication in email and someone will be in touch. Please note, in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third party vendors. In the absence of a signed agreement any submission will be deemed as non-binding and Infosys explicitly reserves the right to pursue and hire the submitted profile.
All recruitment activity must be coordinated through the Talent Acquisition department.

Posted 6 days ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore and service a client base across banking and financial services, insurance, information technology, healthcare, retail and supply chain. *Singapore, onsite. Looking only for candidates on a short notice period.* Job Description: We are looking for an experienced Oracle IAM Consultant to join our dynamic team. Design and Implementation: Designing and implementing IAM solutions using OIM/OIG. Developing custom connectors to integrate OIM with various applications and systems. Building and configuring OIM workflows, approval policies, and entitlements. Developing custom UI components for OIM self-service pages. Skills and Experience: Experienced in end-to-end integration of IAM solutions using Oracle Identity Governance. Prior experience with requirement gathering, analysis, design, development, maintenance, and upgrades in different environments like DEV, QA, UAT, and PROD. Experience with ICF-based framework connectors to integrate with target applications, perform CRUD operations, and manage roles in the target system. Extended hands-on experience with custom code development such as event handlers, validation plugins, and scheduled tasks using the Java API. Experience with audit reports using OIM BI Publisher, and customizing the logo and header of the UI screens and audit reports. Implement Oracle ADF customizations for user interfaces. Build custom Oracle SOA composites for workflows. Java experience: Best-practice-based secure Java development. Exposure and hands-on experience with REST APIs and web services. Ability to re-use existing code and extend frameworks. Administration and Management: Administering and managing OIM environments. Ensuring the IAM platform is secure, scalable, and supports business requirements. Monitoring the performance and health of IAM systems. Security and Compliance: Developing and enforcing IAM policies and procedures. Collaborating with security teams to address vulnerabilities. Support and Troubleshooting: Supporting end-users with access-related issues and requests. Troubleshooting and resolving technical issues related to the OIM implementation. Good to Have: Hands-on experience with Oracle Access Manager. Good understanding of AS400 and relevant infrastructure. Unix scripting. Strong SQL knowledge. WHAT'S ON OFFER: You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression. To submit your application, please apply online or email your UPDATED CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality. CONSULTANT DETAILS: Consultant Name: Swathi. Avensys Consulting Pte Ltd, EA Licence 12C5759. Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.

Posted 6 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Technical Analyst (ADF Data Engineer). Experience: 5 to 10 years.
Responsibilities:
• Convert Workato recipes into Azure Data Factory (ADF) pipelines to facilitate seamless data integration.
• Design, develop, and maintain ADF pipelines to connect and orchestrate data flow between Snowflake and Salesforce (see the sketch after this listing).
• Collaborate with cross-functional teams to understand data requirements and ensure efficient data integration.
• Optimize data pipelines for performance, scalability, and reliability.
• Implement data quality checks and monitoring to ensure data accuracy and consistency.
• Troubleshoot and resolve issues related to data integration and pipeline performance.
• Document data integration processes and maintain up-to-date technical documentation.
Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer, with a focus on Azure Data Factory, data pipelines, and data integration.
• Strong proficiency in SQL and experience working with Snowflake and Salesforce.
• Knowledge of ETL/ELT processes and best practices.
• Familiarity with data warehousing concepts and cloud-based data solutions.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.
Preferred Qualifications:
• Experience with other Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Functions.
• Certification in Azure Data Engineering or related fields.
• Experience with version control systems like Git.
• Experience with Workato or similar integration platforms.
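As referenced above, here is a hedged sketch of triggering an ADF pipeline run from Python with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are placeholders, not details from the posting.

```python
# Trigger an (assumed) Snowflake-to-Salesforce ADF pipeline run.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
run = client.pipelines.create_run(
    resource_group_name="rg-data",          # placeholder
    factory_name="adf-integration",         # placeholder
    pipeline_name="snowflake_to_salesforce",  # placeholder
    parameters={"load_date": "2024-01-01"},
)
print(run.run_id)  # use for monitoring/troubleshooting the run
```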

Posted 6 days ago

Apply

8.0 years

2 - 4 Lacs

Bengaluru

On-site

GlassDoor logo

If you are looking for a challenging and exciting career in the world of technology, then look no further. Skyworks is an innovator of high-performance analog semiconductors whose solutions are powering the wireless networking revolution. At Skyworks, you will find a fast-paced environment with a strong focus on global collaboration, minimal layers of management, and the freedom to make meaningful contributions in a setting that encourages creativity and out-of-the-box thinking. Our work culture values diversity, social responsibility, open communication, mutual trust and respect. We are excited about the opportunity to work with you and glad you want to be part of a team of talented individuals who together can change the way the world communicates.

Requisition ID: 75243

Description
We are seeking a highly skilled and experienced Sr. Principal Enterprise Integration Architect to join our team. The ideal candidate will have a strong background in enterprise integration architecture and extensive experience working with global teams. This role is critical in ensuring that all applications company-wide are managed into a world-class portfolio. The Sr. Principal Enterprise Integration Architect will play a pivotal role in designing, architecting, developing, and supporting integration solutions globally.

Responsibilities
• Lead the design and implementation of enterprise integration solutions using Azure iPaaS or similar middleware tools.
• Collaborate with global teams to ensure seamless integration of applications across the company.
• Develop and maintain integration architecture standards and best practices.
• Manage the integration portfolio, ensuring all applications are aligned with the company's strategic goals.
• Provide technical leadership and guidance to the integration team.
• Oversee the development and support of integration solutions, ensuring high availability and performance.
• Conduct regular reviews and assessments of integration solutions to identify areas for improvement.
• Work closely with stakeholders to understand business requirements and translate them into technical solutions.
• Ensure compliance with security and regulatory requirements in all integration solutions.

Required Experience and Skills
• Minimum of 8 years of experience in enterprise architecture, integration, software development, or a related field.
• Exposure to native cloud platforms such as Azure, AWS, or GCP, and experience with them at scale.
• Integration and data feeds: support and maintain existing integrations and data feeds, ensuring seamless data flow and system integration.
• Azure iPaaS development (or a similar middleware product): design, develop, and maintain applications using Azure Integration Platform as a Service (iPaaS) components such as Logic Apps, Azure Data Factory (ADF), Function Apps, and APIM.
• SQL and ETL processes: develop and optimize SQL queries and manage database systems for ETL processes.
• DevOps management: implement and manage DevOps pipelines using Git, Jenkins, Azure DevOps, and GitHub; support and maintain Jenkins servers, Azure DevOps, and GitHub.
• Proven experience working with global teams and managing cross-functional projects.
• Excellent understanding of integration design principles and best practices.
• Strong leadership and communication skills.
• Ability to manage multiple projects and priorities simultaneously.
• Experience with integration tools and platforms such as API management, ESB, and ETL.
• Knowledge of security and regulatory requirements related to integration solutions.
• Strong problem-solving and analytical skills.

Desired Experience and Skills
• Experience in managing large-scale integration projects.
• Familiarity with other cloud platforms and technologies.
• Knowledge of DevOps practices and tools.
• Experience with Agile methodologies.
• Certification in Azure or related technologies.
• Strong understanding of business processes and how they relate to integration solutions.

Skyworks is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.
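
As a small illustration of the iPaaS glue work this role involves, the sketch below posts an event to an Azure Logic App through its HTTP request trigger. The callback URL is a placeholder; Logic Apps generates the real one, including a SAS signature, when a workflow with a "When an HTTP request is received" trigger is saved.

```python
# Minimal sketch: invoking an Azure Logic App via its HTTP request trigger.
# The URL below is a placeholder shaped like the callback URLs Logic Apps
# issues; the payload fields are invented for illustration.
import requests

LOGIC_APP_URL = (
    "https://prod-00.westus.logic.azure.com:443/workflows/<workflow-id>"
    "/triggers/manual/paths/invoke?api-version=2016-10-01&sig=<sas-token>"
)

def send_order_event(order_id: str, status: str) -> None:
    """Post an event payload for the Logic App workflow to process."""
    response = requests.post(
        LOGIC_APP_URL,
        json={"orderId": order_id, "status": status},
        timeout=30,
    )
    response.raise_for_status()
    print(f"Logic App accepted event: HTTP {response.status_code}")

if __name__ == "__main__":
    send_order_event("SO-1001", "shipped")
```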

Posted 6 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Manager

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring a streamlined end-to-end Oracle NetSuite setup that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
1. Lead a team of NetSuite developers, providing guidance, mentorship, and technical expertise to ensure high-quality deliverables and project success.
2. Define technical architecture and design standards for NetSuite solutions, ensuring scalability, performance, and maintainability.
3. Stay updated on emerging technologies and best practices in NetSuite development, driving innovation and continuous improvement within the team.
4. Manage end-to-end technical projects for NetSuite implementations, upgrades, and customizations, ensuring adherence to scope, budget, and timeline.
5. Develop project plans, resource allocation strategies, and risk mitigation plans, and monitor project progress to identify and address issues proactively.
6. Lead the development and customization of NetSuite solutions, including SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development.
7. Collaborate with functional consultants to translate business requirements into technical solutions, ensuring alignment with best practices and industry standards.
8. Serve as a technical liaison between the development team and clients, providing technical expertise, addressing concerns, and managing expectations.
9. Participate in client meetings and workshops to understand their technical requirements, propose solutions, and provide updates on project status.

Mandatory Skill Sets: NetSuite

Preferred Skill Sets: NetSuite

Qualifications:
1. Bachelor's degree in Computer Science, Information Technology, or a related field.
2. 2+ years of hands-on experience in NetSuite development, customization, and integration.
3. Expertise in the NetSuite SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development platforms.
4. NetSuite certifications such as SuiteFoundation, SuiteCloud Developer, or SuiteCommerce Advanced are highly desirable.

Years of Experience Required: Minimum 2 years of NetSuite expertise.

Education Qualification: Graduate/Post Graduate

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills: Oracle NetSuite

Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more}

Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
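
For a sense of the integration side of NetSuite work (SuiteScript itself runs as JavaScript inside NetSuite), here is a minimal sketch of reading a record through NetSuite's SuiteTalk REST web services from Python. The account ID and every credential are placeholders; NetSuite REST access uses OAuth 1.0a token-based authentication.

```python
# Minimal sketch: fetch a customer record via NetSuite SuiteTalk REST.
# Account ID and all credentials are placeholders, not real values.
import requests
from requests_oauthlib import OAuth1

ACCOUNT = "1234567"  # hypothetical NetSuite account ID
BASE_URL = f"https://{ACCOUNT}.suitetalk.api.netsuite.com/services/rest/record/v1"

auth = OAuth1(
    client_key="consumer_key",             # placeholder
    client_secret="consumer_secret",       # placeholder
    resource_owner_key="token_id",         # placeholder
    resource_owner_secret="token_secret",  # placeholder
    realm=ACCOUNT,
    signature_method="HMAC-SHA256",
)

def get_customer(internal_id: int) -> dict:
    """Fetch a customer record by its NetSuite internal ID."""
    response = requests.get(
        f"{BASE_URL}/customer/{internal_id}", auth=auth, timeout=30
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(get_customer(42).get("companyName"))
```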

Posted 6 days ago

Apply

4.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

FasCave IT Solutions Pvt. Ltd. is looking for a Data Engineer to join our team on a 10-12 month contract basis for one of the best healthcare companies as a client. If you have expertise in data modeling and engineering, this opportunity is for you!

Position: Data Engineer
Location: Remote
Duration: 10-12 Months (Contract)
Experience: 4-10 Years
Shift Time: Australian Shift (5 AM to 1 PM IST)

Key Requirements:
• Strong SQL skills
• Snowflake
• Azure Data Factory (ADF)
• Power BI
• SSIS (nice to have)

📩 How to Apply?
Send your resume to hrd@fascave.com with the following details:
• Years of experience
• Current CTC
• Expected CTC
• Earliest joining date
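
As a small example of the SQL-plus-Snowflake skill set this role asks for, the sketch below runs a data-quality check with the snowflake-connector-python package. The connection details and table name are placeholders invented for illustration.

```python
# Minimal sketch: run a row-count quality check against Snowflake using
# snowflake-connector-python. Credentials, account locator, and the table
# are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="DATA_ENGINEER",              # placeholder
    password="********",               # placeholder
    account="xy12345.ap-southeast-2",  # hypothetical account locator
    warehouse="ANALYTICS_WH",
    database="HEALTHCARE_DB",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Simple data-quality check: count rows loaded today in a staging table.
    cur.execute(
        "SELECT COUNT(*) FROM PATIENT_VISITS "
        "WHERE LOAD_DATE = CURRENT_DATE()"
    )
    (row_count,) = cur.fetchone()
    print(f"Rows loaded today: {row_count}")
finally:
    conn.close()
```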

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies