
1630 ADF Jobs - Page 13

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You are a passionate and experienced Oracle ERP Techno-Functional Consultant/Architect who will be responsible for driving the implementation and optimization of critical Oracle ERP solutions in financials, supply chain, and HCM. You will lead complex Oracle Cloud and EBS initiatives, contributing to the digital transformation of a dynamic global enterprise.

Your key responsibilities will include leading technical delivery for Oracle ERP implementations, designing and implementing scalable solutions across modules, developing integrations using Oracle Integration Cloud and various technologies, delivering reporting solutions, handling technical upgrades and data migrations, collaborating with stakeholders to translate requirements into technical designs, and managing customization and performance tuning.

To excel in this role, you should have at least 5 years of hands-on experience with Oracle ERP; expertise in Oracle Cloud integration, PL/SQL, SQL tuning, BI/OTBI, ADF, VBCS, SOAP/REST APIs, and Oracle Workflow & Personalization; and proven experience leading full-cycle ERP implementations. You should also possess a strong technical architecture background, stakeholder management skills, and familiarity with relevant tools. Additional qualifications, such as Oracle certifications, experience in industries like healthcare, manufacturing, or retail, and exposure to Taleo integrations and HCM Extracts, would be a bonus.

By joining our team, you will be at the forefront of enterprise-wide ERP transformation initiatives, working with seasoned professionals and solution architects, leading global projects, mentoring teams, and benefiting from a competitive salary, upskilling programs, and long-term growth opportunities. If you are ready to shape the future of ERP solutions and build smarter, faster, and future-ready systems in Bangalore, apply now and be part of our exciting journey.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Data Warehouse Engineer will be responsible for managing and optimizing data processes in an Azure environment using Snowflake. The ideal candidate should have solid SQL skills and a basic understanding of data modeling. Experience with CI/CD processes and Azure ADF is preferred. Additionally, expertise in ETL/ELT frameworks and ER/Studio would be a plus.

As a Senior Data Warehouse Engineer, in addition to the core requirements, you will oversee other engineers while also being actively involved in data modeling and Snowflake SQL optimization. You will be responsible for conducting design reviews, code reviews, and deployment reviews with the engineering team. Familiarity with medallion architecture and experience in the healthcare or life sciences industry will be highly advantageous.

At Myridius, we are committed to transforming the way businesses operate by offering tailored solutions in AI, data analytics, digital engineering, and cloud innovation. With over 50 years of expertise, we aim to guide organizations through the rapidly evolving landscapes of technology and business. Our integration of cutting-edge technology with deep domain knowledge enables businesses to seize new opportunities, drive significant growth, and maintain a competitive edge in the global market. We go beyond typical service delivery to craft transformative outcomes that help businesses not just adapt, but thrive in a world of continuous change. Discover how Myridius can elevate your business to new heights of innovation by visiting us at www.myridius.com and start leading the change.
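The medallion architecture mentioned above (raw "bronze" data refined into "silver" and "gold" layers) can be sketched in plain Python. This is an illustrative stand-in, not Snowflake or ADF code; the table names, fields, and quality rules are all hypothetical:

```python
# Minimal sketch of medallion-style layering (bronze -> silver -> gold),
# modeled with plain Python dicts in place of warehouse tables.
from collections import defaultdict

def to_silver(bronze_rows):
    """Clean raw rows: drop records missing a patient id, normalize casing."""
    silver = []
    for row in bronze_rows:
        if not row.get("patient_id"):
            continue  # data-quality rule: reject incomplete records
        silver.append({**row, "region": row["region"].strip().title()})
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting-ready summary."""
    totals = defaultdict(int)
    for row in silver_rows:
        totals[row["region"]] += row["visits"]
    return dict(totals)

bronze = [
    {"patient_id": "p1", "region": " south ", "visits": 2},
    {"patient_id": None, "region": "north", "visits": 1},  # rejected in silver
    {"patient_id": "p2", "region": "South", "visits": 3},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'South': 5}
```

In a real Snowflake deployment, each layer would typically be a schema or set of tables, with the transformations expressed in SQL or an ETL/ELT framework rather than Python loops.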

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

You are a Data Engineer with 3-7 years of experience, currently based in Mumbai and available for face-to-face interaction. Your responsibilities will include building and managing data pipelines using Snowflake and Azure Data Factory (ADF), writing optimized SQL for large-scale data analysis, and monitoring and enhancing Snowflake performance.

To excel in this role, you should have a strong background in data engineering and SQL, with a focus on Snowflake and ADF. Additionally, familiarity with data quality, governance, and Python will be beneficial. A Snowflake certification will be considered a plus.

If you meet these requirements and are passionate about working in a dynamic environment where you can utilize your skills in Snowflake and SQL, we encourage you to apply for this position. Please send your CV to shruthi.pu@andortech.com to be considered for this opportunity.
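One common pattern behind the Snowflake/ADF pipeline work described above is a watermark-driven incremental load, which avoids re-scanning an entire source table on every run. The sketch below is a hypothetical pure-Python illustration of the idea (no Snowflake or ADF connection; field names are assumptions):

```python
# Illustrative watermark-based incremental extraction: only pull rows
# newer than the watermark stored from the previous pipeline run.
def incremental_extract(rows, last_watermark):
    """Return rows newer than the stored watermark, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-03"},
    {"id": 3, "updated_at": "2024-01-05"},
]
fresh, wm = incremental_extract(source, "2024-01-02")
print(len(fresh), wm)  # 2 2024-01-05
```

In ADF this pattern is usually implemented with a lookup activity that reads the stored watermark, a parameterized source query, and a final activity that persists the new watermark.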

Posted 2 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Thane, Maharashtra, India

On-site

Your responsibilities

We are seeking a highly experienced Senior Data Engineer with expertise in Azure-based cloud architecture to join our team. In this role, you will design, build, and optimize complex data pipelines and cloud infrastructure to support data-driven business decisions. You'll be responsible for implementing robust, scalable solutions leveraging Azure Synapse Analytics, Databricks, ADF, and SQL with DevOps. The ideal candidate will also possess experience in Power BI, data mining, data analysis, and migration of on-premises systems to the cloud. Familiarity with Microsoft Fabric is an added advantage.

Key Responsibilities

Cloud Architecture & Infrastructure: Design and implement Azure-based cloud infrastructure, including data storage, processing, and analytics components. Develop and optimize scalable data architectures to ensure high performance and availability.

Data Pipeline Development: Build and manage ETL/ELT pipelines using Azure Synapse, Azure Data Factory (ADF), and Databricks. Ensure the efficient flow of data from source to target systems, implementing robust data quality controls.

Data Transformation & Analysis: Utilize SQL, Synapse, and Databricks for data transformation, data mining, and advanced data analysis. Implement best practices for data governance, lineage, and security within Azure environments.

Migration Projects: Lead the migration of on-premises data systems to Azure cloud infrastructure, ensuring minimal disruption and data integrity. Optimize data migration strategies and methodologies for various applications and workloads.

DevOps & CI/CD Pipelines: Manage DevOps processes, ensuring continuous integration and deployment (CI/CD) for data solutions. Develop and maintain infrastructure as code (IaC) for deployment, testing, and monitoring.

Business Intelligence & Reporting: Collaborate with business stakeholders to design and implement reporting solutions in Power BI, ensuring data is accessible and actionable. Develop visualizations and dashboards to support data-driven decision-making.

Collaboration & Best Practices: Work closely with data scientists, analysts, and other business stakeholders to understand requirements and provide optimized data solutions. Drive best practices in data engineering, including coding standards, testing, version control, and documentation.

Your profile

Candidates must have 3-6 years of experience.

Education: Bachelor's or master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

Technical Expertise:
Azure Cloud: Advanced proficiency in Azure services, including Synapse Analytics, Data Factory, Databricks, SQL, Blob Storage, and Data Lake.
Data Engineering: Strong skills in SQL, ETL/ELT processes, data warehousing, and data modeling.
DevOps: Experience with CI/CD pipeline setup, automation, and Azure DevOps.
Data Analysis & BI: Proficient in data analysis and visualization using Power BI; experience in data mining techniques is desirable.
Migration Experience: Proven track record of migrating on-premises systems to Azure cloud.

Additional Skills:
Knowledge of Microsoft Fabric is a plus.
Familiarity with Infrastructure as Code (IaC) tools like ARM templates or Terraform.
Strong understanding of data governance, security, and compliance best practices.

Soft Skills:
Strong problem-solving and analytical skills.
Excellent communication skills, with the ability to collaborate effectively across teams.
Ability to manage multiple priorities and work independently in a dynamic environment.

Work location: Thane (Mumbai)

Your benefits
Company Home - thyssenkrupp Materials Services (thyssenkrupp-materials-services.com)

Contact
Vinit Poojary - tkmits-in-recruitment@thyssenkrupp-materials.com
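The pipeline-development duties above boil down to orchestrating dependent activities, which is what ADF and Synapse pipelines resolve via "dependsOn" links between activities. A minimal sketch of that dependency resolution, using Python's standard-library topological sorter (activity names are hypothetical):

```python
# Illustrative-only sketch: ordering pipeline activities by dependency,
# the way an orchestrator resolves activity dependencies before running them.
from graphlib import TopologicalSorter

activities = {
    "ingest_raw": set(),                       # no upstream dependency
    "transform_silver": {"ingest_raw"},        # runs after ingestion
    "load_gold": {"transform_silver"},         # runs after transformation
    "refresh_powerbi": {"load_gold"},          # refresh reporting last
}
order = list(TopologicalSorter(activities).static_order())
print(order)  # ['ingest_raw', 'transform_silver', 'load_gold', 'refresh_powerbi']
```

In practice an orchestrator also handles retries, parallel branches, and failure propagation, but the core scheduling question is exactly this dependency ordering.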

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Vadodara

Work from Office

We are seeking an experienced Senior Data Engineer with a minimum of 5 years of hands-on experience to join our dynamic data team. The ideal candidate will have strong expertise in Microsoft Fabric, demonstrate readiness to adopt cutting-edge tools like SAP Data Sphere, and possess foundational AI knowledge to guide our data engineering initiatives.

Key Roles and Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL/ELT processes using Microsoft Fabric tools such as Azure Data Factory (ADF) and Power BI.
Work on large-scale data processing and analytics using PySpark.
Evaluate and implement new data engineering tools like SAP Data Sphere through training or self-learning.
Support business intelligence, analytics, and AI/ML initiatives by building robust data architectures.
Apply AI techniques to automate workflows and collaborate with data scientists on machine learning projects.
Mentor junior data engineers and lead data-related projects across departments.
Coordinate with business teams, vendors, and technology partners for smooth project delivery.
Create dashboards and reports using tools like Power BI or Tableau, ensuring data accuracy and accessibility.
Support self-service analytics across business units and maintain consistency in all visualizations.

Experience & Technical Skills:
5+ years of professional experience in data engineering with expertise in Microsoft Fabric components
Strong proficiency in PySpark for large-scale data processing and distributed computing (MANDATORY)
Extensive experience with Azure Data Factory (ADF) for orchestrating complex data workflows (MANDATORY)
Proficiency in SQL and Python for data processing and pipeline development
Strong understanding of cloud data platforms, preferably the Azure ecosystem
Experience in data modelling, data warehousing, and modern data architecture patterns

Interested candidates can share their updated profiles at itcv@alembic.co.in
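PySpark is the mandatory skill here; as a stand-in that runs without a Spark cluster, the pure-Python snippet below computes the same result a hypothetical `df.groupBy("dept").agg(sum("sales"))` job would produce on a small dataset (column names are invented for illustration):

```python
# Pure-Python equivalent of a simple PySpark groupBy/aggregate job,
# shown so the logic is runnable anywhere; real workloads would use Spark.
from itertools import groupby
from operator import itemgetter

rows = [
    {"dept": "a", "sales": 10},
    {"dept": "b", "sales": 5},
    {"dept": "a", "sales": 7},
]
rows.sort(key=itemgetter("dept"))  # groupby requires sorted input
result = {dept: sum(r["sales"] for r in grp)
          for dept, grp in groupby(rows, key=itemgetter("dept"))}
print(result)  # {'a': 17, 'b': 5}
```

The difference at scale is that Spark shuffles and aggregates partitions in parallel across the cluster, whereas this version holds everything in one process.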

Posted 2 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Databricks Developer
Location: Pune, Maharashtra (Hybrid – 2 to 3 days from office)
Experience: 3 to 10 Years
Employment Type: Full-Time / Contract
Notice Period: Immediate to 30 Days Preferred

About the Role:
We are seeking a highly skilled and motivated Databricks Developer to join our dynamic data engineering team in Pune. As a Databricks expert, you will be responsible for designing, developing, and maintaining robust big data pipelines and solutions using Databricks, Spark, and other modern data technologies.

Key Responsibilities:
Design and develop scalable data pipelines using Databricks, Apache Spark, and Delta Lake.
Implement ETL/ELT workflows for processing large volumes of structured and unstructured data.
Collaborate with data scientists, analysts, and stakeholders to define data models and deliver business insights.
Optimize queries and performance on big data platforms.
Integrate data from various sources including Azure Data Lake, SQL, and NoSQL systems.
Build reusable code and libraries for future use in the data pipeline framework.
Maintain and enhance data governance, quality, and security standards.
Troubleshoot and resolve technical issues and support production pipelines.

Required Skills & Experience:
3 to 10 years of experience in data engineering or big data development.
Strong hands-on experience in Databricks and Apache Spark (PySpark preferred).
Proficient in Python and/or Scala for data processing.
Solid understanding of data warehousing concepts, data lakes, and data lakehouse architecture.
Experience working with Azure Cloud services (ADF, ADLS, Synapse) or AWS Glue/EMR is a plus.
Strong experience in SQL and performance tuning of queries.
Experience in CI/CD integration and version control (e.g., Git, Azure DevOps).
Good understanding of Delta Lake, MLflow, and Notebooks.

Nice to Have:
Databricks certification (Developer or Data Engineer Associate/Professional).
Knowledge of streaming frameworks like Kafka or Structured Streaming.
Exposure to Airflow, Azure Data Factory, or similar orchestration tools.
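A core Delta Lake operation in roles like this is MERGE (upsert): update rows that match on a key, insert the rest. The dict-based sketch below illustrates only the semantics, not the Delta Lake API; the keys and shapes are hypothetical:

```python
# Hedged sketch of MERGE/upsert semantics (what Delta Lake's MERGE INTO
# does at the row level), modeled on plain dicts keyed by id.
def merge_upsert(target, updates):
    """Update matching keys, insert new ones, leave the rest untouched."""
    merged = dict(target)
    for row in updates:
        merged[row["id"]] = row
    return merged

target = {1: {"id": 1, "qty": 5}, 2: {"id": 2, "qty": 3}}
updates = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
result = merge_upsert(target, updates)
print(sorted(result))  # [1, 2, 3]
```

In Delta Lake the same operation is expressed declaratively (matched rows updated, unmatched rows inserted) and executed transactionally over Parquet files, which is what makes it safe for concurrent pipelines.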

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Telangana

On-site

About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
Role: MLOps Engineer
Experience: 5-10 Years
Mandatory Skills: Python/MLOps/Docker and Kubernetes/FastAPI or Flask/CICD/Jenkins/Spark/SQL/RDB/Cosmos/Kafka/ADLS/API/Databricks
Other Skills: Azure/LLMOps/ADF/ETL
Location: Bangalore
Notice Period: Less than 60 days

Job Description:
We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.

Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
Experience deploying ML models to production.
Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
Integrate machine learning models seamlessly into existing production systems.
Continuously monitor and evaluate model performance, and retrain models automatically or periodically.
Streamline existing ML pipelines to increase throughput.
Identify and address security vulnerabilities in existing applications proactively.
Design, develop, and implement machine learning models, preferably for insurance-related applications.
Well versed with the Azure ecosystem.
Knowledge of NLP and Generative AI techniques; relevant experience will be a plus.
Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus.
Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.

Why Chubb?
Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.
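The monitor-and-retrain duty listed above usually reduces to comparing live model metrics against a stored baseline. This is a deliberately minimal sketch of that check; the metric names, thresholds, and function are assumptions, not any particular MLOps framework's API:

```python
# Hedged sketch of drift-triggered retraining: flag a deployed model
# when its live accuracy falls more than `tolerance` below baseline.
def needs_retraining(baseline_acc, live_acc, tolerance=0.05):
    """Return True when live accuracy has drifted beyond the tolerance."""
    return (baseline_acc - live_acc) > tolerance

print(needs_retraining(0.92, 0.84))  # True: drifted beyond tolerance
print(needs_retraining(0.92, 0.90))  # False: still within tolerance
```

Production systems typically evaluate several metrics over sliding windows and route the flag into an automated retraining pipeline rather than a simple boolean.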
Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025, and 2025-2026.
Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits, and Car Lease that help employees optimally plan their finances.
Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like Education Reimbursement Programs, certification programs, and access to global learning programs.
Health and Welfare Benefits: We care about our employees' well-being in and out of work and have benefits like an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.

Apply Now: Chubb External Careers

Posted 2 weeks ago

Apply

2.0 - 5.0 years

6 - 9 Lacs

Hyderabad

On-site

Data Engineer, DT US PxE

The Data Engineer is an integral part of the technical application development team and is primarily responsible for analyzing, planning, designing, developing, and implementing Azure data engineering solutions to meet the strategic, usability, performance, reliability, control, and security requirements of data science processes. Requires demonstrable knowledge in the areas of data engineering, AI/ML, data warehousing, and reporting applications. Must be innovative.

Work you will do
A unique opportunity to be a part of a growing team that works on a premier unified data science analytics platform within Deloitte. You will be responsible for implementing, delivering, and supporting data engineering and AI/ML solutions for the Deloitte US Member Firm.

Outcome-Driven Accountability
Collaborate with business and IT leaders to develop and refine ideas for integrating predictive and prescriptive analytics within business processes, ensuring measurable customer and business outcomes.
Decompose complex business problems into manageable components, facilitating the use of multiple analytic modeling methods for holistic and valuable solutions.
Develop and refine prototypes and proofs of concept, presenting results to business and IT leaders, and demonstrating the impact on customer needs and business outcomes.

Technical Leadership and Advocacy
Engage in data analysis, generating and testing hypotheses, preparing and analyzing historical data, identifying patterns, and applying statistical methods to formulate solutions that deliver high-quality outcomes.
Develop project plans, including resource needs and task dependencies, to meet project deliverables with a focus on incremental and iterative delivery.

Engineering Craftsmanship
Participate in defining project scope, objectives, and quality controls for new projects, ensuring alignment with customer-centric engineering principles.
Present and communicate project deliverable results, emphasizing the value delivered to customers and the business.

Customer-Centric Engineering
Assist in recruiting and mentoring team members, fostering a culture of engineering craftsmanship and continuous learning.

Incremental and Iterative Delivery
Stay abreast of changes in technology, leading new technology evaluations for predictive and statistical analytics, and advocating for innovative, lean, and feasible solutions.

Education: Bachelor's degree in Computer Science or Business Information Systems, MCA, or an equivalent degree.

Qualifications:
2 to 5 years of advanced experience in Azure data engineering
Expertise in developing, deploying, and monitoring ADF pipelines (using Visual Studio and browsers)
Expertise in Azure Databricks programming (PySpark, SparkR, and SparkSQL) or Amazon EMR (Elastic MapReduce)
Expertise in managing Azure storage (Azure Data Lake Storage Gen2, Azure Blob Storage, Azure SQL Database) and related services such as Azure Synapse Analytics and Azure Data Factory
Advanced programming skills in Python, R, and SQL (SQL for HANA, MS SQL)
Hands-on experience with visualization tools (Tableau/Power BI)
Hands-on experience with data science studios (Dataiku, Azure ML Studio, Amazon SageMaker)

The Team
Information Technology Services (ITS) helps power Deloitte's success. ITS drives Deloitte, which serves many of the world's largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.
The ~3,000 professionals in ITS deliver services including:
Security, risk & compliance
Technology support
Infrastructure
Applications
Relationship management
Strategy
Deployment
PMO
Financials
Communications

Product Engineering (PxE)
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.

How you will grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world. Disclaimer: Please note that this description is subject to change basis business/engagement requirements and at the discretion of the management. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. 
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302720

Posted 2 weeks ago

Apply

4.0 years

20 Lacs

Hyderabad

On-site

Job Description:
We are seeking a skilled and dynamic Azure Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building and maintaining data pipelines and working with large datasets on the Azure cloud platform. The Azure Data Engineer will be responsible for developing and implementing efficient ETL processes, working with data warehouses, and leveraging cloud technologies such as Azure Data Factory (ADF), Azure Databricks, PySpark, and SQL to process and transform data for analytical purposes.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and implement scalable, reliable, and high-performance data pipelines using Azure Data Factory (ADF), Azure Databricks, and PySpark.
- Data Processing: Develop complex data transformations, aggregations, and cleansing processes using PySpark and Databricks for big data workloads.
- Data Integration: Integrate and process data from various sources such as databases, APIs, cloud storage (e.g., Blob Storage, Data Lake), and third-party services into Azure Data Services.
- Optimization: Optimize data workflows and ETL processes to ensure efficient data loading, transformation, and retrieval while ensuring data integrity and high performance.
- SQL Development: Write complex SQL queries for data extraction, aggregation, and transformation. Maintain and optimize relational databases and data warehouses.
- Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and design solutions that meet business and analytical needs.
- Automation & Monitoring: Implement automation for data pipeline deployment and ensure monitoring, logging, and alerting mechanisms are in place for pipeline health.
- Cloud Infrastructure Management: Work with cloud technologies (e.g., Azure Data Lake, Blob Storage) to store, manage, and process large datasets.
- Documentation & Best Practices: Maintain thorough documentation of data pipelines, workflows, and best practices for data engineering solutions.

Job Type: Full-time
Pay: Up to ₹2,000,000.00 per year
Experience:
Azure: 4 years (Required)
Python: 4 years (Required)
SQL: 4 years (Required)
Work Location: In person
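The SQL-development duties above (extraction and aggregation queries for warehouse reporting) can be illustrated with the standard library's sqlite3 module; the table and column names are invented, and engine-specific syntax will differ on Azure SQL or Synapse:

```python
# A runnable aggregation query of the shape used for warehouse reporting,
# using an in-memory SQLite database as a stand-in for the warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 50.0), ("east", 25.0)],
)
total_by_region = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(total_by_region)  # [('east', 125.0), ('west', 50.0)]
```

On a real warehouse the same GROUP BY pattern applies, with indexing, partitioning, and query-plan review doing the "optimize relational databases" part of the job.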

Posted 2 weeks ago

Apply

3.0 years

3 - 6 Lacs

Gurgaon

On-site

Designation: Middleware Administrator
Experience: 3+ Years
Qualification: B.E. / B.Tech / BCA
Location: Gurugram, Haryana

Roles and Responsibilities
· Monitor application response times from the end-user perspective in real time and alert the organization when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, the tool should quickly expose problem sources and minimize the time necessary for resolution.
· Allow specific application transactions to be captured and monitored separately, so that administrators can select the most important operations within business-critical applications to be measured and tracked individually.
· Use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels, allowing IT administrators to respond quickly to problems and minimize the impact on service delivery.
· Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management, script execution, analysing system events, monitoring of error logs, etc.
· Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, web cache, BizTalk applications, DNS applications, Tomcat, etc.

Job Type: Full-time
Benefits: Health insurance, Provident Fund
Work Location: In person
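The baseline-oriented thresholding described above is commonly implemented as a deviation check against historical response times. The sketch below is one minimal interpretation, assuming a simple "mean plus N standard deviations" rule; real monitoring tools use richer baselines (time-of-day profiles, percentiles):

```python
# Hedged sketch: alert when a response time deviates beyond n_sigma
# standard deviations from the historical baseline.
from statistics import mean, stdev

def breaches_baseline(history_ms, current_ms, n_sigma=3):
    """True when current response time exceeds baseline mean + n_sigma * stdev."""
    mu, sigma = mean(history_ms), stdev(history_ms)
    return current_ms > mu + n_sigma * sigma

history = [100, 105, 98, 102, 101, 99, 103]  # recent response times in ms
print(breaches_baseline(history, 180))  # True: well outside the baseline
print(breaches_baseline(history, 105))  # False: within normal variation
```

The advantage over a fixed threshold is that the alert level adapts as the application's normal behaviour changes, which is exactly why the posting asks for baseline-oriented rather than static limits.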

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

As a Data Engineer, you are required to: Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments, while ensuring data integrity, consistency, and accuracy across the entire pipeline. Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation. Design the structure of databases and data storage systems, including schemas, tables, and relationships between datasets, to enable efficient querying. Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable. Stay up to date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.
Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.
Experience level: At least 3-5 years of hands-on experience in Data Engineering.
Desired Knowledge & Experience:
Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming; Spark internals: Catalyst/Tungsten/Photon
Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader
IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot
Test: pytest, Great Expectations
CI/CD: YAML Azure Pipelines, Continuous Delivery, Acceptance Testing
Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction
Languages: Python/Functional Programming (FP)
SQL: T-SQL/Spark SQL/HiveQL
Storage: Data Lake and Big Data Storage Design
Additionally, it is helpful to know the basics of:
Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow
Languages: Scala, Java
NoSQL: Cosmos, Mongo, Cassandra
Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model
SQL Server: T-SQL, Stored Procedures
Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka
Data Catalog: Azure Purview, Apache Atlas, Informatica
Required Soft Skills & Other Capabilities: Great attention to detail and good analytical abilities. Good planning and organizational skills. A collaborative approach to sharing ideas and finding solutions. Ability to work independently and also in a global team environment.
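As a minimal illustration of the lakehouse/medallion layering listed above: a bronze-to-silver step typically casts types, drops malformed records, and deduplicates on a business key. The sketch below shows that logic in plain Python; in practice this would be a Spark/Databricks job, and the record schema here is hypothetical.

```python
def bronze_to_silver(bronze_rows):
    """Promote raw (bronze) rows to a cleaned (silver) layer:
    cast types, drop malformed rows, and deduplicate on `id`."""
    seen = set()
    silver = []
    for row in bronze_rows:
        try:
            record = {"id": int(row["id"]), "amount": float(row["amount"])}
        except (KeyError, TypeError, ValueError):
            continue  # a real pipeline would quarantine these rows
        if record["id"] in seen:
            continue  # keep the first occurrence of each key
        seen.add(record["id"])
        silver.append(record)
    return silver
```

The same shape maps directly onto a Spark DataFrame pipeline (cast, filter, `dropDuplicates`), with the silver output written as Delta.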

Posted 2 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Andhra Pradesh

On-site

We are seeking a Data Engineer with strong expertise in SQL and ETL processes to support banking data pipelines, regulatory reporting, and data quality initiatives. The role involves building and optimizing data structures, implementing validation rules, and collaborating with governance and compliance teams. Experience in the banking domain and tools like Informatica and Azure Data Factory is essential.
Strong proficiency in SQL for writing complex queries, joins, data transformations, and aggregations
Proven experience in building tables, views, and data structures within enterprise Data Warehouses and Data Lakes
Strong understanding of data warehousing concepts, such as Slowly Changing Dimensions (SCDs), data normalization, and star/snowflake schemas
Practical experience in Azure Data Factory (ADF) for orchestrating data pipelines and managing ingestion workflows
Exposure to data cataloging, metadata management, and lineage tracking using Informatica EDC or Axon
Experience implementing Data Quality rules for banking use cases such as completeness, consistency, uniqueness, and validity
Familiarity with banking systems and data domains such as Flexcube, HRMS, CRM, Risk, Compliance, and IBG reporting
Understanding of regulatory and audit readiness needs for Central Bank and internal governance forums
Write optimized SQL scripts to extract, transform, and load (ETL) data from multiple banking source systems
Design and implement staging and reporting layer structures, aligned to business requirements and regulatory frameworks
Apply data validation logic based on predefined business rules and data governance requirements
Collaborate with Data Governance, Risk, and Compliance teams to embed lineage, ownership, and metadata into datasets
Monitor scheduled jobs and resolve ETL failures to ensure SLA adherence for reporting and operational dashboards
Support production deployment, UAT sign-off, and issue resolution for data products across business units
3 to 6 years in banking-focused data engineering roles with hands-on SQL, ETL, and DQ rule implementation
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related fields
Banking domain experience is mandatory, especially in areas related to regulatory reporting, compliance, and enterprise data governance
About Virtusa Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
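The data quality dimensions named above (completeness, uniqueness, validity) are usually expressed as per-column rules evaluated over each batch. A minimal rule runner might look like the following; the column names, key, and predicates are purely illustrative, not taken from any bank's actual DQ framework.

```python
def run_dq_checks(rows, key, required, valid):
    """Evaluate completeness, uniqueness, and validity rules.

    rows     -- list of dict records
    key      -- column whose values must be unique (uniqueness)
    required -- columns that must be non-empty (completeness)
    valid    -- mapping of column -> predicate (validity)
    Returns {rule_name: [failing row indices]}.
    """
    failures = {"completeness": [], "uniqueness": [], "validity": []}
    seen = set()
    for i, row in enumerate(rows):
        if any(row.get(col) in (None, "") for col in required):
            failures["completeness"].append(i)
        if row.get(key) in seen:
            failures["uniqueness"].append(i)
        seen.add(row.get(key))
        if any(not pred(row.get(col)) for col, pred in valid.items()):
            failures["validity"].append(i)
    return failures
```

In an ADF/Informatica setup the same rules would be pushed down as SQL (e.g. `COUNT(*) ... WHERE col IS NULL`, `GROUP BY key HAVING COUNT(*) > 1`) and the failure counts logged for governance reporting.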

Posted 2 weeks ago

Apply


8.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Job Title: Senior DotNet Developer Experience: 8+ Years Job Type: Contract Contract Duration: 6 months Location: Remote Budget: 1L per month Working Hours: 12:00 PM to 09:00 PM
JOB DESCRIPTION Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.
RESPONSIBILITIES
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
• Integrate and support third-party APIs and external services.
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
• Participate in Agile/Scrum ceremonies and manage tasks using Jira.
• Understand technical priorities, architectural dependencies, risks, and implementation challenges.
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.
PRIMARY SKILLS 8+ years of hands-on development experience with:
☑ C#, .NET Core 6/8+, Entity Framework / EF Core
☑ JavaScript, jQuery, REST APIs
☑ Expertise in MS SQL Server, including complex SQL queries, Stored Procedures, Views, Functions, Packages, Cursors, Tables, and Object Types
☑ Skilled in unit testing with xUnit, MSTest
☑ Strong in software design patterns, system architecture, and scalable solution design
☑ Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
☑ Strong problem-solving and debugging capabilities
☑ Ability to write reusable, testable, and efficient code
☑ Develop and maintain frameworks and shared libraries to support large-scale applications
☑ Excellent technical documentation, communication, and leadership skills
☑ Microservices and Service-Oriented Architecture (SOA)
☑ Experience in API integrations
2+ years of hands-on experience with Azure Cloud Services, including:
☑ Azure Functions
☑ Azure Durable Functions
☑ Azure Service Bus, Event Grid, Storage Queues
☑ Blob Storage, Azure Key Vault, SQL Azure
☑ Application Insights, Azure Monitoring
SECONDARY SKILLS (GOOD TO HAVE)
☑ Familiarity with AngularJS, ReactJS, and other front-end frameworks
☑ Experience with Azure API Management (APIM)
☑ Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
☑ Experience with Azure Data Factory (ADF) and Logic Apps
☑ Exposure to application support and operational monitoring
☑ Azure DevOps - CI/CD pipelines (Classic / YAML)
CERTIFICATIONS REQUIRED (IF ANY)
☑ Microsoft Certified: Azure Fundamentals
☑ Microsoft Certified: Azure Developer Associate
☑ Other relevant certifications in Azure, .NET, or Cloud technologies

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Role: Senior DotNet Developer Experience: 8+ years Max Notice Period: Immediate Location: Trivandrum / Kochi
Introduction: Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.
Responsibilities include:
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
• Integrate and support third-party APIs and external services
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
• Participate in Agile/Scrum ceremonies and manage tasks using Jira
• Understand technical priorities, architectural dependencies, risks, and implementation challenges
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability
Certifications:
• Microsoft Certified: Azure Fundamentals
• Microsoft Certified: Azure Developer Associate
• Other relevant certifications in Azure, .NET, or Cloud technologies
Primary Skills: 8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• Expertise in MS SQL Server, including complex SQL queries, Stored Procedures, Views, Functions, Packages, Cursors, Tables, and Object Types
• Skilled in unit testing with xUnit, MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
• Strong problem-solving and debugging capabilities
• Ability to write reusable, testable, and efficient code
• Develop and maintain frameworks and shared libraries to support large-scale applications
• Excellent technical documentation, communication, and leadership skills
• Microservices and Service-Oriented Architecture (SOA)
• Experience in API integrations
2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions
• Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring
Secondary Skills:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps - CI/CD pipelines (Classic / YAML)

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Ready to embark on a journey where your growth is intertwined with our commitment to making a positive impact? Join the Delphi family, where growth meets values. At Delphi Consulting Pvt. Ltd., we foster a thriving environment with a hybrid work model that lets you prioritize what matters most. Interviews and onboarding are conducted virtually, reflecting our digital-first mindset. We specialize in Data, Advanced Analytics, AI, Infrastructure, Cloud Security, and Application Modernization, delivering impactful solutions that drive smarter, more efficient futures for our clients.
About the Role: We are looking for a Lead Consultant – Data Functional (Healthcare) to join our growing Data & AI team. The ideal candidate will bring deep functional expertise in Cerner and Dynamics 365, with a proven track record in healthcare data management, integration, analytics, and stakeholder engagement. This is a client-facing role where you will lead the design and implementation of data solutions, ensure compliance with healthcare regulations, and translate complex business needs into scalable, efficient data architectures. You will work closely with both internal teams and external healthcare clients across the data lifecycle, from requirement gathering and ETL design to insights generation and reporting.
What you'll do:
Collaborate with stakeholders to gather business requirements and convert them into functional data design documents.
Lead data integration efforts across Cerner, Dynamics 365, and other healthcare platforms like Epic.
Perform and lead functional data activities, including source-to-target mapping, master data management, data validation, and documentation.
Design and validate scalable data architectures in collaboration with engineering teams using Azure Data Factory, Databricks, Synapse Analytics, and SQL Server.
Conduct data analysis and generate insights for healthcare-specific use cases such as clinical operations, patient engagement, and safety.
Develop functional assets such as data dictionaries, mapping sheets, validation checklists, and test cases.
Ensure adherence to healthcare data compliance standards like HIPAA.
Build dashboards and reports using Cerner, Dynamics 365, and other sources. Exposure to Power BI or Microsoft Fabric is an added advantage.
Work closely with clients to identify gaps, propose solutions, and ensure successful implementation of data initiatives.
Create artifacts and collaterals for healthcare proposals, and play a key role in establishing a Healthcare Center of Excellence (CoE) from the ground up.
Contribute to pre-sales activities, including RFIs/RFPs, solution design, and client presentations.
What you'll bring:
• Domain expertise across Clinical Operations, Patient Safety, and Patient Experience.
• Extensive functional knowledge and hands-on experience with Cerner and Dynamics 365, and exposure to Epic.
• Proven experience in ETL design, data mapping, validation, and master data management.
• Proficiency with Azure Data Services: ADF, Synapse, Databricks, SQL Server.
• Familiarity with healthcare standards, workflows, and regulatory compliance.
• Demonstrated experience in building functional documentation and reusable artifacts.
• Strong client engagement skills; ability to translate requirements into data design and ensure end-to-end delivery.
• Experience contributing to business proposals and setting up data/healthcare-focused CoEs.
• Good to have: Working knowledge of Power BI, Microsoft Fabric, or other Microsoft data tools.
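Source-to-target mapping, one of the functional activities listed above, is usually captured in a mapping sheet and then applied mechanically during ETL. A toy sketch of that idea follows; the field names, codes, and the `PATIENT_MAPPING` spec are invented for illustration and do not come from Cerner or Dynamics 365.

```python
def apply_mapping(source_row, mapping):
    """Apply a source-to-target mapping spec to one source record.

    mapping -- list of (source_field, target_field, transform) rows,
               mirroring the columns of a mapping sheet; transform may
               be None for a straight copy.
    """
    target = {}
    for source_field, target_field, transform in mapping:
        value = source_row.get(source_field)
        target[target_field] = transform(value) if transform else value
    return target

# Hypothetical mapping from a clinical source system to a reporting model.
PATIENT_MAPPING = [
    ("PAT_ID", "patient_id", None),
    ("SEX_CD", "gender", lambda c: {"M": "Male", "F": "Female"}.get(c, "Unknown")),
]
```

Keeping the spec as data (rather than hard-coded logic) means the same sheet can drive both the transformation and the validation checklist.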
What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well-supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, wellness, and mental health programs to ensure you feel supported.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Thiruvananthapuram Taluk, India

On-site

JOB DESCRIPTION Candidates should have 8+ years of experience in the IT industry, with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud. Working hours: 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.
RESPONSIBILITIES
• Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery.
• Integrate and support third-party APIs and external services.
• Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack.
• Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC).
• Participate in Agile/Scrum ceremonies and manage tasks using Jira.
• Understand technical priorities, architectural dependencies, risks, and implementation challenges.
• Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.
PRIMARY SKILLS 8+ years of hands-on development experience with:
☑ C#, .NET Core 6/8+, Entity Framework / EF Core
☑ JavaScript, jQuery, REST APIs
☑ Expertise in MS SQL Server, including complex SQL queries, Stored Procedures, Views, Functions, Packages, Cursors, Tables, and Object Types
☑ Skilled in unit testing with xUnit, MSTest
☑ Strong in software design patterns, system architecture, and scalable solution design
☑ Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
☑ Strong problem-solving and debugging capabilities
☑ Ability to write reusable, testable, and efficient code
☑ Develop and maintain frameworks and shared libraries to support large-scale applications
☑ Excellent technical documentation, communication, and leadership skills
☑ Microservices and Service-Oriented Architecture (SOA)
☑ Experience in API integrations
2+ years of hands-on experience with Azure Cloud Services, including:
☑ Azure Functions
☑ Azure Durable Functions
☑ Azure Service Bus, Event Grid, Storage Queues
☑ Blob Storage, Azure Key Vault, SQL Azure
☑ Application Insights, Azure Monitoring
SECONDARY SKILLS (GOOD TO HAVE)
☑ Familiarity with AngularJS, ReactJS, and other front-end frameworks
☑ Experience with Azure API Management (APIM)
☑ Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
☑ Experience with Azure Data Factory (ADF) and Logic Apps
☑ Exposure to application support and operational monitoring
☑ Azure DevOps - CI/CD pipelines (Classic / YAML)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

**** Read the JD carefully, fill in the form below, and send your resume to the following mail id ****
Role: Azure DevOps Engineer with Databricks
Primary Skills: Azure DevOps, CI/CD, Databricks
Note: Must have integrated applications through CI/CD using data-related tooling.
Experience: 5-10 years
Location: Hyderabad, Pune, Bangalore
Mode of Hire: Contractor / Permanent (Full Time)
Please find the JD mentioned below.
5+ years in DevOps with strong data pipeline experience. The requirement is for Azure DevOps with Databricks, specifically using CI/CD pipelines for data implementation into data tools. Build and maintain CI/CD pipelines for Azure Data Factory and Databricks notebooks. The role demands deep expertise in Databricks, including the automation of unit, integration, and QA testing workflows. Additionally, strong data architecture skills are essential, as the position involves implementing CI/CD pipelines for schema updates. Strong experience with Azure DevOps Pipelines, YAML builds, and release workflows. Proficiency in scripting languages like Python, PowerShell, and Terraform. Working knowledge of Azure services: ADF, Databricks, DABs, ADLS Gen2, Key Vault, ADO. Maintain infrastructure-as-code practices. Collaborate with Data Engineers and Platform teams to maintain development, staging, and production environments. Monitor and troubleshoot pipeline failures and deployment inconsistencies.
Fill in this form: https://docs.google.com/forms/d/e/1FAIpQLSfU0Jf2DmetEIfXLjrVm5pbZRtBNazIiSDJbXwB6BlN0uWHhw/viewform?usp=header
Share your resume to this mail: aman.tyagi@firstwave-tech.com
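An Azure DevOps YAML pipeline of the kind this role describes typically gates deployment on tests before pushing Databricks notebooks and ADF artifacts to an environment. The fragment below is a skeletal sketch only: the script paths, `deploy_notebooks.py` name, and `staging` environment are hypothetical, not part of any real project.

```yaml
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  # Unit/integration tests gate the release.
  - script: pip install -r requirements.txt && pytest tests/
    displayName: Run test suite
  # Hypothetical deployment step, e.g. via the Databricks CLI or REST API.
  - script: python scripts/deploy_notebooks.py --env staging
    displayName: Deploy Databricks notebooks
```

In a fuller pipeline the deploy step would run once per environment (dev, staging, prod) in separate stages, with approvals between them.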

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Company Description UNIFY Dots is a global technology and software solutions company specializing in Microsoft Dynamics 365-based solutions. We are seeking a Business Intelligence Technical Consultant with experience in designing and implementing end-to-end business intelligence solutions using Microsoft Power BI and Azure Synapse Analytics. This is a full-time, work-from-home position.
Responsibilities Your job duties include the following tasks and responsibilities:
Understanding and documenting reporting, business intelligence, and dashboard requirements.
Performing data mapping between source systems like Microsoft Dynamics 365 Supply Chain and Finance ERP (“D365”), CRM, Dataverse, Fabric, Azure Synapse, ADLS, and Power BI.
Developing ADF/Synapse/Fabric pipelines for ETL/data warehousing.
Performing data ingestion.
Configuring and developing Power BI reports to generate standardized and on-demand executive dashboards, reports, metrics, and KPIs.
Enhancing and modifying existing Power BI reports, dashboards, and workspaces, including embedded Power BI reports in D365.
Analyzing data discrepancies and performing root cause analysis for reports showing different data than expected in the production environment.
Writing business intelligence and analytics technical blog articles per year on the Unify Dots blog.
Qualifications
Bachelor's Degree
2-4 years of experience using Power BI and at least one year of experience with Azure Synapse/Fabric Analytics
Microsoft Certification - Power BI Data Analyst Associate (PL-300)
Conversational skills in the English language
Experience with Power BI Desktop, Power BI Report Builder, Power BI Service, DAX, SQL, ETL, Azure DevOps
Exposure to Azure Data Factory
Additional Information Benefits: Market-competitive compensation. Medical insurance for team member + spouse + children + parents. Flexibility to work from home for the majority of the time. Laptop provided for working from home while at Unify Dots. A people-before-profit culture that values team members over financial numbers.

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Good day, We have immediate opportunity for Azure Data Engineer. Job Role: Azure Data Engineer Job Location: Kharadi , Pune Experience- 6 Years - 12 Years Notice Period: Immediate to 30 days. About Company: At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver industry-leading digital solutions. Synechron’s progressive technologies and optimization strategies span end-to-end Artificial Intelligence, Consulting, Digital, Cloud & DevOps, Data, and Software Engineering, servicing an array of noteworthy financial services and technology firms. Through research and development initiatives in our FinLabs we develop solutions for modernization, from Artificial Intelligence and Blockchain to Data Science models, Digital Underwriting, mobile-first applications and more. Over the last 20+ years, our company has been honoured with multiple employer awards, recognizing our commitment to our talented teams. With top clients to boast about, Synechron has a global workforce of 14,700+ and has 55 offices in 20 countries within key global markets. For more information on the company, please visit our website or LinkedIn community. Diversity, Equity, and Inclusion: Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative-action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. 
We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
Job Description: As an Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data management systems on the Azure cloud platform. You will be responsible for creating and handling scalable data pipelines, assuring data quality, and maximizing data processing performance. You will also work closely with other teams, such as data analysts, data scientists, and software developers, to provide them with the data they need to perform their job functions. We are looking for candidates with 8+ years of overall experience and a minimum of 4+ years' experience in Azure.
Technical Skills (Must Have): Azure Databricks, Spark, ADF (Azure Data Factory)
Optional Skills (Good to Have): Spark, Structured Streaming, SQL, GitLab
Responsibilities:
Designing and implementing data storage solutions on Azure
Building and maintaining data pipelines for data integration and processing
Ensuring data quality and accuracy through testing and validation
Developing and maintaining data models and schemas
Collaborating with other teams to provide data for analytics and reporting
Ensuring data security and privacy prerequisites are followed
Primary Skills: We require a professional with a career reflecting technical abilities coupled with hands-on experience in a diverse range of software projects:
Strong exposure to Databricks, Azure Data Factory, and ADLS
Strong exposure to Spark and structured streaming
Exposure to cloud integration and container services (Azure)
Oracle and MS-SQL experience; Terraform will be an asset
Expertise in managing repositories (GitLab)
Clean coding and refactoring skills and Test-Driven Development (TDD)
Performance optimization and scalability
Know-how of Agile development practices (Scrum, XP, Kanban, etc.)
Adaptable, able to work across teams, functions, and applications
Enthusiastic, self-motivated, and client-focused
Prior financial/banking experience is desirable
Secondary Skills:
Familiarity with data processing frameworks such as Apache Spark and Hadoop
Understanding of data modeling and schema design principles
Ability to work with large datasets and perform data analysis
Strong problem-solving and troubleshooting skills
If you find this opportunity interesting, kindly share the below details (mandatory):
Total experience -
Experience in Azure -
Experience in Power BI -
Experience in Databricks -
Current CTC -
Expected CTC -
Notice period -
Current location -
Have you gone through any interviews at Synechron before? If yes, when?
Regards, Recruitment Team, Pune

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Telangana, India

On-site

About Chubb JOB DESCRIPTION Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com. About Chubb India At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Position Details
Role: MLOps Engineer
Experience: 5-10 years
Mandatory skills: Python, MLOps, Docker and Kubernetes, FastAPI or Flask, CI/CD, Jenkins, Spark, SQL, RDB, Cosmos, Kafka, ADLS, API, Databricks
Other skills: Azure, LLMOps, ADF, ETL
Location: Bangalore
Notice period: less than 60 days

Job Description
We are seeking a talented and passionate machine learning engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.

Responsibilities:
- Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
- Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
- Integrate machine learning models seamlessly into existing production systems.
- Continuously monitor and evaluate model performance, and retrain models automatically or periodically.
- Streamline existing ML pipelines to increase throughput.
- Identify and address security vulnerabilities in existing applications proactively.
- Design, develop, and implement machine learning models, preferably for insurance-related applications.
- Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.

Requirements:
- Experience deploying ML models to production.
- Well versed in the Azure ecosystem.
- Knowledge of NLP and generative AI techniques; relevant experience is a plus.
- Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) is a plus.

Why Chubb? Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.
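The monitor-and-retrain responsibility described in this posting can be sketched in plain Python. This is only an illustrative skeleton, not Chubb's actual stack: the `RollingAccuracyMonitor` class, window size, and 0.8 threshold are all assumptions made for the example.

```python
from collections import deque

class RollingAccuracyMonitor:
    """Track a rolling window of prediction outcomes and flag when
    accuracy drops below a threshold, signalling that a retrain is due.
    Illustrative sketch; names and thresholds are assumptions."""

    def __init__(self, window_size=100, threshold=0.8):
        self.outcomes = deque(maxlen=window_size)  # 1 = correct, 0 = wrong
        self.threshold = threshold

    def record(self, prediction, actual):
        self.outcomes.append(1 if prediction == actual else 0)

    @property
    def accuracy(self):
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def needs_retrain(self):
        # Only trigger once the window holds enough evidence.
        return len(self.outcomes) == self.outcomes.maxlen and self.accuracy < self.threshold

monitor = RollingAccuracyMonitor(window_size=10, threshold=0.8)
for pred, actual in [(1, 1)] * 7 + [(1, 0)] * 3:  # 70% accuracy over a full window
    monitor.record(pred, actual)
print(monitor.needs_retrain())  # True: 0.7 < 0.8
```

In a production pipeline the `needs_retrain()` signal would typically be evaluated on a schedule (e.g. a Jenkins or Databricks job) and used to kick off an automated retraining run.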
Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026.
Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits and car lease that help employees optimally plan their finances.
Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs and access to global learning programs.
Health and Welfare Benefits: We care about our employees' well-being in and out of work, with benefits like an Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.

Apply Now: Chubb External Careers
Qualifications tbd

Posted 2 weeks ago

Apply


4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth.
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
- Extensively worked on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: minimum 4 years of Oracle Fusion experience
Education qualification: any graduate or postgraduate

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Master Degree, Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth.
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that can seamlessly adapt to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
- Extensively worked on Oracle Integration Cloud (OIC) and PL/SQL.

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, Oracle Integration Cloud (OIC), and PL/SQL
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: minimum 2+ years of Oracle Fusion experience
Education qualification: BE/BTech, MBA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Location: HYDERABAD OFFICE INDIA

Job Description
Are you looking to take your career to the next level? We're looking for a DevOps Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will leverage modern Agile and DevOps practices to design, develop, test and deploy IT systems and applications, delivering global projects in multinational teams.

The P&G Core Data Lake Platform is a central component of the P&G data and analytics ecosystem. The CDL Platform is used to deliver a broad scope of digital products and frameworks used by data engineers and business analysts. In this role you will have an opportunity to leverage your data engineering skillset to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which the P&G business operates, we combine data engineering best practices (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.

Responsibilities:
- Writing and testing code for Data & Analytics platform applications and building end-to-end cloud-native (Azure) solutions.
- Engineering applications throughout their entire lifecycle: development, deployment, upgrade and replacement/termination.
- Ensuring that development and architecture conform to established standards, including modern software engineering practices (CI/CD, Agile, DevOps).
- Collaborating with internal technical specialists and vendors to develop the final product, improve overall performance and efficiency, and enable adaptation of new business processes.

Job Qualifications
- Bachelor's degree in computer science or a related technical field.
- 8+ years of experience working as a software/data engineer (with a focus on developing in Python, PySpark, Databricks, ADF).
- Experience leveraging modern software engineering practices (code standards, Gitflow, automated testing, CI/CD, DevOps).
- Experience working with cloud infrastructure (Azure preferred).
- Strong verbal, written, and interpersonal communication skills.
- A strong desire to produce high-quality software through cross-functional collaboration, testing, code reviews, and other best practices.

You should also have:
- Strong written and verbal English communication skills to influence others.
- Demonstrated use of data and tools.
- Ability to handle multiple priorities.
- Ability to work collaboratively across different functions and geographies.

Job Schedule: Full time
Job Number: R000134774
Job Segmentation: Experienced Professionals
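The data-cataloging and discoverability work this role describes often starts with inferring dataset metadata from sample records. Below is a minimal, stdlib-only Python sketch; the function name and the catalog-entry shape are assumptions for illustration, not P&G's actual platform (real platforms such as Databricks expose far richer catalog metadata).

```python
def infer_catalog_entry(name, rows):
    """Infer a simple catalog entry (column names, value types, null counts)
    from a sample of records. Illustrative only."""
    columns = {}
    for row in rows:
        for key, value in row.items():
            col = columns.setdefault(key, {"types": set(), "nulls": 0})
            if value is None:
                col["nulls"] += 1
            else:
                col["types"].add(type(value).__name__)
    return {
        "dataset": name,
        "row_count": len(rows),
        "columns": {
            k: {"types": sorted(v["types"]), "nulls": v["nulls"]}
            for k, v in columns.items()
        },
    }

sample = [
    {"id": 1, "country": "IN"},
    {"id": 2, "country": None},
]
entry = infer_catalog_entry("shipments", sample)
print(entry["columns"]["country"])  # {'types': ['str'], 'nulls': 1}
```

An entry like this could then be published to a search index so analysts can discover datasets by column name or type; the same pattern scales to Spark DataFrames by reading the schema instead of sampling rows.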

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies