5.0 years
15 - 24 Lacs
Bengaluru
On-site
Job Title: Senior Data Engineer – Azure | ADF | Databricks | PySpark | AWS
Location: Bangalore, Hyderabad, Chennai (Hybrid Mode)
Experience Required: 5+ Years
Notice Period: Immediate

Job Description
We are looking for a Senior Data Engineer who is passionate about designing and developing scalable data pipelines, optimizing data architecture, and working with advanced big data tools and cloud platforms. This is a great opportunity to be a key player in transforming data into meaningful insights by leveraging modern data engineering practices on Azure, AWS, and Databricks. You will be working with cross-functional teams including data scientists, analysts, and software engineers to deliver robust data solutions. The ideal candidate will be technically strong in Azure Data Factory, PySpark, Databricks, and AWS services, and will have experience in building end-to-end ETL workflows and driving business impact through data.

Key Responsibilities
· Design, build, and maintain scalable and reliable data pipelines and ETL workflows
· Implement data ingestion and transformation using Azure Data Factory (ADF) and Azure Databricks (PySpark)
· Work across multiple data platforms including Azure, AWS, Snowflake, and Redshift
· Collaborate with data scientists and business teams to understand data needs and deliver solutions
· Optimize data storage, processing, and retrieval for performance and cost-effectiveness
· Develop data quality checks and monitoring frameworks for pipeline health
· Ensure data governance, security, and compliance with industry standards
· Lead code reviews, set data engineering standards, and mentor junior team members
· Propose and evaluate new tools and technologies for continuous improvement

Must-Have Skills
· Strong programming skills in Python, SQL, or Scala
· Azure Data Factory, Azure Databricks, Synapse Analytics
· Hands-on with PySpark, Spark, Hadoop, Hive
· Experience with cloud platforms (Azure preferred; AWS/GCP acceptable)
· Data warehousing: Snowflake, Redshift, BigQuery
· Strong ETL/ELT pipeline development experience
· Workflow orchestration tools such as Airflow, Prefect, or Luigi
· Excellent problem-solving, debugging, and communication skills

Nice to Have
· Experience with real-time streaming tools (Kafka, Flink, Spark Streaming)
· Exposure to data governance tools and regulations (GDPR, HIPAA)
· Familiarity with ML model integration into data pipelines
· Containerization and CI/CD exposure: Docker, Git, Kubernetes (basic)
· Experience with vector databases and unstructured data handling

Technical Environment
· Programming: Python, Scala, SQL
· Big Data Tools: Spark, Hadoop, Hive
· Cloud Platforms: Azure (ADF, Databricks, Synapse), AWS (S3, Glue, Lambda), GCP
· Data Warehousing: Snowflake, Redshift, BigQuery
· Databases: PostgreSQL, MySQL, MongoDB, Cassandra
· Orchestration: Apache Airflow, Prefect, Luigi
· Tools: Git, Docker, Azure DevOps, CI/CD pipelines

Soft Skills
· Strong analytical thinking and problem-solving abilities
· Excellent verbal and written communication
· Collaborative team player with leadership qualities
· Self-motivated, organized, and able to manage multiple projects

Education & Certifications
· Bachelor’s or Master’s Degree in Computer Science, IT, Engineering, or equivalent
· Cloud certifications (e.g., Microsoft Azure Data Engineer, AWS Big Data) are a plus

Key Result Areas (KRAs)
· Timely delivery of high-performance data pipelines
· Quality of data integration and governance compliance
· Business team satisfaction and data readiness
· Proactive optimization of data processing workloads

Key Performance Indicators (KPIs)
· Pipeline uptime and performance metrics
· Reduction in overall data latency
· Zero critical issues in production post-release
· Stakeholder satisfaction score
· Number of successful integrations and migrations

Job Types: Full-time, Permanent
Pay: ₹1,559,694.89 - ₹2,441,151.11 per year
Benefits: Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Application Question(s): What is your notice period in days?
Experience: Azure Data Factory, Azure Databricks, Synapse Analytics: 5 years (Required); Python, SQL, or Scala: 5 years (Required)
Work Location: In person
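The "data quality checks and monitoring frameworks for pipeline health" responsibility above can be sketched as a small validation gate. This is a minimal illustration in plain Python (in a real pipeline the same logic would live in a PySpark or ADF step); the rule names, fields, and sample records are invented for the example.

```python
# Minimal sketch of a pipeline data-quality gate. All field names and
# thresholds below are illustrative assumptions, not from the posting.

def run_quality_checks(rows, required_fields, non_negative_fields=()):
    """Return a list of (row_index, issue) tuples for rows failing checks."""
    issues = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        for field in non_negative_fields:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                issues.append((i, f"negative {field}"))
    return issues

records = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "", "amount": -5.0},   # fails both checks
]
problems = run_quality_checks(records, required_fields=["order_id"],
                              non_negative_fields=["amount"])
print(problems)  # [(1, 'missing order_id'), (1, 'negative amount')]
```

A monitoring framework would feed the returned issues into alerting rather than just printing them.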
Posted 1 month ago
7.0 - 12.0 years
20 - 27 Lacs
Bengaluru
Work from Office
TECHNICAL SKILLS AND EXPERIENCE
Most important:
· 7+ years of professional experience as a data engineer, with at least 4 utilizing cloud technologies
· Proven experience building ETL or ELT data pipelines with Databricks, either in Azure or AWS, using PySpark
· Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Gen2, ADF, etc.)
· Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8
· Proven experience with unit testing and applying appropriate testing methodologies using libraries such as Pytest, Great Expectations, or similar
· Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, PowerShell, and Bash scripting or similar
· Strong understanding of data modeling, data warehousing, and OLAP concepts
· Excellent technical documentation skills
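The unit-testing requirement above (Pytest against pipeline code) typically looks like testing a pure transformation function in isolation. A minimal sketch, with a made-up transform and field names; on Databricks the same pattern applies to a function that takes and returns a DataFrame:

```python
# Hypothetical record-level transform plus Pytest-style tests.
# Run with `pytest this_file.py`; field names are invented.

def normalise_customer(record):
    """Trim whitespace, lowercase the email, and flag incomplete records."""
    email = (record.get("email") or "").strip().lower()
    return {
        "name": (record.get("name") or "").strip(),
        "email": email,
        "is_complete": bool(email and record.get("name")),
    }

def test_normalise_customer_trims_and_lowercases():
    out = normalise_customer({"name": "  Ada ", "email": " ADA@X.COM "})
    assert out == {"name": "Ada", "email": "ada@x.com", "is_complete": True}

def test_normalise_customer_flags_missing_email():
    assert normalise_customer({"name": "Ada"})["is_complete"] is False
```

Great Expectations covers the complementary case: asserting properties of the *data* (nulls, ranges, uniqueness) rather than of the code.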
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
· Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions
· Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications
· Document and communicate milestones/stages for end-to-end delivery
· Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality
· Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency
· Validate results with user representatives, integrating the overall solution seamlessly
· Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes
· Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools
· Influence and improve customer satisfaction through effective data solutions

Measures of Outcomes
· Adherence to engineering processes and standards
· Adherence to schedule/timelines
· Adherence to SLAs where applicable
· Number of defects post delivery
· Number of non-compliance issues
· Reduction of recurrence of known defects
· Quick turnaround of production bugs
· Completion of applicable technical/domain certifications
· Completion of all mandatory training requirements
· Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
· Average time to detect, respond to, and resolve pipeline failures or data issues
· Number of data security incidents or compliance breaches

Outputs Expected
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
· Proficiency in SQL, Python, or other programming languages used for data manipulation
· Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF
· Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery)
· Conduct tests on data pipelines and evaluate results against data quality and performance specifications
· Experience in performance tuning of data processes
· Expertise in designing and optimizing data warehouses for cost efficiency
· Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets
· Capacity to clearly explain and communicate design and development aspects to customers
· Ability to estimate time and resource requirements for developing and debugging features or components

Knowledge Examples
· Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, and Azure ADF and ADLS
· Proficiency in SQL for analytics, including windowing functions
· Understanding of data schemas and models relevant to various business contexts
· Familiarity with domain-related data and its implications
· Expertise in data warehousing optimization techniques
· Knowledge of data security concepts and best practices
· Familiarity with design patterns and frameworks in data engineering

Additional Comments
Tech skills:
· Proficient in Python (including popular Python packages, e.g., Pandas, NumPy) and SQL
· Strong background in distributed data processing and storage (e.g., Apache Spark, Hadoop)
· Large-scale (TBs of data) data engineering skills: model data and create production-ready ETL pipelines
· Development experience with at least one cloud (Azure highly preferred; AWS, GCP)
· Knowledge of data lake and data lakehouse patterns
· Knowledge of ETL performance tuning and cost optimization
· Knowledge of data structures and algorithms and good software engineering practices
Soft skills:
· Strong communication skills to articulate complex situations concisely
· Comfortable with picking up new technologies independently
· Eye for detail, good data intuition, and a passion for data quality
· Comfortable working in a rapidly changing environment with ambiguous requirements
Skills: Python, SQL, AWS, Azure
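The "SQL for analytics including windowing functions" knowledge item above can be illustrated with a tiny example using the standard-library sqlite3 module. The table and column names are made up; the same SQL syntax works on warehouses such as Snowflake or BigQuery:

```python
# Window-function example: rank each sale within its region by amount.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 50);
""")

rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
# ('east', 300, 1) ... ('west', 50, 2)
```

Unlike GROUP BY, the window function keeps every input row while computing the per-partition rank alongside it.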
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Location: Hyderabad, Indore and Ahmedabad (India)

What You Will Do (high-level responsibilities, but not limited to):
· Develop and maintain data pipelines using Azure Data Factory (ADF)/Databricks for data integration and ETL processes
· Design, implement, and optimize Power BI/Fabric reports and dashboards to deliver actionable business insights
· Collaborate with data analysts, business users, and other teams to understand data requirements and deliver solutions using ADF and Power BI
· Extract, transform, and load (ETL) data from various sources into cloud-based storage systems such as Azure Data Lake or Azure SQL Database
· Work with large datasets and optimize queries and pipelines for performance and scalability
· Ensure data quality, integrity, and availability throughout the data lifecycle
· Automate repetitive data tasks, ensuring timely and accurate reporting
· Monitor and troubleshoot data pipelines, addressing any performance or data issues promptly
· Support data visualization and reporting tools, including Power BI, to enable business stakeholders to make data-driven decisions
· Write clear, efficient, and maintainable code for data transformations and automation

Required Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
· 8+ years of hands-on experience in a Data Engineering, BI Developer, or similar role
· Proficiency with Azure Data Factory (ADF), including the creation of data pipelines and managing data flows
· Strong experience in Power BI, including report creation, dashboard development, and data modeling
· Experience with SQL and database management (e.g., Azure SQL Database, SQL Server)
· Knowledge of cloud platforms, especially Microsoft Azure
· Familiarity with data warehousing concepts and ETL processes
· Experience working with cloud-based data storage solutions (e.g., Azure Data Lake, Azure Blob Storage)
· Strong programming skills in languages such as Python, SQL, or other relevant languages
· Ability to troubleshoot and optimize data pipelines for performance and reliability

Preferred Qualifications:
· Familiarity with data modeling techniques and practices for Power BI
· Knowledge of Azure Databricks or other data processing frameworks
· Knowledge of Microsoft Fabric or other cloud platforms

What we need:
· B.Tech in Computer Science or equivalent

Why join us?
· Work with a passionate and innovative team in a fast-paced, growth-oriented environment
· Gain hands-on experience with exposure to real-world projects
· Opportunity to learn from experienced professionals
· Contribute to exciting initiatives and make an impact from day one
· Potential for growth within the company
· Recognized for excellence in data and AI solutions with industry awards and accolades
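ADF pipelines of the kind described above commonly implement incremental loads with a high-watermark pattern: only rows modified since the last successful run are pulled. A minimal sketch in plain Python (the source rows and timestamp field are invented; in ADF the watermark usually lives in a control table):

```python
# High-watermark incremental load, illustrated with plain dicts/lists.

def incremental_load(source_rows, last_watermark):
    """Return rows newer than the watermark plus the new watermark value."""
    new_rows = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified_at": "2024-01-01"},
    {"id": 2, "modified_at": "2024-02-10"},
    {"id": 3, "modified_at": "2024-03-05"},
]
rows, wm = incremental_load(source, last_watermark="2024-01-31")
print(len(rows), wm)  # 2 2024-03-05
```

Persisting the returned watermark only after the load commits is what makes the pattern safe to re-run.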
Posted 1 month ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role: Azure Integration Engineer
Experience: 5 to 8 years
Location: Indore
Immediate joiners preferred

Must Have:
· Proficiency in Azure Logic Apps, Azure API Management, Azure Service Bus, Azure Event Grid, ADF, C#/.NET, and Azure Functions
· Experience with JSON, XML, and other data formats
· Working experience with Azure DevOps and GitHub
· Knowledge of integration monitoring and lifecycle management

Roles & Responsibilities:
· Designing, developing, and deploying integration workflows using Azure Logic Apps
· Creating and managing APIs using Azure API Management
· Developing event-driven solutions with Azure Event Grid and Azure Service Bus
· Building serverless functions with Azure Functions to support integration logic
· Developing data transformations and mappings
· Implementing integration patterns such as API integration, message queuing, and event-driven architecture
· Working with different data formats (e.g., JSON, XML) and protocols (SOAP, REST, etc.)
· Performing unit testing, helping with integration testing, and supporting UAT
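The "data transformations and mappings" between formats mentioned above often reduce to mapping a JSON payload to XML (or back) inside a function or workflow step. A minimal stdlib-only sketch; the order/line schema is invented for the example:

```python
# JSON-to-XML mapping of the sort an integration workflow performs.
import json
import xml.etree.ElementTree as ET

def json_order_to_xml(payload: str) -> str:
    """Map a hypothetical JSON order document to an XML representation."""
    data = json.loads(payload)
    order = ET.Element("Order", id=str(data["orderId"]))
    for item in data["items"]:
        line = ET.SubElement(order, "Line", sku=item["sku"])
        line.text = str(item["qty"])
    return ET.tostring(order, encoding="unicode")

xml_out = json_order_to_xml('{"orderId": 7, "items": [{"sku": "A", "qty": 2}]}')
print(xml_out)  # <Order id="7"><Line sku="A">2</Line></Order>
```

In Logic Apps the equivalent is usually a Liquid/XSLT map; in an Azure Function, code like the above runs directly.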
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS! TCS is hiring for Azure Data Engineer.
Experience: 8-10 years
Location: Kolkata/Pune/Mumbai/Bangalore

Must-Have
· Strong experience in Azure Data Factory, Azure Databricks (ADB), Synapse, and PySpark; establishing cloud connectivity between different systems such as ADLS, ADF, Synapse, and Databricks
· A minimum of 7 years' experience with large SQL data marts
· Expert relational database experience; the candidate should demonstrate the ability to navigate massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners
· Minimum 7 years of troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues and user access/management; designing database backup, archiving, and storage; performance tuning; ETL importing of large volumes of data extracted from multiple systems; capacity planning
· Experience in T-SQL programming along with the Azure Data Factory framework and Python scripting
· Works well independently as well as within a team
· Proactive, organized, with excellent analytical and problem-solving skills
· Flexible and willing to learn; a can-do attitude is key
· Strong verbal and written communication skills

Good-to-Have
· Financial institution data mart experience is an asset
· Experience in .NET applications is an asset
· Experience and expertise in Tableau-driven dashboard design is an asset

Responsibility of / Expectations from the Role
· Azure Data Engineer (ADF, ADB): ETL processes using frameworks like Azure Data Factory, Synapse, or Databricks
· Establishing cloud connectivity between different systems such as ADLS, ADF, Synapse, and Databricks
· T-SQL programming along with the Azure Data Factory framework and Python scripting
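The "ETL importing of large volumes of data" into SQL data marts listed above typically needs an idempotent load step: an upsert that can be re-run safely. A minimal sketch using the stdlib sqlite3 module, where SQLite's ON CONFLICT clause stands in for T-SQL's MERGE; the dimension table is invented for the example:

```python
# Idempotent upsert into a (hypothetical) data-mart dimension table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

def upsert_customers(conn, rows):
    """Insert new rows; update the name on key collision (MERGE-like)."""
    conn.executemany(
        """INSERT INTO dim_customer (id, name) VALUES (?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name""",
        rows,
    )

upsert_customers(conn, [(1, "Acme"), (2, "Globex")])
upsert_customers(conn, [(2, "Globex Ltd"), (3, "Initech")])  # update + insert

result = conn.execute("SELECT * FROM dim_customer ORDER BY id").fetchall()
print(result)  # [(1, 'Acme'), (2, 'Globex Ltd'), (3, 'Initech')]
```

Because re-running the same batch leaves the table unchanged, a failed pipeline run can simply be retried.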
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Lead Technical Architect (Strategy & Optimization, Data Lake & Analytics)

Responsibilities:
· Manage project delivery: scope, timelines, budget, resource allocation, and risk mitigation
· Develop and maintain robust data ingestion pipelines (batch, streaming, API)
· Provide architectural inputs during incident escalations and act as final authority for RCA documentation and closure across ADF, Power BI, and Databricks
· Define and enforce data governance, metadata, and quality standards across zones
· Monitor performance, optimize data formats (e.g., Parquet), and tune for cost-efficiency; tune query performance for Databricks and Power BI datasets using optimization techniques (e.g., caching, BI Engine, materialized views)
· Lead and mentor a team of data engineers, fostering skills in Azure services and DevOps; guide schema designs for new datasets and integrations aligned with Diageo’s analytics strategy
· Coordinate cross-functional stakeholders (security, DevOps, business) for aligned execution
· Oversee incident and change management with SLA adherence and continuous improvement; serve as the governance owner for SLA compliance, IAM policies, encryption standards, and data retention strategies
· Ensure compliance with policies (RBAC, ACLs, encryption) and regulatory audits; perform initial data collection for RCA
· Report project status, KPIs, and business value to senior leadership; lead monthly and quarterly reviews, presenting insights, improvements, and roadmap alignment to Diageo stakeholders

Required Skills
· Strong architecture-level expertise in the Azure data platform (ADLS, ADF, Databricks, Synapse, Power BI)
· Deep understanding of data lake zone structuring, data lineage, metadata governance, and compliance (e.g., GDPR, ISO)
· Expert in Spark, PySpark, SQL, JSON, and automation tooling (ARM, Bicep; Terraform optional)
· Capable of aligning technical designs with business KPIs and change control frameworks
· Excellent stakeholder communication, team mentoring, and leadership capabilities
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Job Title: Senior BI Developer (Microsoft BI Stack)
Location: Remote
Experience: 5+ years
Employment Type: Full-Time

Job Summary
We are looking for an experienced Senior BI Developer with strong expertise in the Microsoft BI Stack (SSIS, SSRS, SSAS) to join our dynamic team. The ideal candidate will design and develop scalable BI solutions and contribute to strategic decision-making through efficient data modeling, ETL processes, and insightful reporting.

Key Responsibilities
· Design and develop ETL packages using SSIS for data extraction, transformation, and loading from diverse sources
· Create and maintain dashboards and reports using SSRS and Power BI (if applicable)
· Implement and manage OLAP cubes and data models using SSAS (Multidimensional/Tabular)
· Develop and optimize complex T-SQL queries, stored procedures, and functions
· Work closely with business analysts, data engineers, and stakeholders to gather requirements and translate them into technical solutions
· Optimize BI solutions for performance and scalability
· Lead BI architecture improvements and ensure efficient data flow
· Ensure data quality, integrity, and consistency across systems
· Mentor and support junior BI developers as needed

Required Skills & Qualifications
· Bachelor’s degree in Computer Science, Information Systems, or a related field
· 5+ years of hands-on experience with the Microsoft BI Stack: SSIS, SSRS, SSAS
· Strong knowledge of SQL Server (2016 or later) and advanced T-SQL
· Deep understanding of data warehousing concepts, including star/snowflake schemas and fact/dimension models
· Experience with Power BI is a plus
· Exposure to Azure Data Services (ADF, Azure SQL, Synapse) is an added advantage
· Strong analytical, troubleshooting, and problem-solving skills
· Excellent verbal and written communication skills

Why Join Us?
· Opportunity to work on enterprise-scale BI projects
· Supportive work environment with career growth potential
· Exposure to modern BI tools and cloud technologies

Skills: Azure, Power BI, SSIS, T-SQL, SSRS, SSAS, SQL, SQL Server, Azure Data Services
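The star-schema and fact/dimension modeling the posting asks for can be shown in miniature: one fact table of measures joined to a dimension for slicing. The schema below is invented for illustration (using stdlib sqlite3 so it runs anywhere); the same shape applies to SQL Server tables behind an SSAS model:

```python
# Tiny star schema: a fact table joined to a dimension, then aggregated.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, amount INTEGER);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales VALUES (1, 10), (1, 15), (2, 40);
""")

report = conn.execute("""
    SELECT d.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()

print(report)  # [('Books', 25), ('Games', 40)]
```

The fact table stays narrow (keys plus measures) while descriptive attributes live in the dimension, which is what keeps such schemas fast to aggregate.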
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Information Technology expert with 5+ years of banking domain knowledge.
· Banking application development and implementation experience across offshore and onshore models, including but not limited to solution design, development, and support activities
· 4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for lending and deposit products and origination workflow, covering online as well as batch solutions and integrations
· Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion, and OBP technologies
· Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology
· 3+ years of experience with automation testing using tools like Selenium
· Building framework components and business process patterns
· Works closely with the Technical Solution Architect and the process functional lead

Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices

Key Contributions
· Solution analysis and redesign of application components to stabilize the OBP platform, mainly across the Oracle BPM, Oracle SOA, and Oracle ADF technology components
· Solution design and implementation for application retrofits and migration to new Oracle hardware
· Key bug fixes and solution design, reviewed with Oracle Banking Platform product teams
· Automation design in various solution components
· Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solution for the OBP platform
· Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements
· Multi-environment deployment and testing coordination for live system changes

What’s in it for you? We are not just a technology company full of people, we’re a people company full of technology. It is people like you who make us what we are today.
Welcome to our world: Our people, our culture, our voices, and our passions. What’s better than building the next big thing? It’s doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin At Infosys, we recognize that everyone has individual requirements. If you are a person with disability, illness or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustment only on Infosys_ta@infosys.com or include your preferred method of communication in email and someone will be in touch. Please note in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third party vendors. In the absence of a signed agreement any submission will be deemed as non-binding and Infosys explicitly reserves the right to pursue and hire the submitted profile. All recruitment activity must be coordinated through the Talent Acquisition department.
Posted 1 month ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Candidates with 8+ years of experience in the IT industry and strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role, so strong communication skills are required. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.

Must have (primary skills): 8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Skilled in unit testing with xUnit and MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership

2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions and Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring

Nice to have:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps CI/CD pipelines (Classic/YAML)

Working hours: 8 hours, with a mandatory 4-hour overlap with the EST time zone (12 PM - 9 PM), as meetings happen during the overlap window.
Posted 1 month ago
0.0 - 10.0 years
0 Lacs
Masjid, Mumbai, Maharashtra
On-site
Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel.

Age: 45-50 years
Location: Mumbai

Qualification and Experience:
Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field. Experience with BI tools such as Tableau, Power BI, or similar is an added advantage. Minimum 10-15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank. Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential.

Key Responsibilities:
1. MIS Strategy & Planning: Develop and implement an effective MIS framework to support strategic and operational objectives. Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow.
2. Data Collection, Processing & Reporting: Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments. Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators. Prepare and submit statutory and compliance reports to RBI, NABARD, the State Registrar, etc.
3. Regulatory & Compliance Reporting: Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF, etc.) are accurate and timely. Track regulatory changes and incorporate them into reporting frameworks.
4. Performance & Operational Dashboards: Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, and branch performance. Provide analytics support to business heads for performance analysis and forecasting.
5. Data Governance & Quality: Maintain high standards of data integrity, consistency, and security across systems. Conduct regular audits and validations of MIS data to identify and correct discrepancies.
6. System Enhancement & Automation: Liaise with the IT department and software vendors to automate recurring reports. Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions.
7. Support to Management: Assist senior management with ad-hoc analysis, strategic reviews, and Board-level presentations. Provide MIS support for product planning, regulatory inspections, audits, and business strategy.

Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Masjid, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: Management Information Systems: 10 years (Preferred)
Work Location: In person
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Information Technology expert with 5+ years of banking domain knowledge.
· Banking application development and implementation experience across offshore and onshore models, including but not limited to solution design, development, and support activities
· 4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for lending and deposit products and origination workflow, covering online as well as batch solutions and integrations
· Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion, and OBP technologies
· Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology
· 3+ years of experience with automation testing using tools like Selenium
· Building framework components and business process patterns
· Works closely with the Technical Solution Architect and the process functional lead

Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices

Key Contributions
· Solution analysis and redesign of application components to stabilize the OBP platform, mainly across the Oracle BPM, Oracle SOA, and Oracle ADF technology components
· Solution design and implementation for application retrofits and migration to new Oracle hardware
· Key bug fixes and solution design, reviewed with Oracle Banking Platform product teams
· Automation design in various solution components
· Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solution for the OBP platform
· Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements
· Multi-environment deployment and testing coordination for live system changes

What’s in it for you? We are not just a technology company full of people, we’re a people company full of technology. It is people like you who make us what we are today.
Welcome to our world: Our people, our culture, our voices, and our passions. What’s better than building the next big thing? It’s doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin At Infosys, we recognize that everyone has individual requirements. If you are a person with disability, illness or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustment only on Infosys_ta@infosys.com or include your preferred method of communication in email and someone will be in touch. Please note in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third party vendors. In the absence of a signed agreement any submission will be deemed as non-binding and Infosys explicitly reserves the right to pursue and hire the submitted profile. All recruitment activity must be coordinated through the Talent Acquisition department.
Posted 1 month ago
0 years
0 Lacs
India
On-site
Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation, and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore, serving a client base across banking and financial services, insurance, information technology, healthcare, retail, and supply chain. *Singapore, onsite. Looking for short-notice candidates only.* Job Description: We are looking for an experienced Oracle IAM Consultant to join our dynamic team. Design and Implementation: Designing and implementing IAM solutions using OIM/OIG. Developing custom connectors to integrate OIM with various applications and systems. Building and configuring OIM workflows, approval policies, and entitlements. Developing custom UI components for OIM self-service pages. Skills and Experience: Experienced in end-to-end integration of an IAM solution using Oracle Identity Governance. Prior experience with requirement gathering, analysis, design, development, maintenance, and upgrades in different environments like DEV, QA, UAT, and PROD. Experience with ICF-based framework connectors to integrate with target applications, perform CRUD operations, and manage roles in the target system. Extensive hands-on experience with custom code development such as event handlers, validation plugins, and scheduled tasks using the Java API. Experience with audit reports in OIM BI Publisher, including customizing the logo and header of the UI screen and audit reports. Implement Oracle ADF customizations for user interfaces. Build custom Oracle SOA composites for workflows. Java Experience: Best-practice-based secure Java development. Exposure and hands-on experience with REST APIs and web services. Ability to re-use existing code and extend frameworks. Administration and Management: Administering and managing OIM environments. 
Ensuring the IAM platform is secure, scalable, and supports business requirements. Monitoring the performance and health of IAM systems. Security and Compliance: Developing and enforcing IAM policies and procedures. Collaborating with security teams to address vulnerabilities. Support and Troubleshooting: Supporting end-users with access-related issues and requests. Troubleshooting and resolving technical issues related to the OIM implementation. Good to Have: Hands-on experience with Oracle Access Manager. Good understanding of AS400 and relevant infrastructure. Unix scripting. Strong SQL knowledge. WHAT’S ON OFFER You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression. To submit your application, please apply online or email your UPDATED CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality. CONSULTANT DETAILS: Consultant Name: Swathi, Avensys Consulting Pte Ltd, EA Licence 12C5759. Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title TECHNICAL ANALYST Job Description Job Title: ADF Data Engineer Experience: 5 to 10 years. Responsibilities: • Convert Workato recipes into Azure Data Factory (ADF) pipelines to facilitate seamless data integration. • Design, develop, and maintain ADF pipelines to connect and orchestrate data flow between Snowflake and Salesforce. • Collaborate with cross-functional teams to understand data requirements and ensure efficient data integration. • Optimize data pipelines for performance, scalability, and reliability. • Implement data quality checks and monitoring to ensure data accuracy and consistency. • Troubleshoot and resolve issues related to data integration and pipeline performance. • Document data integration processes and maintain up-to-date technical documentation. Qualifications: • Bachelor's degree in Computer Science, Information Technology, or a related field. • Proven experience as a Data Engineer, with a focus on Azure Data Factory, data pipelines, and data integration. • Strong proficiency in SQL and experience working with Snowflake and Salesforce. • Knowledge of ETL/ELT processes and best practices. • Familiarity with data warehousing concepts and cloud-based data solutions. • Excellent problem-solving skills and attention to detail. • Strong communication and collaboration skills. Preferred Qualifications: • Experience with other Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Functions. • Certification in Azure Data Engineering or related fields. • Experience with version control systems like Git. • Experience with Workato or similar integration platforms.
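The responsibilities above call for data quality checks and monitoring to keep integration pipelines accurate and consistent. A minimal Python sketch of the kind of rule-based batch checks such a pipeline stage might run; the record layout, field names, and thresholds are illustrative assumptions, not taken from the posting:

```python
# Illustrative rule-based data quality checks for one pipeline batch.
# Record layout and thresholds below are hypothetical examples.

def null_rate(rows, field):
    """Fraction of rows where `field` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)

def run_checks(rows, required_fields, max_null_rate=0.05, min_rows=1):
    """Return a list of (check_name, passed) results for a batch."""
    results = [("row_count>=%d" % min_rows, len(rows) >= min_rows)]
    for f in required_fields:
        rate = null_rate(rows, f)
        results.append(("null_rate[%s]<=%.2f" % (f, max_null_rate),
                        rate <= max_null_rate))
    return results

batch = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": "A2", "amount": None},   # will trip the amount null check
]
report = run_checks(batch, ["account_id", "amount"], max_null_rate=0.05)
```

In a real ADF setup the equivalent rules would typically run as a validation activity or notebook step, with failures routed to alerting rather than returned as a list.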
Posted 1 month ago
8.0 years
2 - 4 Lacs
Bengaluru
On-site
If you are looking for a challenging and exciting career in the world of technology, then look no further. Skyworks is an innovator of high performance analog semiconductors whose solutions are powering the wireless networking revolution. At Skyworks, you will find a fast-paced environment with a strong focus on global collaboration, minimal layers of management and the freedom to make meaningful contributions in a setting that encourages creativity and out-of-the-box thinking. Our work culture values diversity, social responsibility, open communication, mutual trust and respect. We are excited about the opportunity to work with you and glad you want to be part of a team of talented individuals who together can change the way the world communicates. Requisition ID: 75243 Description We are seeking a highly skilled and experienced Sr. Principal Enterprise Integration Architect to join our team. The ideal candidate will have a strong background in enterprise integration architecture and extensive experience working with global teams and people. This role is critical in ensuring that all applications company-wide are managed into a world-class portfolio. The Sr. 
Principal Enterprise Integration Architect will play a pivotal role in designing, architecting, developing, and supporting integration solutions globally. Responsibilities Lead the design and implementation of enterprise integration solutions using Azure iPaaS or similar middleware tools. Collaborate with global teams to ensure seamless integration of applications across the company. Develop and maintain integration architecture standards and best practices. Manage the integration portfolio, ensuring all applications are aligned with the company's strategic goals. Provide technical leadership and guidance to the integration team. Oversee the development and support of integration solutions, ensuring high availability and performance. Conduct regular reviews and assessments of integration solutions to identify areas for improvement. Work closely with stakeholders to understand business requirements and translate them into technical solutions. Ensure compliance with security and regulatory requirements in all integration solutions. Required Experience and Skills Minimum of 8 years of experience in enterprise architecture, integration, software development, or a related field. Exposure to native cloud platforms such as Azure, AWS, or GCP and experience with them at scale. Integration and Data Feeds: Support and maintain existing integrations and data feeds, ensuring seamless data flow and system integration. Azure iPaaS Development (or similar middleware product): Design, develop, and maintain applications using Azure Integration Platform as a Service (iPaaS) components such as Logic Apps, Azure Data Factory (ADF), Function Apps, and APIM. SQL and ETL Processes: Develop and optimize SQL queries and manage database systems for ETL processes. DevOps Management: Implement and manage DevOps pipelines using Git, Jenkins, Azure DevOps, and GitHub. Support and maintain Jenkins servers, Azure DevOps, and GitHub. 
Proven experience working with global teams and managing cross-functional projects. Excellent understanding of integration design principles and best practices. Strong leadership and communication skills. Ability to manage multiple projects and priorities simultaneously. Experience with integration tools and platforms such as API management, ESB, and ETL. Knowledge of security and regulatory requirements related to integration solutions. Strong problem-solving and analytical skills. Desired Experience and Skills Experience in managing large-scale integration projects. Familiarity with other cloud platforms and technologies. Knowledge of DevOps practices and tools. Experience with Agile methodologies. Certification in Azure or related technologies. Strong understanding of business processes and how they relate to integration solutions. Skyworks is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.
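The SQL and ETL duties above (developing and optimizing SQL queries for ETL processes) can be illustrated with a small self-contained sketch of a staged transform: raw rows land in a staging table, then one INSERT ... SELECT dedupes, casts, and normalizes them into a target table. sqlite3 stands in for the actual warehouse here, and every table and column name is invented for illustration:

```python
import sqlite3

# Staged ETL: load raw text rows, then transform into a clean target table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id TEXT, amount TEXT, country TEXT);
    CREATE TABLE dim_orders (order_id TEXT PRIMARY KEY, amount REAL, country TEXT);
""")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)", [
    ("o1", "10.50", "IN"),
    ("o1", "10.50", "IN"),      # duplicate to be collapsed
    ("o2", " 7.25 ", "sg"),     # needs trimming and case normalization
])
# Transform step: dedupe, cast, and normalize in one INSERT ... SELECT.
conn.execute("""
    INSERT INTO dim_orders (order_id, amount, country)
    SELECT order_id, CAST(TRIM(amount) AS REAL), UPPER(TRIM(country))
    FROM stg_orders
    GROUP BY order_id
""")
rows = conn.execute("SELECT * FROM dim_orders ORDER BY order_id").fetchall()
```

The same staging-then-transform shape carries over to warehouse SQL; only the dialect and the orchestration (e.g. an ADF pipeline activity) change.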
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Manager Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring that the end-to-end Oracle NetSuite landscape can seamlessly adapt to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients. *Responsibilities: 1. Lead a team of NetSuite developers, providing guidance, mentorship, and technical expertise to ensure high-quality deliverables and project success. 2. Define technical architecture and design standards for NetSuite solutions, ensuring scalability, performance, and maintainability. 3. Stay updated on emerging technologies and best practices in NetSuite development, driving innovation and continuous improvement within the team. 4. Manage end-to-end technical projects for NetSuite implementations, upgrades, and customizations, ensuring adherence to scope, budget, and timeline. 5. Develop project plans, resource allocation strategies, and risk mitigation plans, and monitor project progress to identify and address issues proactively. 6. Lead the development and customization of NetSuite solutions, including SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development. 7. Collaborate with functional consultants to translate business requirements into technical solutions, ensuring alignment with best practices and industry standards. 8. Serve as a technical liaison between the development team and clients, providing technical expertise, addressing concerns, and managing expectations. 9. 
Participate in client meetings and workshops to understand their technical requirements, propose solutions, and provide updates on project status. Mandatory Skill Sets: NetSuite *Preferred skill sets NetSuite Qualifications: 1. Bachelor’s degree in Computer Science, Information Technology, or a related field. 2. 2 years of hands-on experience in NetSuite development, customization, and integration. 3. Expertise in NetSuite SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development platforms. 4. NetSuite certifications such as SuiteFoundation, SuiteCloud Developer, or SuiteCommerce Advanced are highly desirable. *Years of experience required • Minimum 2 years of NetSuite expertise *Education Qualification • Graduate/Post Graduate Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor Degree, Master Degree Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Netsuite Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
Remote
FasCave IT Solutions Pvt. Ltd. is looking for a Data Engineer to join our team on a 10-12 months contract basis for one of the best healthcare companies as a client. If you have expertise in data modeling and engineering, this opportunity is for you! Position: Data Engineer Location: Remote Duration: 10-12 Months (Contract) Experience: 4-10 Years Shift Time: Australian Shift (5 AM TO 1 PM IST) Key Requirements: Strong SQL skills Snowflake Azure Data Factory (ADF) Power BI SSIS (Nice to have) 📩 How to Apply? Send your resume to hrd@fascave.com with the following details: - Years of Experience - Current CTC - Expected CTC - Earliest Joining Date
Posted 1 month ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Senior Data Engineer Experience: 10+ years Location: Bangalore | Gurgaon Notice Period: Immediate Joiners Only Job Description – Data Engineer (Azure, ADF, Databricks, PySpark, SCD, Unity Catalog, SQL) Required Skills & Qualifications: 6+ years of experience in Data Engineering with a focus on Azure technologies. Expertise in Azure Data Factory (ADF) & Azure Databricks for ETL/ELT workflows. Strong knowledge of Delta Tables & Unity Catalog for efficient data storage and management. Experience with Slowly Changing Dimensions (SCD2) implementation in Delta Lake. Proficiency in PySpark for large-scale data processing & transformation. Hands-on experience with SQL & performance tuning for data pipelines. Understanding of data governance, security, and compliance best practices in Azure. Knowledge of CI/CD, DevOps practices for data pipeline automation. Preferred Qualifications: Experience with Azure Synapse Analytics, Data Lakes, and Power BI integration . Knowledge of Kafka or Event Hub for real-time data ingestion. Certifications in Microsoft Azure (DP-203, DP-900) or Databricks are a plus.
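The listing above highlights Slowly Changing Dimension type 2 (SCD2) in Delta Lake, which in practice is usually a MERGE that closes the current row version and inserts a new one when a tracked attribute changes. A minimal pure-Python sketch of that merge logic follows; it is illustrative only, does not use the Delta Lake API, and the customer_id/city fields are made-up examples:

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key="customer_id", tracked=("city",), today=None):
    """Close current rows whose tracked attributes changed and append new versions.

    dim_rows: dimension rows as dicts with `is_current`, `valid_from`, `valid_to`.
    incoming: dicts carrying the latest attribute values per business key.
    """
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur and all(cur[a] == rec[a] for a in tracked):
            continue  # no change: keep the open row as-is
        if cur:       # change detected: close the currently open row
            cur["is_current"] = False
            cur["valid_to"] = today
        dim_rows.append({key: rec[key], **{a: rec[a] for a in tracked},
                         "valid_from": today, "valid_to": None,
                         "is_current": True})
    return dim_rows

dim = [{"customer_id": 1, "city": "Pune",
        "valid_from": "2023-01-01", "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"customer_id": 1, "city": "Bengaluru"}],
                 today="2024-06-01")
```

In Delta Lake the same effect is typically achieved with a `MERGE INTO` over the Delta table, matching on the business key plus `is_current`, which keeps the full change history queryable.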
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Professional, Software Development Engineering What does a great Software Development Engineer do? As Software Development Engineer your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for developing, unit testing, and integration tasks working within this highly visible-client focused web services application. Development efforts will also include feature enhancements, client implementations, and bug fixes as well as support of the production environment. What You Will Do Collaborate within a team environment in the development, testing, and support of software development project lifecycles. Develop web interfaces and underlying business logic. Prepare any necessary technical documentation. Track and report daily and weekly activities. Participate in code reviews and code remediation. Perform and develop proper unit tests and automation. Participate in a 24 hour on-call rotation to support previous releases of the product. Research problems discovered by QA or product support and develop solutions to the problems. Perform additional duties as determined by business needs and as directed by management. What You Will Need To Have Bachelor’s degree in Computer Science, Engineering or Information Technology, or equivalent experience. 
3-5 years of experience in developing scalable and secured J2EE applications. Excellent knowledge in Java based technologies (Core Java, JSP, AJAX, JSF, EJB, and Spring Framework), Oracle SQL/PLSQL and App servers like WebLogic, JBOSS. Excellent knowledge in SOAP & REST web service implementations. Knowledge in UNIX environment is preferred. Experience in JSF UI components (Oracle ADF & Rich Faces) technology is preferred. Good analytical, organizational, and problem-solving abilities. Good at prioritizing the tasks and commitment to complete them. Strong team player / customer service orientation. Demonstrated ability to work with both end users and technical staff. Ability to track progress against assigned tasks, report status, and proactively identifies issues. Demonstrate the ability to present information effectively in communications with peers and project management team. Highly Organized and Works well in a fast paced, fluid and dynamic environment. What Would Be Great To Have Experience working in a Scrum Development Team Banking and Financial Services experience Java Certifications. Thank You For Considering Employment With Fiserv. Please Apply using your legal name Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our Commitment To Diversity And Inclusion Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note To Agencies Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning About Fake Job Posts Please be aware of fraudulent job postings that are not affiliated with Fiserv. 
Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
Posted 1 month ago
5.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In This Role, Your Responsibilities May Include Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. 
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results Preferred Education Master's Degree Required Technical And Professional Expertise We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate will have Databricks, Data Lake, and Python programming skills, along with experience deploying to Databricks. Familiarity with Azure Data Factory Preferred Technical And Professional Experience Good communication skills. 3+ years of experience with ADF/Databricks/Data Lake. Ability to communicate results to technical and non-technical audiences
Posted 1 month ago
5.0 years
0 Lacs
Hyderābād
On-site
Job Information Date Opened 06/19/2025 Job Type Full time Work Experience 5+ years Industry IT Services City Hyderabad State/Province Telangana Country India Zip/Postal Code 500032 Job Description As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance. Key Responsibilities: 1. Governance Strategy & Stakeholder Alignment Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs. 2. Microsoft Purview Administration & Implementation Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure. Optimize Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake. Schedule scans, set classification jobs, and maintain collection hierarchies. 3. Metadata & Lineage Management Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis. 4. Data Classification & Security Governance Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers. 5. Data Quality & Policy Management Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management. 6. 
Business Glossary & Stewardship Maintain business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality. 7. Automation & Integration Automate governance processes using PowerShell, Azure Functions, Logic Apps. Create pipelines for ingestion, lineage, glossary updates, tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc. 8. Monitoring, Auditing & Compliance Set up dashboards for audit logs, compliance reporting, metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation. Requirements 7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs. Skilled in bridging technical governance with business and compliance goals. Benefits 1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. 2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. 3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. 
Disability Insurance: Offers financial support in case of disability. 4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. 5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. 6. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
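The classification duties described above (defining rules and sensitivity labels for PII and similar data) reduce to tagging columns by the patterns their values match. A hedged pure-Python sketch of rule-based classification in that spirit; the regex rules and label names are toy assumptions and far simpler than real Purview classifications:

```python
import re

# Illustrative sensitivity rules: label -> compiled pattern.
# Real Purview classifications are far richer; these are toy examples.
RULES = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PAN_CARD": re.compile(r"\b[A-Z]{5}[0-9]{4}[A-Z]\b"),  # Indian PAN format
    "PHONE_IN": re.compile(r"\b[6-9]\d{9}\b"),             # Indian mobile number
}

def classify(value):
    """Return the set of sensitivity labels matched by one field value."""
    return {label for label, pat in RULES.items() if pat.search(str(value))}

def scan_columns(sample_rows):
    """Aggregate matched labels per column over a sample of rows (dicts)."""
    labels = {}
    for row in sample_rows:
        for col, val in row.items():
            labels.setdefault(col, set()).update(classify(val))
    return labels

sample = [{"contact": "a.user@example.com", "pan": "ABCDE1234F", "note": "ok"}]
found = scan_columns(sample)
```

In Purview the analogous step is a scan with built-in or custom classification rules; the output per column would then drive sensitivity labels and downstream DLP policy, rather than being returned as a dict.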
Posted 1 month ago
7.0 years
3 - 6 Lacs
Hyderābād
On-site
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance. Key Responsibilities: 1. Governance Strategy & Stakeholder Alignment Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs. 2. Microsoft Purview Administration & Implementation Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure. Optimize Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake. Schedule scans, set classification jobs, and maintain collection hierarchies. 3. Metadata & Lineage Management Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis. 4. Data Classification & Security Governance Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers. 5. Data Quality & Policy Management Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management. 6. Business Glossary & Stewardship Maintain business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality. 7. 
Automation & Integration Automate governance processes using PowerShell, Azure Functions, Logic Apps. Create pipelines for ingestion, lineage, glossary updates, tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc. 8. Monitoring, Auditing & Compliance Set up dashboards for audit logs, compliance reporting, metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation. Requirements 7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs.Skilled in bridging technical governance with business and compliance goals. Benefits Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. 2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. 3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability. 4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. 
Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. 5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. 6. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
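The data classification duties listed earlier in this posting (defining rules and sensitivity labels for PII, PCI, and PHI) are normally configured inside Microsoft Purview itself, but the underlying idea can be sketched in a few lines of Python. The regex patterns, label names, and sampling threshold below are illustrative assumptions, not Purview's built-in classifiers:

```python
import re

# Illustrative sensitivity rules; a real deployment would use Purview's
# built-in or custom classifiers rather than hand-rolled regexes.
CLASSIFICATION_RULES = {
    "PII.SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PII.Email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PCI.CardNumber": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify_value(value: str) -> list[str]:
    """Return every sensitivity label whose pattern matches the value."""
    return [label for label, pattern in CLASSIFICATION_RULES.items()
            if pattern.search(value)]

def classify_column(samples: list[str], threshold: float = 0.6) -> list[str]:
    """Label a column when at least `threshold` of sampled values match a rule."""
    counts: dict[str, int] = {}
    for value in samples:
        for label in classify_value(value):
            counts[label] = counts.get(label, 0) + 1
    return sorted(label for label, n in counts.items()
                  if n / len(samples) >= threshold)
```

For example, `classify_column(["123-45-6789", "987-65-4321", "n/a"])` labels the column `PII.SSN` because two of the three sampled values match, which is the same sample-and-threshold approach scanners use to avoid labeling a column on a single stray match.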
Posted 1 month ago
15.0 years
3 - 8 Lacs
Hyderābād
On-site
What You Will Do: As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.

2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.

3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.

4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.

5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

6. Data Classification, Access & Policy Management
- Define and enforce rules for data classification, access, retention, and sharing.
- Align with GDPR, HIPAA, CCPA, and SOX regulations.
- Use Microsoft Purview and MIP for policy enforcement automation.

7. Data Quality Governance
- Define KPIs, validation rules, and remediation workflows for enterprise data quality.
- Design scalable quality frameworks integrated into data pipelines.

8. Compliance, Risk, and Audit Oversight
- Identify risks and define standards for compliance reporting and audits.
- Configure usage analytics, alerts, and dashboards for policy enforcement.

9. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Requirements
- 15+ years in data governance and management.
- Expertise in Microsoft Purview, Informatica, and related platforms.
- Experience leading end-to-end governance initiatives.
- Strong understanding of metadata, lineage, policy management, and compliance regulations.
- Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture.
- Familiar with Agile methodologies and stakeholder communication.

Benefits
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
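The end-to-end lineage requirement in this posting (ADF → Synapse → Power BI) exists largely to answer impact-analysis questions: if a source asset changes, which downstream assets are affected? The traversal can be sketched over a hand-built lineage graph; the asset names and the dict representation are invented for illustration, since Purview exposes lineage through its Apache Atlas-style REST API rather than a structure like this:

```python
from collections import deque

# Toy lineage edges, producer -> consumers. Asset names are illustrative only.
LINEAGE = {
    "adf://pipeline/ingest_sales": ["synapse://dw/stg_sales"],
    "synapse://dw/stg_sales": ["synapse://dw/fact_sales"],
    "synapse://dw/fact_sales": ["powerbi://reports/sales_dashboard",
                                "powerbi://reports/revenue_kpis"],
}

def downstream_impact(asset: str) -> set[str]:
    """Breadth-first walk of the lineage edges to find every affected asset."""
    impacted: set[str] = set()
    queue = deque([asset])
    while queue:
        for child in LINEAGE.get(queue.popleft(), []):
            if child not in impacted:
                impacted.add(child)
                queue.append(child)
    return impacted
```

Here `downstream_impact("synapse://dw/stg_sales")` surfaces the fact table and both Power BI reports, which is exactly the question a steward asks before approving a schema change to a staging asset.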
Posted 1 month ago
0 years
6 - 10 Lacs
Bengaluru
On-site
Analyze, design, develop, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

Preferred Qualifications: Oracle Applications Lab (OAL) has a central role within Oracle. Its role is to work with Product Development and Oracle's internal business to deliver Oracle products for Oracle to use internally. OAL implements Oracle applications, databases, and middleware; supports Oracle applications for Oracle internally; and configures Oracle applications to meet Oracle's specific needs. OAL also provides a showcase for Oracle's products.

The role will involve:
- Working as part of a global team to implement and support new business applications for HR and Payroll
- Debugging and solving sophisticated problems, working closely with Oracle Product Development and other groups to implement solutions
- Developing and implementing product extensions and customizations
- Testing new releases
- Providing critical production support

Your skills should include:
- Experience in designing and supporting Oracle E-Business Suite and Fusion applications, preferably Oracle HRMS/Fusion HCM
- Strong Oracle technical skills: SQL, PL/SQL, Java, XML, ADF, SOA, etc.
- Communicating confidently with peers and management within technical and business teams

Detailed Description and Job Requirements: Work with Oracle's world-class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, help analyze existing complex programs and formulate logic for new complex internal systems. Prepare flowcharts, perform coding, and test/debug programs. Develop conversion and system implementation plans. Recommend changes to development, maintenance, and system standards. Job duties are varied and complex, requiring independent judgment. May have a project lead role. BS or equivalent experience in programming on enterprise or department servers or systems.
Posted 1 month ago
3.0 - 5.0 years
7 - 10 Lacs
Chennai
On-site
Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key, and whether you are an expert in a legacy software system or are fluent in a variety of coding languages, you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.

Full-time | Entry, Mid, Senior | Yes (occasional), Minimal (if any)

Requisition ID: R-10363786 | Date posted: 06/20/2025 | End Date: 06/26/2025 | City: Chennai | State/Region: Tamil Nadu | Country: India | Location Type: Onsite

Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Software Development Engineering

What does a great Software Development Engineer do? As a Software Development Engineer, your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for development, unit testing, and integration tasks within this highly visible, client-focused web services application. Development efforts will also include feature enhancements, client implementations, and bug fixes, as well as support of the production environment.

What you will do:
- Collaborate within a team environment in the development, testing, and support of software development project lifecycles.
- Develop web interfaces and underlying business logic.
- Prepare any necessary technical documentation.
- Track and report daily and weekly activities.
- Participate in code reviews and code remediation.
- Perform and develop proper unit tests and automation.
- Participate in a 24-hour on-call rotation to support previous releases of the product.
- Research problems discovered by QA or product support and develop solutions.
- Perform additional duties as determined by business needs and as directed by management.

What you will need to have:
- Bachelor's degree in Computer Science, Engineering, or Information Technology, or equivalent experience.
- 3-5 years of experience in developing scalable and secure J2EE applications.
- Excellent knowledge of Java-based technologies (Core Java, JSP, AJAX, JSF, EJB, and the Spring Framework), Oracle SQL/PL/SQL, and app servers such as WebLogic and JBoss.
- Excellent knowledge of SOAP and REST web service implementations.
- Knowledge of the UNIX environment is preferred.
- Experience with JSF UI component technology (Oracle ADF & RichFaces) is preferred.
- Good analytical, organizational, and problem-solving abilities.
- Good at prioritizing tasks and committed to completing them.
- Strong team player with a customer service orientation.
- Demonstrated ability to work with both end users and technical staff.
- Ability to track progress against assigned tasks, report status, and proactively identify issues.
- Demonstrated ability to present information effectively in communications with peers and the project management team.
- Highly organized; works well in a fast-paced, fluid, and dynamic environment.

What would be great to have:
- Experience working in a Scrum development team
- Banking and financial services experience
- Java certifications

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
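The "perform and develop proper unit tests" responsibility in this posting would be done in JUnit on the Java stack named above; the shape of a focused unit test is language-independent, though, and can be sketched with Python's built-in unittest. The fee-calculation rule tested here is invented purely for illustration, not a Fiserv product rule:

```python
import unittest

def transaction_fee(amount_cents: int) -> int:
    """Illustrative business rule: 2% fee with a 30-cent floor.
    The rule is made up for this example."""
    if amount_cents < 0:
        raise ValueError("amount must be non-negative")
    return max(30, amount_cents * 2 // 100)

class TransactionFeeTest(unittest.TestCase):
    # One behavior per test: happy path, boundary, and error handling.
    def test_percentage_applies_above_minimum(self):
        self.assertEqual(transaction_fee(10_000), 200)  # 2% of $100.00

    def test_minimum_fee_floor(self):
        self.assertEqual(transaction_fee(100), 30)  # 2% would be only 2 cents

    def test_negative_amount_rejected(self):
        with self.assertRaises(ValueError):
            transaction_fee(-1)
```

Run with `python -m unittest`. The point of the structure, in any language, is that each test pins down one behavior (normal case, boundary, and invalid input), so a regression in a bug fix or client implementation fails exactly one named test.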
Posted 1 month ago