
1432 ADF Jobs - Page 23

Set up a Job Alert
JobPe aggregates job listings for easy access; you apply directly on the original job portal.

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future. We are looking to hire Project Management Professionals in the following areas:

Technical skills
- 15+ years of working experience handling end-to-end DWH projects.
- Experience handling ETL migration/visualization projects involving technologies such as AWS Glue/Redshift, Power BI/Tableau, and Azure ADF/Databricks.
- Lead technical design and architecture discussions across cross-functional teams.
- Oversee software requirements (including design, architecture, and testing).
- Manage through agile methodologies such as Scrum.
- Decipher the technical needs of other departments within the organization and translate them across stakeholder groups.

Leadership skills
- Act as a communications liaison between technical and non-technical audiences.
- Develop and maintain productive internal relationships.
- Facilitate cross-collaboration and understanding between IT and other departments.
- Generate targeted reports for different internal and/or external audiences.
- Stay current on the latest news, information, and trends in program management and the organization's industry.

Business responsibilities
- Organize and track jobs, clarify project scopes, proactively manage risks, handle project escalations, ruthlessly prioritize tasks and dependencies, and problem-solve.
- Meet specific business objectives and metrics.
- Support the roadmap planning process.
- Develop strategies and implement tactics to follow through on those strategies.
- Solve complex business problems within allocated timelines and budget.
- Represent company management to technical teams and vice versa.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Description: Our client is an EU subsidiary of a global financial bank working in multiple markets and asset classes. The bank's data store has been transformed into a data warehouse (DWH), which is the central source for regulatory reporting. It is also intended to be the core data integration platform, providing data not only for regulatory reporting but also for risk modelling, portfolio analysis, ad hoc analysis and reporting (Finance, Risk, other), MI reporting, data quality management, etc. Owing to the high volume of regulatory requirements, many regulatory projects are in progress to reflect regulatory requirements in existing regulatory reports and to develop new regulatory reports on MDS. Examples are IFRS9, AnaCredit, IRRBB, the new Deposit Guarantee Schemes Directive (DGSD), the Bank Data Retrieval Portal (BDRP), and the Fundamental Review of the Trading Book (FRTB).

The DWH/ETL Tester will work closely with the development team to design and build interfaces and to integrate data from a variety of internal and external data sources into the new enterprise data warehouse environment. The ETL Tester will be primarily responsible for testing the enterprise data warehouse using automation, within industry-recognized ETL standards, architecture, and best practices.

Responsibilities: Testing the bank's data warehouse system changes (user stories), supporting IT integration testing in TST, and supporting business stakeholders with user acceptance testing. This is a hands-on position: you will be required to write and execute test cases and build test automation where applicable.

Overall purpose of the job:
- Test the MDS data warehouse system
- Validate regulatory reports
- Support IT and business stakeholders during the UAT phase
- Contribute to improvement of testing and development processes
- Work as part of a cross-functional team and take ownership of tasks
- Contribute to testing deliverables
- Ensure the implementation of test standards and best practices for the agile model, and contribute to their development
- Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration
- Deal with external stakeholders/vendors
- Identify risks/issues and present associated mitigating actions, taking into account the criticality of the underlying business domain
- Contribute to continuous improvement of standard testing processes

Additional responsibilities include working closely with systems analysts and application developers, using functional design documentation and technical specifications to facilitate the creation and execution of manual and automated test scripts, performing data analysis and creating test data, tracking and helping resolve defects, and ensuring that all testing is conducted and documented in adherence with the bank's standards.

Mandatory Skills Description: Must have experience/expertise in: Tester, Test Automation, Data Warehouse, Banking

Technical:
- At least 5 years of testing experience, of which at least 2 years in the finance industry, with good knowledge of data warehouse and RDBMS concepts.
- Strong SQL scripting knowledge and hands-on experience with ETL and databases.
- Expertise in new-generation cloud-based data warehouse solutions: ADF, Snowflake, GCP, etc.
- Hands-on expertise in writing complex SQL using multiple JOINs and complex functions to test various transformations and ETL requirements (see the sketch below).
- Knowledge of and experience in creating test automation for a database and ETL testing regression suite.
- Automation using Selenium with Python (or JavaScript), Python scripts, shell scripts.
- Knowledge of framework design and REST API testing of databases using Python.
- Experience using the Atlassian tool set, Azure DevOps, and code and version management: Git, Bitbucket, Azure Repos, etc.
- Help and provide input for the creation of a test plan that addresses the needs of cloud-based ETL pipelines.

Non-Technical:
- Able to work in an agile environment.
- Experience working on high-priority projects (high pressure on delivery).
- Some flexibility outside 9-5 working hours (Netherlands time zone).
- Able to work in a demanding environment with a pragmatic, can-do attitude.
- Able to work independently as well as collaborate across the organization.
- Highly developed problem-solving skills with minimal supervision.
- Able to adapt easily to new circumstances, technologies, and procedures.
- Stress-resistant and constructive, whatever the context.
- Able to align with existing standards and act with attention to detail.
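For illustration, here is a minimal sketch of the kind of source-to-target reconciliation test this role calls for. It assumes a pytest harness and a pyodbc connection; the DSN, schema, and table names are all invented placeholders, not the client's actual environment.

```python
# Hypothetical source-to-target reconciliation test for an ETL load.
import pyodbc
import pytest

# Any row returned is either missing from the target or has a mismatched
# amount after the ETL run. Schema and column names are invented.
RECONCILIATION_SQL = """
SELECT s.loan_id,
       s.outstanding_amount AS source_amount,
       t.outstanding_amount AS target_amount
FROM   staging.loans s
LEFT JOIN dwh.fact_loans t
       ON  t.loan_id = s.loan_id
       AND t.snapshot_date = s.snapshot_date
WHERE  t.loan_id IS NULL
   OR  ABS(s.outstanding_amount - t.outstanding_amount) > 0.01
"""

@pytest.fixture(scope="session")
def connection():
    # Placeholder DSN; a real suite would read this from test configuration.
    return pyodbc.connect("DSN=EDW_TEST;UID=tester;PWD=<secret>")

def test_loan_amounts_reconcile(connection):
    cursor = connection.cursor()
    cursor.execute(RECONCILIATION_SQL)
    mismatches = cursor.fetchall()
    assert not mismatches, f"{len(mismatches)} rows failed reconciliation"
```

Tests of this shape drop straight into a regression suite and can run in a CI pipeline after each ETL load.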

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Job Family: Data Science & Analysis (India)
Travel Required: None
Clearance Required: None

What You Will Do
- Design, develop, and maintain robust, scalable, and efficient data pipelines and ETL/ELT processes.
- Lead and execute data engineering projects from inception to completion, ensuring timely delivery and high quality.
- Build and optimize data architectures for operational and analytical purposes.
- Collaborate with cross-functional teams to gather and define data requirements.
- Implement data quality, data governance, and data security practices.
- Manage and optimize cloud-based data platforms (Azure/AWS).
- Develop and maintain Python/PySpark libraries for data ingestion, processing, and integration with both internal and external data sources.
- Design and optimize scalable data pipelines using Azure Data Factory and Spark (Databricks); see the sketch below.
- Work with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Develop frameworks for data ingestion, transformation, and validation.
- Mentor junior data engineers and guide best practices in data engineering.
- Evaluate and integrate new technologies and tools to improve data infrastructure.
- Ensure compliance with data privacy regulations (HIPAA, etc.).
- Monitor performance and troubleshoot issues across the data ecosystem.
- Automate deployment of data pipelines using GitHub Actions / Azure DevOps.

What You Will Need
- Bachelor's or Master's degree in Computer Science, Information Systems, Statistics, Math, Engineering, or a related discipline.
- Minimum 5+ years of solid hands-on experience in data engineering and cloud services.
- Extensive working experience with advanced SQL and a deep understanding of SQL.
- Good experience with Azure Data Factory (ADF), Databricks, Python, and PySpark.
- Good experience with modern data storage concepts: data lake, lakehouse.
- Experience with other cloud services (AWS) and data processing technologies is an added advantage.
- Ability to enhance and develop ETL processes, and resolve defects in them, using cloud services.
- Experience handling large volumes (multiple terabytes) of incoming data from clients and third-party sources in various formats such as text, CSV, EDI X12 files, and Access databases.
- Experience with software development methodologies (Agile, Waterfall) and version control tools.
- Highly motivated, strong problem solver, self-starter, and fast learner with demonstrated analytic and quantitative skills.
- Good communication skills.

What Would Be Nice To Have
- AWS ETL platform: Glue, S3
- One or more programming languages such as Java or .NET
- Experience in the US healthcare domain and insurance claim processing

What We Offer
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.

About Guidehouse
Guidehouse is an Equal Opportunity Employer – Protected Veterans, Individuals with Disabilities, or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinances of Los Angeles and San Francisco.
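As a rough illustration of the ADF-plus-Databricks pattern this posting describes, here is a minimal PySpark ingestion sketch. The storage account, container paths, and column names are hypothetical; in practice an ADF copy activity would land the raw files and trigger this notebook or job.

```python
# Minimal PySpark ingestion sketch (paths and column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_ingest").getOrCreate()

# Read raw delimited files landed in the data lake by an upstream ADF copy activity.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/claims/2024/"))

# Basic cleansing: typed columns, de-duplication, and a load timestamp.
clean = (raw
         .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
         .dropDuplicates(["claim_id"])
         .withColumn("load_ts", F.current_timestamp()))

# Write to the curated zone partitioned by service date for efficient pruning.
(clean.write
 .mode("overwrite")
 .partitionBy("service_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/claims/"))
```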
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or guidehouse@myworkday.com. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse’s Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant’s dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.

Posted 4 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Azure Certified AI Engineer / Data Scientist
Experience: 4–6 Years
Engagement: Contract to Hire (C2H)
Location: Pune (Onsite – 5 Days a Week)
Company: Optimum Data Analytics (ODA)

About Optimum Data Analytics (ODA)
Optimum Data Analytics is a fast-growing data and AI consulting firm delivering innovative solutions to enterprise clients across industries. We specialize in data engineering, machine learning, and AI/GenAI-based platforms on cloud ecosystems.

Role Overview
We are looking for an Azure Certified AI Engineer or Data Scientist with 4–6 years of experience to join our Pune office on a full-time, onsite C2H engagement. The ideal candidate should be hands-on with building and deploying AI/ML solutions using Azure cloud services, and must hold an active Azure AI Engineer Associate or Azure Data Scientist Associate certification.

Key Responsibilities
- Design and deploy AI/ML models using Azure AI/ML Studio, Azure Machine Learning, and Azure Cognitive Services.
- Implement and manage data pipelines, model training workflows, and the ML lifecycle in the Azure ecosystem.
- Work with business stakeholders to gather requirements, analyze data, and deliver predictive insights.
- Collaborate with data engineers and product teams to deliver scalable and production-ready AI solutions.
- Ensure model monitoring, versioning, governance, and responsible AI practices are in place.
- Contribute to solution documentation and technical architecture.

Required Skills & Qualifications
- 4–6 years of hands-on experience in AI/ML, data science, or machine learning engineering.
- Mandatory certification: Microsoft Azure AI Engineer Associate OR Microsoft Azure Data Scientist Associate.
- Strong knowledge of Azure services: Azure Machine Learning, Cognitive Services, Azure Functions, Data Factory, and Azure Storage.
- Proficient in Python, with experience using ML libraries such as scikit-learn, TensorFlow, PyTorch, or similar (see the sketch below).
- Solid understanding of the data science lifecycle, model evaluation, and performance optimization.
- Experience with version control tools like Git and deployment through CI/CD pipelines.
- Excellent problem-solving and communication skills.

Good To Have
- Familiarity with LLMs, prompt engineering, or GenAI tools (Azure OpenAI, Hugging Face).
- Experience with Power BI or other data visualization tools.
- Exposure to MLOps tools and practices.

Skills: machine learning, Azure, scikit-learn, OpenAI, PyTorch, Azure Machine Learning, Cognitive Services, Git, Azure AI Engineer Associate, Python, data science, TensorFlow, communication, Azure Functions, Azure Storage, ADF, Data Factory, artificial intelligence, Azure Data Scientist Associate, problem-solving, CI/CD pipelines, MLOps
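As a rough illustration of the train-and-evaluate loop this role covers, here is a minimal scikit-learn sketch on synthetic data. It stands in for whatever real training data, feature engineering, and Azure ML registration or deployment steps an actual project would add.

```python
# Minimal train/evaluate sketch (synthetic data for illustration only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real labeled dataset.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on held-out data before any registration/deployment step in Azure ML.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out ROC AUC: {auc:.3f}")
```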

Posted 4 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Role Overview
As a Data Governance Developer at Kanerika, you will be responsible for developing and managing robust metadata, lineage, and compliance frameworks using Microsoft Purview and other leading tools. You'll work closely with engineering and business teams to ensure data integrity, regulatory compliance, and operational transparency.

Key Responsibilities
- Set up and manage Microsoft Purview: accounts, collections, RBAC, and policies.
- Integrate Purview with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
- Schedule and monitor metadata scanning, classification, and lineage tracking jobs.
- Build ingestion workflows for technical, business, and operational metadata.
- Tag, enrich, and organize assets with glossary terms and metadata.
- Automate lineage, glossary, and scanning processes via REST APIs, PowerShell, ADF, and Logic Apps.
- Design and enforce classification rules for PII, PCI, and PHI (see the sketch below).
- Collaborate with domain owners on glossary and metadata quality governance.
- Generate compliance dashboards and lineage maps in Power BI.

Tools & Technologies
- Governance platforms: Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Integration tools: Azure Data Factory, dbt, Talend
- Automation & scripting: PowerShell, Azure Functions, Logic Apps, REST APIs
- Compliance areas in Purview: Sensitivity Labels, Policy Management, Auto-labeling, Data Loss Prevention (DLP), Insider Risk Management, Records Management, Compliance Manager, Lifecycle Management, eDiscovery, Audit, DSPM, Information Barriers, Unified Catalog

Required Qualifications
- 4–6 years of experience in Data Governance / Data Management.
- Hands-on experience with Microsoft Purview, especially lineage and classification workflows.
- Strong understanding of metadata management, glossary governance, and data classification.
- Familiarity with Azure Data Factory, dbt, and Talend.
- Working knowledge of data compliance regulations: GDPR, CCPA, SOX, HIPAA.
- Strong communication skills to collaborate across technical and non-technical teams.
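As a toy illustration of rule-based classification of the kind a scan rule set encodes, here is a sketch with invented regex patterns and labels. This is not Purview's API or its built-in classifiers; it only shows the pattern-matching idea behind custom classification rules.

```python
# Toy classification-rule sketch; patterns and labels are illustrative only.
import re

CLASSIFICATION_RULES = {
    "PII.Email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PII.IndiaPAN": re.compile(r"\b[A-Z]{5}[0-9]{4}[A-Z]\b"),
    "PCI.CardNumber": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(sample_values):
    """Return the set of labels whose pattern matches any sampled value."""
    labels = set()
    for value in sample_values:
        for label, pattern in CLASSIFICATION_RULES.items():
            if pattern.search(str(value)):
                labels.add(label)
    return labels

# Example: a scanner samples column values and tags the asset with matches.
print(classify(["jane.doe@example.com", "ABCPD1234F"]))
# -> a set containing 'PII.Email' and 'PII.IndiaPAN'
```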

Posted 4 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: DataOps Engineer
Location: Hyderabad
Experience: 5-10 Years

Required Skills
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5-10 years of experience in Data Engineering or DataOps roles.
- Strong hands-on experience with ADF, ADLS, Snowflake, and Azure DevOps or similar CI/CD platforms.
- Proficient in SQL and scripting languages such as Python.
- Solid understanding of ETL/ELT concepts and data warehousing.
- Experience with source control (e.g., Git) and infrastructure as code (e.g., ARM templates, Terraform).
- Knowledge of data security best practices in cloud environments.

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Description
Bachelor's degree in Computer Science, Software Engineering, or a related field. 5+ years of experience as a Full Stack Developer. Strong proficiency in ReactJS, Next.js, C#, .NET, and Azure cloud services. Understanding of AngularJS and a willingness to learn new technologies. Experience with Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door. Excellent problem-solving, analytical, and debugging skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team.

Roles & Responsibilities
- Development: Contribute to the development of robust and scalable applications using our primary tech stack, including ReactJS, Next.js, C#, .NET, Azure Functions, and Azure CosmosDB.
- Cross-Platform Compatibility: Demonstrate a solid understanding of AngularJS and be open to learning newer technologies as needed.
- Cloud Expertise: Possess a deep understanding of Azure cloud services, including Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door.
- Problem-Solving: Identify and resolve technical challenges effectively, leveraging your problem-solving skills and expertise.
- Collaboration: Work collaboratively with cross-functional teams to deliver high-quality solutions that meet business objectives.

Posted 4 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

TCS Hiring for Azure FinOps
Experience: 8 to 10 Years Only
Job Location: Kolkata/Pune

Required Technical Skill Set:
As a Cloud FinOps consultant, you are responsible for developing and implementing a robust program for cloud cost management that includes a service-based cost allocation and classification strategy, tracking and management of cloud cost, cloud services rate setting, and defining consumption and showback/chargeback reports from both the provider and consumer perspective.
- Experienced Cloud FinOps practitioner who can work primarily on Azure, along with AWS platforms, for cloud optimization and cost savings (certification as a FinOps Practitioner is an added advantage).
- Create a cloud cost optimization framework and governance mechanism.
- Takes ownership of cost analysis, reviewing recommendations, creating budget alerts, purchasing reservations/savings plans, tagging, anomaly detection, and forecasting spend.
- Experienced in rightsizing compute services, identifying unused resources, and AHUB implementation.
- Define and set up the cloud spend governance process and cloud spend tracking mechanism.
- Experience driving deep architectural discussions to help customers ensure they are making the most cost-efficient cloud usage choices.
- Interact with vendors on Enterprise Agreements, MCA, discounts, and various other optimization opportunities.
- Create dashboards for visualizing monthly and yearly cost with various filters.
- Drive close working relationships with IT teams, finance teams, architects, and operations.
- Knowledge of IaaS and PaaS services and cloud technologies, for example Databricks, AKS, ADF, Log Analytics, load balancing, and disaster recovery.
- Knowledge of app and data architectures and cloud-native patterns for development.

Kind Regards,
Priyankha M

Posted 4 weeks ago

Apply

0 years

25 Lacs

India

On-site

Work Location: Kochi/Trivandrum
Experience: 10-15+ Years
▪ Expertise in Azure services including App Services, Functions, DevOps pipelines, and ADF
▪ Expert-level knowledge of the MuleSoft Anypoint Platform, API lifecycle management, and enterprise integration
▪ Proven skills in Java-based web applications with RDBMS or NoSQL backends
▪ Proven skills in Python and JavaScript for backend and full-stack solutions
▪ Deep understanding of object-oriented programming and design patterns
▪ Solid hands-on experience with RESTful API development and microservices architecture
▪ Familiarity with unit testing frameworks (e.g., TestNG) and integration testing in Java
▪ Experience with code review processes, test coverage validation, and CI/CD pipelines
▪ Proficiency in Git, SVN, and other version control systems
▪ Comfortable working with static code analysis tools and agile methodologies
▪ Good knowledge of JIRA, Confluence, and project collaboration tools
▪ Strong communication skills and ability to mentor team members
▪ Ability to prepare detailed technical documentation and design specifications
▪ Passion for clean code, best practices, and scalable architecture
▪ Nice to have: experience with identity providers like Auth0, Keycloak, IdentityServer
▪ Take ownership of tasks and user stories; provide accurate estimations
▪ Provide technically sound advice and decisions on how to develop functionality that implements the client's business logic in software systems
▪ Lead sprint activities, including task management and code reviews
▪ Design and implement secure, scalable, and maintainable solutions
▪ Conduct technical discovery and performance analysis of critical systems
▪ Write low-level design and as-built documents
▪ Translate business requirements into technical specifications and working code
▪ Develop and maintain unit, integration, and regression test cases
▪ Ensure adherence to TDD and promote high code coverage (see the sketch below)
▪ Integrate multiple data sources and systems with dissimilar structures
▪ Contribute to overall system design with a focus on security and resilience
▪ Use static code analysis tools and set up CI/CD pipelines
▪ Participate in Agile ceremonies: grooming, planning, stand-ups, and retrospectives
▪ Collaborate with technical architects on solution design
▪ Mentor junior team members on coding standards, sprint tasks, and technology
▪ Troubleshoot, test, and optimize core software and databases
▪ Stay updated with industry trends and promote best practices within the team
▪ Identify challenges, initiate PoCs, and perform feasibility studies
Participate in the full product development cycle, including brainstorming, release planning and estimation, implementing and iterating on code, coordinating with internal and external clients, internal code and design reviews, MVP and production releases, quality assurance, and product support. Be highly effective and thrive in a dynamic environment; be comfortable with proactive outward communication and technical leadership, and positive about accepting challenges. Adhere to ISMS policies and procedures.
Job Type: Full-time
Pay: From ₹2,500,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Work Location: In person
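For illustration of the TDD style the posting emphasizes (unit tests written against small, pure functions, with high coverage), here is a minimal pytest sketch. The EMI function and its expected values are invented for the example.

```python
# Minimal pytest/TDD sketch; function and test values are invented.
import pytest

def calculate_emi(principal: float, annual_rate: float, months: int) -> float:
    """Standard EMI formula: P*r*(1+r)^n / ((1+r)^n - 1), with monthly rate r."""
    r = annual_rate / 12 / 100
    if r == 0:
        return round(principal / months, 2)
    factor = (1 + r) ** months
    return round(principal * r * factor / (factor - 1), 2)

def test_zero_interest_divides_evenly():
    assert calculate_emi(12000, 0, 12) == 1000.00

def test_known_emi_value():
    # 10 lakh at 9% for 20 years is roughly 8997 per month.
    assert calculate_emi(1_000_000, 9, 240) == pytest.approx(8997, abs=1)

@pytest.mark.parametrize("months", [12, 60, 240])
def test_emi_positive(months):
    assert calculate_emi(500_000, 8.5, months) > 0
```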

Posted 4 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greetings from TCS!
TCS is hiring for Snowflake Tech Architect / Tech Lead
Experience: 10+ years
Location: Chennai/Bangalore/Mumbai

Required Technical Skill Set
- 10 years of total experience, with at least 3 years of expertise in cloud data warehouse technologies on Snowflake and AWS, Azure, or GCP.
- At least one end-to-end Snowflake implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage).
- Significant experience with data migrations and with the design and development of Operational Data Stores, Enterprise Data Warehouses, and Data Marts.
- Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement (see the sketch below).
- Experience with cloud ETL and ELT in one of the tools like dbt, Glue, ADF, or Matillion, or any other ELT tool, and exposure to the big data ecosystem (Hadoop).
- Expertise in at least one of the traditional data warehouse solutions on Oracle, Teradata, or Microsoft SQL Server.
- Excellent communication skills to liaise with business and IT stakeholders.
- Expertise in planning the execution of a project and effort estimation.
- Understanding of Data Vault, data mesh, and data fabric architecture patterns.
- Exposure to Agile ways of working.

Must-Have
- Experience with cloud services like S3/Blob/GCS, Lambda, Glue/ADF, and Apache Airflow.
- Experience in, or understanding of, the Banking and Financial Services business domain.

Good-to-Have
- Experience in coding languages like Python and PySpark would be an added advantage.
- Experience in DevOps, CI/CD, and GitHub is a big plus.

Responsibility of / Expectations from the Role
1. Provide technical pre-sales enablement on data-on-cloud aspects covering data architecture, data engineering, data modelling, data consumption, and data governance, focusing on Snowflake.
2. Expert-level knowledge of Snowflake data engineering, performance, consumption, security, governance, and admin aspects.
3. Work with cross-functional teams in an onsite/offshore setup, discussing and solving technical problems with various stakeholders, including customer teams.
4. Create technical proposals and respond to large-scale RFPs.
5. Discuss existing solutions, design/optimize the solution, and prepare execution plans for development, deployment, and enabling end users to utilize the data platform.
6. The role demands excellent oral and written communication skills to organize workshops and meetings with account teams, account leadership, and senior stakeholders from the client, including CXO level.
7. Adept at creating POVs and conducting PoCs.
8. Liaise with technology partners like Snowflake, Matillion, dbt, etc.
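As a small illustration of hands-on Snowflake SQL work, here is a sketch using the snowflake-connector-python package. The account locator, credentials, warehouse, and table are placeholders; real credentials would come from a secret store, not code.

```python
# Hedged Snowflake query sketch; all connection values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="ETL_SVC",
    password="<from-secret-store>",
    warehouse="ANALYTICS_WH",
    database="EDW",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # Typical quality/lineage probe: row counts per load batch in a fact table.
    cur.execute("""
        SELECT load_batch_id, COUNT(*) AS row_count
        FROM fact_transactions
        GROUP BY load_batch_id
        ORDER BY load_batch_id DESC
        LIMIT 10
    """)
    for batch_id, row_count in cur.fetchall():
        print(batch_id, row_count)
finally:
    conn.close()
```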

Posted 4 weeks ago

Apply

45.0 - 50.0 years

0 Lacs

India

On-site

Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel.

Age: 45-50 Years
Location: Mumbai

Qualification and Experience:
- Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field.
- Experience with BI tools such as Tableau, Power BI, or similar is an added advantage.
- Minimum 10-15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank.
- Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential.

Key Responsibilities:
1. MIS Strategy & Planning: Develop and implement an effective MIS framework to support strategic and operational objectives. Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow.
2. Data Collection, Processing & Reporting: Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments. Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators. Prepare and submit statutory and compliance reports to RBI, NABARD, the State Registrar, etc.
3. Regulatory & Compliance Reporting: Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF, etc.) are accurate and timely. Track regulatory changes and incorporate them into reporting frameworks.
4. Performance & Operational Dashboards: Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, branch performance, etc. Provide analytics support to business heads for performance analysis and forecasting.
5. Data Governance & Quality: Maintain high standards of data integrity, consistency, and security across systems. Conduct regular audits and validations of MIS data to identify and correct discrepancies.
6. System Enhancement & Automation: Liaise with the IT department and software vendors to automate recurring reports. Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions.
7. Support to Management: Assist senior management with ad hoc analysis, strategic reviews, and Board-level presentations. Provide MIS support for product planning, regulatory inspections, audits, and business strategy.

Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Masjid, Mumbai, Maharashtra: reliably commute or plan to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: Management Information Systems: 10 years (Preferred)
Work Location: In person

Posted 4 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

TCS Hiring for Azure FinOps
Experience: 8 to 10 Years Only
Job Location: Kolkata/Pune

Required Technical Skill Set:
As a Cloud FinOps consultant, you are responsible for developing and implementing a robust program for cloud cost management that includes a service-based cost allocation and classification strategy, tracking and management of cloud cost, cloud services rate setting, and defining consumption and showback/chargeback reports from both the provider and consumer perspective.
- Experienced Cloud FinOps practitioner who can work primarily on Azure, along with AWS platforms, for cloud optimization and cost savings (certification as a FinOps Practitioner is an added advantage).
- Create a cloud cost optimization framework and governance mechanism.
- Takes ownership of cost analysis, reviewing recommendations, creating budget alerts, purchasing reservations/savings plans, tagging, anomaly detection, and forecasting spend.
- Experienced in rightsizing compute services, identifying unused resources, and AHUB implementation.
- Define and set up the cloud spend governance process and cloud spend tracking mechanism.
- Experience driving deep architectural discussions to help customers ensure they are making the most cost-efficient cloud usage choices.
- Interact with vendors on Enterprise Agreements, MCA, discounts, and various other optimization opportunities.
- Create dashboards for visualizing monthly and yearly cost with various filters.
- Drive close working relationships with IT teams, finance teams, architects, and operations.
- Knowledge of IaaS and PaaS services and cloud technologies, for example Databricks, AKS, ADF, Log Analytics, load balancing, and disaster recovery.
- Knowledge of app and data architectures and cloud-native patterns for development.

Kind Regards,
Priyankha M

Posted 4 weeks ago

Apply

5.0 - 9.0 years

3 - 9 Lacs

Chennai

On-site

Technical Lead | Chennai | 5-9 Years | INDIA

Job Family: Practice (Digital)

Job Description (Posting): To be responsible for managing technology in projects and providing technical guidance/solutions for work completion.
(1.) To develop and guide the team members in enhancing their technical capabilities and increasing productivity.
(2.) To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations.
(3.) To be responsible for providing technical guidance/solutions; define, advocate, and implement best practices and coding standards for the team.
(4.) To ensure process compliance in the assigned module and participate in technical discussions/reviews as a technical consultant for feasibility studies (technical alternatives, best packages, supporting architecture best practices, technical risks, breakdown into components, estimations).

Qualification: B-Tech
No. of Positions: 1
Skill (Primary): Oracle (APPS) - Oracle E-Business Suite Technical - ADF
Auto req ID: 1541137BR
Skill Level 3 (Secondary Skill 1): Technical Skills (APPS) - Datawarehouse - Extract Transform Load (ETL) Automation

Posted 4 weeks ago

Apply

5.0 years

15 - 24 Lacs

Bengaluru

On-site

Job Title: Senior Data Engineer – Azure | ADF | Databricks | PySpark | AWS
Location: Bangalore, Hyderabad, Chennai (Hybrid Mode)
Experience Required: 5+ Years
Notice Period: Immediate

Job Description
We are looking for a Senior Data Engineer who is passionate about designing and developing scalable data pipelines, optimizing data architecture, and working with advanced big data tools and cloud platforms. This is a great opportunity to be a key player in transforming data into meaningful insights by leveraging modern data engineering practices on Azure, AWS, and Databricks. You will be working with cross-functional teams including data scientists, analysts, and software engineers to deliver robust data solutions. The ideal candidate will be technically strong in Azure Data Factory, PySpark, Databricks, and AWS services, and will have experience building end-to-end ETL workflows and driving business impact through data.

Key Responsibilities
- Design, build, and maintain scalable and reliable data pipelines and ETL workflows
- Implement data ingestion and transformation using Azure Data Factory (ADF) and Azure Databricks (PySpark)
- Work across multiple data platforms including Azure, AWS, Snowflake, and Redshift
- Collaborate with data scientists and business teams to understand data needs and deliver solutions
- Optimize data storage, processing, and retrieval for performance and cost-effectiveness
- Develop data quality checks and monitoring frameworks for pipeline health (see the sketch below)
- Ensure data governance, security, and compliance with industry standards
- Lead code reviews, set data engineering standards, and mentor junior team members
- Propose and evaluate new tools and technologies for continuous improvement

Must-Have Skills
- Strong programming skills in Python, SQL, or Scala
- Azure Data Factory, Azure Databricks, Synapse Analytics
- Hands-on with PySpark, Spark, Hadoop, Hive
- Experience with cloud platforms (Azure preferred; AWS/GCP acceptable)
- Data warehousing: Snowflake, Redshift, BigQuery
- Strong ETL/ELT pipeline development experience
- Workflow orchestration tools such as Airflow, Prefect, or Luigi
- Excellent problem-solving, debugging, and communication skills

Nice to Have
- Experience with real-time streaming tools (Kafka, Flink, Spark Streaming)
- Exposure to data governance tools and regulations (GDPR, HIPAA)
- Familiarity with ML model integration into data pipelines
- Containerization and CI/CD exposure: Docker, Git, Kubernetes (basic)
- Experience with vector databases and unstructured data handling

Technical Environment
- Programming: Python, Scala, SQL
- Big data tools: Spark, Hadoop, Hive
- Cloud platforms: Azure (ADF, Databricks, Synapse), AWS (S3, Glue, Lambda), GCP
- Data warehousing: Snowflake, Redshift, BigQuery
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- Orchestration: Apache Airflow, Prefect, Luigi
- Tools: Git, Docker, Azure DevOps, CI/CD pipelines

Soft Skills
- Strong analytical thinking and problem-solving abilities
- Excellent verbal and written communication
- Collaborative team player with leadership qualities
- Self-motivated, organized, and able to manage multiple projects

Education & Certifications
- Bachelor's or Master's degree in Computer Science, IT, Engineering, or equivalent
- Cloud certifications (e.g., Microsoft Azure Data Engineer, AWS Big Data) are a plus

Key Result Areas (KRAs)
- Timely delivery of high-performance data pipelines
- Quality of data integration and governance compliance
- Business team satisfaction and data readiness
- Proactive optimization of data processing workloads

Key Performance Indicators (KPIs)
- Pipeline uptime and performance metrics
- Reduction in overall data latency
- Zero critical issues in production post-release
- Stakeholder satisfaction score
- Number of successful integrations and migrations

Job Types: Full-time, Permanent
Pay: ₹1,559,694.89 - ₹2,441,151.11 per year
Benefits: Provident Fund
Schedule: Day shift
Supplemental Pay: Performance bonus
Application Question(s): What is your notice period in days?
Experience: Azure Data Factory, Azure Databricks, Synapse Analytics: 5 years (Required); Python, SQL, or Scala: 5 years (Required)
Work Location: In person
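For illustration, here is a minimal sketch of a rule-based data quality gate of the kind described under "data quality checks"; the storage path, column names, and thresholds are invented for the example.

```python
# Simple rule-based DQ gate sketch; paths, columns, and thresholds are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()
df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

failures = []
if total == 0:
    failures.append("dataset is empty")
if null_keys > 0:
    failures.append(f"{null_keys} rows with null order_id")
if dupes / max(total, 1) > 0.001:   # allow at most 0.1% duplicate keys
    failures.append(f"{dupes} duplicate keys")

if failures:
    # Failing fast keeps bad data out of downstream marts and triggers alerting.
    raise RuntimeError("DQ gate failed: " + "; ".join(failures))
```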

Posted 4 weeks ago

Apply

7.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Work from Office

TECHNICAL SKILLS AND EXPERIENCE
Most important:
- 7+ years of professional experience as a data engineer, with at least 4 utilizing cloud technologies.
- Proven experience building ETL or ELT data pipelines with Databricks, in either Azure or AWS, using PySpark.
- Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Gen2, ADF, etc.).
- Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8.
- Proven experience with unit testing and applying appropriate testing methodologies using libraries such as Pytest, Great Expectations, or similar (see the sketch below).
- Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, PowerShell, and Bash scripting or similar.
- Strong understanding of data modeling, data warehousing, and OLAP concepts.
- Excellent technical documentation skills.
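As an illustration of the Pytest-based testing called for above, here is a sketch that exercises a small PySpark transformation on a local session. The transformation function, column names, and expected values are invented for the example.

```python
# Pytest sketch for a PySpark transformation; names and values are invented.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def add_net_amount(df):
    """Transformation under test: net = gross - discount, never negative."""
    return df.withColumn(
        "net_amount",
        F.greatest(F.col("gross_amount") - F.col("discount"), F.lit(0.0)),
    )

@pytest.fixture(scope="session")
def spark():
    # A small local session keeps unit tests fast and CI-friendly.
    return SparkSession.builder.master("local[2]").appName("tests").getOrCreate()

def test_net_amount_floor_at_zero(spark):
    df = spark.createDataFrame(
        [(100.0, 30.0), (50.0, 80.0)], ["gross_amount", "discount"]
    )
    result = {r.net_amount for r in add_net_amount(df).collect()}
    assert result == {70.0, 0.0}
```

Keeping transformations as pure functions, as here, is what makes them testable in isolation from any cluster or storage account.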

Posted 4 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected
- Code development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project management: Manage the delivery of modules effectively.
- Defect management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
- Release management: Execute and monitor the release process to ensure smooth transitions.
- Design contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
- Customer interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
- Proficiency in SQL for analytics, including windowing functions (see the sketch below).
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments
Tech skills:
- Proficient in Python (including popular Python packages, e.g., Pandas, NumPy) and SQL
- Strong background in distributed data processing and storage (e.g., Apache Spark, Hadoop)
- Large-scale (TBs of data) data engineering skills: model data, create production-ready ETL pipelines
- Development experience with at least one cloud (Azure highly preferred; AWS or GCP)
- Knowledge of data lake and data lakehouse patterns
- Knowledge of ETL performance tuning and cost optimization
- Knowledge of data structures and algorithms, and good software engineering practices

Soft skills:
- Strong communication skills to articulate complex situations concisely
- Comfortable picking up new technologies independently
- Eye for detail, good data intuition, and a passion for data quality
- Comfortable working in a rapidly changing environment with ambiguous requirements

Skills: Python, SQL, AWS, Azure
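As a small illustration of the windowing-function proficiency mentioned above, here is a sketch of a common dedupe-to-latest-record query, written as a string the way it might be submitted from a PySpark job; the schema and table names are hypothetical.

```python
# Keep only the most recent update per customer using ROW_NUMBER().
# Table and column names are hypothetical.
DEDUP_LATEST_SQL = """
SELECT *
FROM (
    SELECT t.*,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY updated_at DESC
           ) AS rn
    FROM raw.customer_updates t
) ranked
WHERE rn = 1
"""

# Typical use from a Spark job once the source table is registered:
#     spark.sql(DEDUP_LATEST_SQL).write.mode("overwrite").saveAsTable("curated.customers")
if __name__ == "__main__":
    print(DEDUP_LATEST_SQL)
```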

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Location: Hyderabad, Indore and Ahmedabad (India)

What You Will Do (high-level responsibilities, but not limited to):
- Develop and maintain data pipelines using Azure Data Factory (ADF) / Databricks for data integration and ETL processes.
- Design, implement, and optimize Power BI / Fabric reports and dashboards to deliver actionable business insights.
- Collaborate with data analysts, business users, and other teams to understand data requirements and deliver solutions using ADF and Power BI.
- Extract, transform, and load (ETL) data from various sources into cloud-based storage systems such as Azure Data Lake or Azure SQL Database.
- Work with large datasets and optimize queries and pipelines for performance and scalability.
- Ensure data quality, integrity, and availability throughout the data lifecycle.
- Automate repetitive data tasks, ensuring timely and accurate reporting.
- Monitor and troubleshoot data pipelines, addressing any performance or data issues promptly.
- Support data visualization and reporting tools, including Power BI, to enable business stakeholders to make data-driven decisions.
- Write clear, efficient, and maintainable code for data transformations and automation.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 8+ years of hands-on experience as a Data Engineer, BI Developer, or in a similar role.
- Proficiency with Azure Data Factory (ADF), including the creation of data pipelines and managing data flows.
- Strong experience in Power BI, including report creation, dashboard development, and data modeling.
- Experience with SQL and database management (e.g., Azure SQL Database, SQL Server).
- Knowledge of cloud platforms, especially Microsoft Azure.
- Familiarity with data warehousing concepts and ETL processes.
- Experience working with cloud-based data storage solutions (e.g., Azure Data Lake, Azure Blob Storage).
- Strong programming skills in languages such as Python and SQL.
- Ability to troubleshoot and optimize data pipelines for performance and reliability.

Preferred Qualifications:
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other cloud platforms.

What we need: B.Tech in computer science or equivalent.

Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive compensation and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Role: Azure Integration Engineer
Experience: 5 to 8 years
Location: Indore
Immediate joiners preferred

Must Have:
- Proficiency in Azure Logic Apps, Azure API Management, Azure Service Bus, Azure Event Grid, ADF, C#/.NET, and Azure Functions
- Experience with JSON, XML, and other data formats
- Working experience with Azure DevOps and GitHub
- Knowledge of integration monitoring and lifecycle management

Roles & Responsibilities:
- Designing, developing, and deploying integration workflows using Azure Logic Apps
- Creating and managing APIs using Azure API Management
- Developing event-driven solutions with Azure Event Grid and Azure Service Bus
- Building serverless functions with Azure Functions to support integration logic (see the sketch below)
- Developing data transformations and mappings
- Implementing integration patterns such as API integration, message queuing, and event-driven architecture
- Working with different data formats (e.g., JSON, XML) and protocols (SOAP, REST, etc.)
- Performing unit testing, helping with integration testing, and supporting UAT
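For illustration, here is a minimal HTTP-triggered Azure Function in the Python v1 programming model, of the kind used to support integration logic. The payload fields and the mapping are invented, and the trigger binding would live in the accompanying function.json.

```python
# Minimal HTTP-triggered Azure Function sketch (v1 Python programming model).
# Receives a JSON payload, reshapes it, and returns the canonical message.
# Field names are illustrative; binding configuration lives in function.json.
import json
import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Integration request received")
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON body", status_code=400)

    # Example mapping step: canonicalize an inbound order message.
    canonical = {
        "orderId": payload.get("order_no"),
        "amount": payload.get("total", 0),
        "currency": payload.get("currency", "INR"),
    }
    return func.HttpResponse(
        json.dumps(canonical),
        mimetype="application/json",
        status_code=200,
    )
```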

Posted 4 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Greetings from TCS!
TCS is hiring for Azure Data Engineer
Experience: 8-10 years
Location: Kolkata/Pune/Mumbai/Bangalore

Must-Have
- Strong experience in Azure Data Factory, ADB (Azure Databricks), Synapse, and PySpark; establishing cloud connectivity between different systems like ADLS, ADF, Synapse, Databricks, etc.
- A minimum of 7 years' experience with large SQL data marts.
- Expert relational database experience; the candidate should demonstrate the ability to navigate through massive volumes of data to deliver effective and efficient data extraction, design, load, and reporting solutions to business partners.
- Minimum 7 years of troubleshooting and supporting large databases and testing activities; identifying, reporting, and managing database security issues; user access/management; designing database backup, archiving, and storage; performance tuning; ETL import of large volumes of data extracted from multiple systems; capacity planning.
- Experience in T-SQL programming along with the Azure Data Factory framework and Python scripting.
- Works well independently as well as within a team.
- Proactive and organized, with excellent analytical and problem-solving skills.
- Flexible and willing to learn; a can-do attitude is key.
- Strong verbal and written communication skills.

Good-to-Have
- Financial institution data mart experience is an asset.
- Experience in .NET applications is an asset.
- Experience and expertise in Tableau-driven dashboard design is an asset.

Responsibility of / Expectations from the Role
- Azure Data Engineer (ADF, ADB): ETL processes using frameworks like Azure Data Factory, Synapse, or Databricks; establishing cloud connectivity between different systems like ADLS, ADF, Synapse, Databricks, etc.
- T-SQL programming along with the Azure Data Factory framework and Python scripting.

Posted 4 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Lead Technical Architect (Strategy & Optimization - Data Lake & Analytics)

Responsibilities:
- Manage project delivery: scope, timelines, budget, resource allocation, and risk mitigation.
- Develop and maintain robust data ingestion pipelines (batch, streaming, API).
- Provide architectural inputs during incident escalations and act as the final authority for RCA documentation and closure across ADF, Power BI, and Databricks, including initial data collection for RCA.
- Define and enforce data governance, metadata, and quality standards across zones.
- Monitor performance, optimize data formats (e.g., Parquet), and tune for cost-efficiency. Tune query performance for Databricks and Power BI datasets using optimization techniques (e.g., caching, BI Engine, materialized views).
- Lead and mentor a team of data engineers, fostering skills in Azure services and DevOps. Guide schema designs for new datasets and integrations aligned with Diageo's analytics strategy.
- Coordinate cross-functional stakeholders (security, DevOps, business) for aligned execution.
- Oversee incident and change management with SLA adherence and continuous improvement. Serve as the governance owner for SLA compliance, IAM policies, encryption standards, and data retention strategies.
- Ensure compliance with policies (RBAC, ACLs, encryption) and regulatory audits.
- Report project status, KPIs, and business value to senior leadership. Lead monthly and quarterly reviews, presenting insights, improvements, and roadmap alignment to Diageo stakeholders.

Required Skills
- Strong architecture-level expertise in the Azure data platform (ADLS, ADF, Databricks, Synapse, Power BI).
- Deep understanding of data lake zone structuring, data lineage, metadata governance, and compliance (e.g., GDPR, ISO).
- Expert in Spark, PySpark, SQL, JSON, and automation tooling (ARM, Bicep; Terraform optional).
- Capable of aligning technical designs with business KPIs and change control frameworks.
- Excellent stakeholder communication, team mentoring, and leadership capabilities.

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

India

Remote

Job Title: Senior BI Developer (Microsoft BI Stack)
Location: Remote
Experience: 5+ years
Employment Type: Full-Time

Job Summary
We are looking for an experienced Senior BI Developer with strong expertise in the Microsoft BI Stack (SSIS, SSRS, SSAS) to join our dynamic team. The ideal candidate will design and develop scalable BI solutions and contribute to strategic decision-making through efficient data modeling, ETL processes, and insightful reporting.

Key Responsibilities
- Design and develop ETL packages using SSIS for data extraction, transformation, and loading from diverse sources.
- Create and maintain dashboards and reports using SSRS and Power BI (if applicable).
- Implement and manage OLAP cubes and data models using SSAS (Multidimensional/Tabular).
- Develop and optimize complex T-SQL queries, stored procedures, and functions.
- Work closely with business analysts, data engineers, and stakeholders to gather requirements and translate them into technical solutions.
- Optimize BI solutions for performance and scalability.
- Lead BI architecture improvements and ensure efficient data flow.
- Ensure data quality, integrity, and consistency across systems.
- Mentor and support junior BI developers as needed.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience with the Microsoft BI Stack: SSIS, SSRS, SSAS.
- Strong knowledge of SQL Server (2016 or later) and advanced T-SQL.
- Deep understanding of data warehousing concepts, including star/snowflake schemas and fact/dimension models.
- Experience with Power BI is a plus.
- Exposure to Azure Data Services (ADF, Azure SQL, Synapse) is an added advantage.
- Strong analytical, troubleshooting, and problem-solving skills.
- Excellent verbal and written communication skills.

Why Join Us?
- Opportunity to work on enterprise-scale BI projects.
- Supportive work environment with career growth potential.
- Exposure to modern BI tools and cloud technologies.

Skills: Azure, Power BI, SSIS, T-SQL, SSRS, data, SSAS, SQL, SQL Server, Azure Data Services

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Information Technology expert with 5+ years of banking domain knowledge.
Banking application development and implementation experience across offshore & onshore models, including but not limited to solution design, development, and support activities.
4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for Lending & Deposit products and Origination Workflow, covering online as well as batch solutions and integrations.
Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion & OBP technologies.
Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology.
3+ years of experience with automation testing using tools like Selenium.
Building framework components and business process patterns.
Working closely with the Technical Solution Architect and Process Functional Lead.
Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices

Key Contributions:
Solution analysis & redesign of application components to stabilize the OBP platform, mainly for the Oracle BPM, Oracle SOA & Oracle ADF technology components.
Solution design and implementation for application retrofits and migration to new Oracle hardware.
Key bug-fixes, solution design, and reviews with Oracle Banking Platform product teams.
Automation design in various solution components within the platform.
Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solutions for the OBP Platform.
Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements.
Multi-environment deployment and testing coordination for live system changes.

What’s in it for you?
We are not just a technology company full of people, we’re a people company full of technology. It is people like you who make us what we are today. Welcome to our world: our people, our culture, our voices, and our passions. What’s better than building the next big thing? It’s doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture: an environment in which ideas can flourish and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together.

EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin

At Infosys, we recognize that everyone has individual requirements. If you are a person with a disability, illness, or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team at Infosys_ta@infosys.com, or include your preferred method of communication in your email and someone will be in touch. Please note that, in order to protect the interests of all parties involved in the recruitment process, Infosys does not accept unsolicited resumes from third-party vendors. In the absence of a signed agreement, any submission will be deemed non-binding, and Infosys explicitly reserves the right to pursue and hire the submitted profile.
All recruitment activity must be coordinated through the Talent Acquisition department.
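Since automation testing with Selenium appears both in the requirements and in the key contributions above, a minimal sketch follows. It uses Selenium's Python bindings to keep all examples on this page in one language (teams on this stack often use the Java bindings instead); the URL and element IDs are invented.

```python
# Hypothetical login smoke test with Selenium (Python bindings).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")  # placeholder URL
    driver.find_element(By.ID, "username").send_keys("demo")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()
    # A reusable framework component would wrap checks like this into
    # business-process patterns rather than inline assertions.
    assert "Dashboard" in driver.title
finally:
    driver.quit()
```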

Posted 4 weeks ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Candidates with 8+ years of experience in the IT industry and with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.

Must have:
Primary skills: 8+ years of hands-on development experience with:
• C#, .NET Core 6/8+, Entity Framework / EF Core
• JavaScript, jQuery, REST APIs
• Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Skilled in unit testing with XUnit and MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership

2+ years of hands-on experience with Azure Cloud Services, including:
• Azure Functions
• Azure Durable Functions
• Azure Service Bus, Event Grid, Storage Queues
• Blob Storage, Azure Key Vault, SQL Azure
• Application Insights, Azure Monitoring

Nice to have:
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps - CI/CD pipelines (Classic / YAML)

Working hours: 8 hours, with a 4-hour overlap with the EST time zone (12 PM - 9 PM). This overlap is mandatory, as meetings happen during these hours.
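The posting is C#/.NET-centric; to keep the examples on this page in a single language, here is a hedged Python sketch of one Azure service it lists, Blob Storage, using the azure-storage-blob SDK. The connection string, container, and blob names are placeholders.

```python
# Minimal sketch: upload a file to Azure Blob Storage.
# All names below are invented for illustration.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("invoices")  # placeholder container

with open("report.json", "rb") as data:
    # overwrite=True replaces an existing blob of the same name.
    container.upload_blob(name="2024/report.json", data=data, overwrite=True)
```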

Posted 4 weeks ago

Apply

0.0 - 10.0 years

0 Lacs

Masjid, Mumbai, Maharashtra

On-site

Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel.

Age: 45-50 years
Location: Mumbai

Qualification and Experience:
Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field.
Experience with BI tools such as Tableau, Power BI, or similar is an added advantage.
Minimum 10–15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank.
Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential.

Key Responsibilities:
1. MIS Strategy & Planning
Develop and implement an effective MIS framework to support strategic and operational objectives.
Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow.
2. Data Collection, Processing & Reporting
Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments.
Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators.
Prepare and submit statutory and compliance reports to RBI, NABARD, the State Registrar, etc.
3. Regulatory & Compliance Reporting
Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF, etc.) are accurate and timely.
Track regulatory changes and incorporate them into reporting frameworks.
4. Performance & Operational Dashboards
Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, branch performance, etc.
Provide analytics support to business heads for performance analysis and forecasting.
5. Data Governance & Quality
Maintain high standards of data integrity, consistency, and security across systems.
Conduct regular audits and validations of MIS data to identify and correct discrepancies.
6. System Enhancement & Automation
Liaise with the IT department and software vendors to automate recurring reports.
Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions.
7. Support to Management
Assist senior management with ad-hoc analysis, strategic reviews, and Board-level presentations.
Provide MIS support for product planning, regulatory inspections, audits, and business strategy.

Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Masjid, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: Management Information Systems: 10 years (Preferred)
Work Location: In person
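Of the responsibilities above, the dashboard/KPI work (item 4) is the most concrete. As an illustrative-only sketch of the kind of metric such a dashboard tracks, here is a branch-level NPA ratio computed in SQL through Python's built-in sqlite3 so it is self-contained; the schema and figures are entirely invented.

```python
# Invented example: percentage of outstanding loan value classified as NPA,
# per branch; the kind of KPI a banking MIS dashboard might surface.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (branch TEXT, outstanding REAL, is_npa INTEGER);
    INSERT INTO loans VALUES
        ('Fort',    500.0, 0), ('Fort',    120.0, 1),
        ('Andheri', 300.0, 0), ('Andheri',  45.0, 1);
""")
rows = conn.execute("""
    SELECT branch,
           ROUND(100.0 * SUM(CASE WHEN is_npa = 1 THEN outstanding END)
                 / SUM(outstanding), 2) AS npa_pct
    FROM loans
    GROUP BY branch;
""").fetchall()
for branch, npa_pct in rows:
    print(branch, npa_pct)  # e.g., Andheri 13.04, Fort 19.35
```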

Posted 4 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Information Technology expert with 5+ years of banking domain knowledge.
Banking application development and implementation experience across offshore & onshore models, including but not limited to solution design, development, and support activities.
4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for Lending & Deposit products and Origination Workflow, covering online as well as batch solutions and integrations.
Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion & OBP technologies.
Working experience with different SDLC phases, from analysis, design, and development to production support, using Agile methodology.
3+ years of experience with automation testing using tools like Selenium.
Building framework components and business process patterns.
Working closely with the Technical Solution Architect and Process Functional Lead.
Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices

Key Contributions:
Solution analysis & redesign of application components to stabilize the OBP platform, mainly for the Oracle BPM, Oracle SOA & Oracle ADF technology components.
Solution design and implementation for application retrofits and migration to new Oracle hardware.
Key bug-fixes, solution design, and reviews with Oracle Banking Platform product teams.
Automation design in various solution components within the platform.
Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solutions for the OBP Platform.
Application support and enhancements: production issue root cause analysis and solution design for support fixes and enhancements.
Multi-environment deployment and testing coordination for live system changes.

What’s in it for you?
We are not just a technology company full of people, we’re a people company full of technology. It is people like you who make us what we are today. Welcome to our world: our people, our culture, our voices, and our passions. What’s better than building the next big thing? It’s doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture: an environment in which ideas can flourish and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together.

EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin

At Infosys, we recognize that everyone has individual requirements. If you are a person with a disability, illness, or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team at Infosys_ta@infosys.com, or include your preferred method of communication in your email and someone will be in touch. Please note that, in order to protect the interests of all parties involved in the recruitment process, Infosys does not accept unsolicited resumes from third-party vendors. In the absence of a signed agreement, any submission will be deemed non-binding, and Infosys explicitly reserves the right to pursue and hire the submitted profile.
All recruitment activity must be coordinated through the Talent Acquisition department.

Posted 4 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies