
984 ADF Jobs - Page 6

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 10.0 years

0 Lacs

Masjid, Mumbai, Maharashtra

On-site

Indeed logo

Bombay Mercantile Co-Operative Bank Ltd., a leading Multi-State Scheduled Bank with 52 branches across 10 states, requires dynamic and experienced personnel.

Age: 45-50 years
Location: Mumbai

Qualification and Experience: Graduate/Postgraduate in Computer Science, Information Systems, Data Analytics, Statistics, or a related field. Experience with BI tools such as Tableau, Power BI, or similar is an added advantage. Minimum 10–15 years of relevant experience in MIS, with at least 5 years in a leadership role, preferably in a cooperative or public sector bank. Knowledge of CBS systems, RBI reporting portals, data analytics tools, and SQL/database management is essential.

Key Responsibilities:
1. MIS Strategy & Planning: Develop and implement an effective MIS framework to support strategic and operational objectives. Ensure integration of MIS with the Core Banking System (CBS), Loan Origination System (LOS), and other internal systems for seamless data flow.
2. Data Collection, Processing & Reporting: Design, standardize, and maintain reporting formats for daily, weekly, monthly, and quarterly reporting across departments. Ensure timely generation of reports for internal management, the Board of Directors, auditors, and regulators. Prepare and submit statutory and compliance reports to RBI, NABARD, the State Registrar, etc.
3. Regulatory & Compliance Reporting: Ensure all RBI-mandated MIS submissions (e.g., CRILC, XBRL, returns under ADF, etc.) are accurate and timely. Track regulatory changes and incorporate them into reporting frameworks.
4. Performance & Operational Dashboards: Develop real-time dashboards and KPIs for key functions such as credit, deposits, NPA tracking, branch performance, etc. Provide analytics support to business heads for performance analysis and forecasting.
5. Data Governance & Quality: Maintain high standards of data integrity, consistency, and security across systems. Conduct regular audits and validations of MIS data to identify and correct discrepancies.
6. System Enhancement & Automation: Liaise with the IT department and software vendors to automate recurring reports. Support implementation of business intelligence (BI) tools, data warehouses, and report automation solutions.
7. Support to Management: Assist senior management with ad-hoc analysis, strategic reviews, and Board-level presentations. Provide MIS support for product planning, regulatory inspections, audits, and business strategy.

Job Type: Full-time
Schedule: Day shift
Ability to commute/relocate: Masjid, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: Management Information Systems: 10 years (Preferred)
Work Location: In person

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Information Technology expert with 5+ years of banking domain knowledge. Banking application development and implementation experience across offshore and onshore models, including but not limited to solution design, development, and support activities.

4+ years of application design and development experience with the Oracle Banking Platform (OBP) product for Lending & Deposit products and Origination Workflow, covering online as well as batch solutions and integrations. Hands-on experience with Java, J2EE, ADF, SOA, OSB, Oracle Fusion, and OBP technologies. Working experience with the different SDLC phases, from analysis, design, and development through production support, using Agile methodology. 3+ years of experience with automation testing using tools like Selenium, including building framework components and business process patterns. Works closely with the Technical Solution Architect and process Functional Lead.

Technologies: Java, J2EE, Oracle BPM, SOA, JBoss BPM, API, Microservices

Key contributions: Solution analysis and redesign of application components to stabilize the OBP platform, mainly for the Oracle BPM, Oracle SOA, and Oracle ADF technology components. Solution design and implementation for application retrofits and migration to new Oracle hardware. Key bug fixes, solution design, and reviews with Oracle Banking Platform product teams. Automation design across various solution components. Involved in the implementation of the Oracle BPM, OBP Host, and OBP UI solutions for the OBP Platform. Application support and enhancements: production issue root-cause analysis and solution design for support fixes and enhancements. Multi-environment deployment and testing coordination for live system changes.

What’s in it for you? We are not just a technology company full of people, we’re a people company full of technology. It is people like you who make us what we are today.
Welcome to our world: Our people, our culture, our voices, and our passions. What’s better than building the next big thing? It’s doing so while never letting go of the little things that matter. None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment in which to do them, one where ideas can flourish, and where you are empowered to move forward as far as your ideas will take you. This is something we achieve through cultivating a culture of inclusiveness and openness, and a mindset of exploration and applied innovation. A career at Infosys means experiencing and contributing to this environment every day. It means being a part of a dynamic culture where we are united by a common purpose: to navigate further, together. EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin At Infosys, we recognize that everyone has individual requirements. If you are a person with disability, illness or injury and require adjustments to the recruitment and selection process, please contact our Recruitment team for adjustment only on Infosys_ta@infosys.com or include your preferred method of communication in email and someone will be in touch. Please note in order to protect the interest of all parties involved in the recruitment process, Infosys does not accept any unsolicited resumes from third party vendors. In the absence of a signed agreement any submission will be deemed as non-binding and Infosys explicitly reserves the right to pursue and hire the submitted profile. All recruitment activity must be coordinated through the Talent Acquisition department.

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

Linkedin logo

Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation, and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore, serving a client base across banking and financial services, insurance, information technology, healthcare, retail, and supply chain.

Singapore, onsite. Looking only for candidates available at short notice.

Job Description: We are looking for an experienced Oracle IAM Consultant to join our dynamic team.

Design and Implementation: Designing and implementing IAM solutions using OIM/OIG. Developing custom connectors to integrate OIM with various applications and systems. Building and configuring OIM workflows, approval policies, and entitlements. Developing custom UI components for OIM self-service pages.

Skills and Experience: Experienced in end-to-end integration of an IAM solution using Oracle Identity Governance. Prior experience with requirement gathering, analysis, design, development, maintenance, and upgrades across environments such as DEV, QA, UAT, and PROD. Experience with ICF-based framework connectors to integrate with target applications, perform CRUD operations, and manage roles on the target system. Extensive hands-on experience with custom code development such as event handlers, validation plugins, and scheduled tasks using the Java API. Experience with audit reports in OIM BI Publisher, including customizing the logo and header of UI screens and audit reports. Implement Oracle ADF customizations for user interfaces. Build custom Oracle SOA composites for workflows.

Java Experience: Best-practice-based secure Java development. Exposure and hands-on experience with REST APIs and web services. Ability to reuse existing code and extend frameworks.

Administration and Management: Administering and managing OIM environments. Ensuring the IAM platform is secure, scalable, and supports business requirements. Monitoring the performance and health of IAM systems.

Security and Compliance: Developing and enforcing IAM policies and procedures. Collaborating with security teams to address vulnerabilities.

Support and Troubleshooting: Supporting end users with access-related issues and requests. Troubleshooting and resolving technical issues related to the OIM implementation.

Good to Have: Hands-on experience with Oracle Access Manager. Good understanding of AS400 and relevant infrastructure. Unix scripting. Strong SQL knowledge.

WHAT’S ON OFFER: You will be remunerated with an excellent base salary and entitled to attractive company benefits. Additionally, you will get the opportunity to enjoy a fun and collaborative work environment, alongside strong career progression.

To submit your application, please apply online or email your UPDATED CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality.

CONSULTANT DETAILS: Consultant Name: Swathi. Avensys Consulting Pte Ltd, EA Licence 12C5759.

Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.
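The connector work described above boils down to exposing create/read/update/delete operations against a target application. Real OIM/ICF connectors are written in Java against Oracle's ICF SPI; the sketch below is an illustrative Python stand-in for that CRUD contract only, with an in-memory dict playing the role of the target system, and all names hypothetical.

```python
# Illustrative sketch only: real OIM/ICF connectors are Java classes built on
# Oracle's Identity Connector Framework. This mirrors the CRUD contract such a
# connector implements, with an in-memory dict as the "target application".
class TargetConnector:
    def __init__(self):
        self._accounts = {}   # uid -> attribute dict
        self._next_uid = 1

    def create(self, attributes):
        """Provision an account on the target; return its unique id."""
        uid = str(self._next_uid)
        self._next_uid += 1
        self._accounts[uid] = dict(attributes)
        return uid

    def read(self, uid):
        """Reconciliation-style lookup of an account by uid."""
        return self._accounts.get(uid)

    def update(self, uid, changes):
        """Apply attribute changes (e.g. role grants) to an account."""
        self._accounts[uid].update(changes)
        return self._accounts[uid]

    def delete(self, uid):
        """Deprovision the account."""
        self._accounts.pop(uid, None)
```

A provisioning flow would then call `create` on access approval, `update` on role changes, and `delete` on revocation, with reconciliation periodically reading accounts back.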

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Job Title: ADF Data Engineer (Technical Analyst)

Experience: 5 to 10 years.

Responsibilities:
• Convert Workato recipes into Azure Data Factory (ADF) pipelines to facilitate seamless data integration.
• Design, develop, and maintain ADF pipelines to connect and orchestrate data flow between Snowflake and Salesforce.
• Collaborate with cross-functional teams to understand data requirements and ensure efficient data integration.
• Optimize data pipelines for performance, scalability, and reliability.
• Implement data quality checks and monitoring to ensure data accuracy and consistency.
• Troubleshoot and resolve issues related to data integration and pipeline performance.
• Document data integration processes and maintain up-to-date technical documentation.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience as a Data Engineer, with a focus on Azure Data Factory, data pipelines, and data integration.
• Strong proficiency in SQL and experience working with Snowflake and Salesforce.
• Knowledge of ETL/ELT processes and best practices.
• Familiarity with data warehousing concepts and cloud-based data solutions.
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills.

Preferred Qualifications:
• Experience with other Azure services such as Azure Data Lake, Azure Synapse Analytics, and Azure Functions.
• Certification in Azure Data Engineering or related fields.
• Experience with version control systems like Git.
• Experience with Workato or similar integration platforms.
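The Snowflake-to-Salesforce orchestration described above is authored in ADF as JSON pipeline definitions. As a rough sketch of what one Copy activity looks like, the Python below builds such a definition as a plain dict; the pipeline, dataset, and activity names are hypothetical, and in practice the JSON would be deployed through the Azure SDK, ARM templates, or the ADF Studio UI rather than assembled by hand.

```python
import json

# Hypothetical names throughout; a real pipeline references linked services
# and datasets already defined inside the Data Factory instance.
def build_copy_pipeline(source_dataset, sink_dataset):
    """Return a minimal ADF pipeline definition with one Copy activity
    moving rows from a Snowflake dataset into a Salesforce dataset."""
    return {
        "name": "CopySnowflakeToSalesforce",
        "properties": {
            "activities": [
                {
                    "name": "CopyAccounts",
                    "type": "Copy",
                    "inputs": [{"referenceName": source_dataset,
                                "type": "DatasetReference"}],
                    "outputs": [{"referenceName": sink_dataset,
                                 "type": "DatasetReference"}],
                    "typeProperties": {
                        "source": {"type": "SnowflakeSource"},
                        # Upsert avoids duplicate records on re-runs.
                        "sink": {"type": "SalesforceSink",
                                 "writeBehavior": "Upsert"},
                    },
                }
            ]
        },
    }

pipeline = build_copy_pipeline("SnowflakeAccounts", "SalesforceAccounts")
print(json.dumps(pipeline, indent=2))
```

Converting a Workato recipe largely means mapping each recipe step to an activity of this shape and chaining them with activity dependencies.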

Posted 1 week ago

Apply

8.0 years

2 - 4 Lacs

Bengaluru

On-site

GlassDoor logo

If you are looking for a challenging and exciting career in the world of technology, then look no further. Skyworks is an innovator of high performance analog semiconductors whose solutions are powering the wireless networking revolution. At Skyworks, you will find a fast-paced environment with a strong focus on global collaboration, minimal layers of management, and the freedom to make meaningful contributions in a setting that encourages creativity and out-of-the-box thinking. Our work culture values diversity, social responsibility, open communication, mutual trust, and respect. We are excited about the opportunity to work with you and glad you want to be part of a team of talented individuals who together can change the way the world communicates.

Requisition ID: 75243

Description: We are seeking a highly skilled and experienced Sr. Principal Enterprise Integration Architect to join our team. The ideal candidate will have a strong background in enterprise integration architecture and extensive experience working with global teams. This role is critical in ensuring that all applications company-wide are managed as a world-class portfolio. The Sr. Principal Enterprise Integration Architect will play a pivotal role in designing, architecting, developing, and supporting integration solutions globally.

Responsibilities: Lead the design and implementation of enterprise integration solutions using Azure iPaaS or similar middleware tools. Collaborate with global teams to ensure seamless integration of applications across the company. Develop and maintain integration architecture standards and best practices. Manage the integration portfolio, ensuring all applications are aligned with the company's strategic goals. Provide technical leadership and guidance to the integration team. Oversee the development and support of integration solutions, ensuring high availability and performance. Conduct regular reviews and assessments of integration solutions to identify areas for improvement. Work closely with stakeholders to understand business requirements and translate them into technical solutions. Ensure compliance with security and regulatory requirements in all integration solutions.

Required Experience and Skills: Minimum of 8 years of experience in enterprise architecture, integration, software development, or a related field. Exposure to native cloud platforms such as Azure, AWS, or GCP, and experience with them at scale. Integration and data feeds: support and maintain existing integrations and data feeds, ensuring seamless data flow and system integration. Azure iPaaS development (or a similar middleware product): design, develop, and maintain applications using Azure Integration Platform as a Service (iPaaS) components such as Logic Apps, Azure Data Factory (ADF), Function Apps, and APIM. SQL and ETL processes: develop and optimize SQL queries and manage database systems for ETL processes. DevOps management: implement and manage DevOps pipelines using Git, Jenkins, Azure DevOps, and GitHub; support and maintain Jenkins servers, Azure DevOps, and GitHub. Proven experience working with global teams and managing cross-functional projects. Excellent understanding of integration design principles and best practices. Strong leadership and communication skills. Ability to manage multiple projects and priorities simultaneously. Experience with integration tools and platforms such as API management, ESB, and ETL. Knowledge of security and regulatory requirements related to integration solutions. Strong problem-solving and analytical skills.

Desired Experience and Skills: Experience in managing large-scale integration projects. Familiarity with other cloud platforms and technologies. Knowledge of DevOps practices and tools. Experience with Agile methodologies. Certification in Azure or related technologies. Strong understanding of business processes and how they relate to integration solutions.

Skyworks is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Manager

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. Ensuring a streamlined, end-to-end Oracle NetSuite implementation that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
1. Lead a team of NetSuite developers, providing guidance, mentorship, and technical expertise to ensure high-quality deliverables and project success.
2. Define technical architecture and design standards for NetSuite solutions, ensuring scalability, performance, and maintainability.
3. Stay updated on emerging technologies and best practices in NetSuite development, driving innovation and continuous improvement within the team.
4. Manage end-to-end technical projects for NetSuite implementations, upgrades, and customizations, ensuring adherence to scope, budget, and timeline.
5. Develop project plans, resource allocation strategies, and risk mitigation plans, and monitor project progress to identify and address issues proactively.
6. Lead the development and customization of NetSuite solutions, including SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development.
7. Collaborate with functional consultants to translate business requirements into technical solutions, ensuring alignment with best practices and industry standards.
8. Serve as a technical liaison between the development team and clients, providing technical expertise, addressing concerns, and managing expectations.
9. Participate in client meetings and workshops to understand their technical requirements, propose solutions, and provide updates on project status.

Mandatory Skill Sets: NetSuite
Preferred Skill Sets: NetSuite

Qualifications:
1. Bachelor’s degree in Computer Science, Information Technology, or a related field.
2. 2 years of hands-on experience in NetSuite development, customization, and integration.
3. Expertise in the NetSuite SuiteScript, SuiteFlow, SuiteBuilder, and SuiteCloud development platforms.
4. NetSuite certifications such as SuiteFoundation, SuiteCloud Developer, or SuiteCommerce Advanced are highly desirable.

Years of experience required: Minimum 2 years as a NetSuite expert.
Education Qualification: Graduate/Postgraduate.
Degrees/Field of Study required: Bachelor Degree, Master Degree. Degrees/Field of Study preferred: (not specified). Certifications: (not specified).
Required Skills: Oracle NetSuite
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more}
Desired Languages: (not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

FasCave IT Solutions Pvt. Ltd. is looking for a Data Engineer to join our team on a 10-12 month contract basis for one of the best healthcare companies as a client. If you have expertise in data modeling and engineering, this opportunity is for you!

Position: Data Engineer
Location: Remote
Duration: 10-12 Months (Contract)
Experience: 4-10 Years
Shift Time: Australian Shift (5 AM to 1 PM IST)

Key Requirements: Strong SQL skills, Snowflake, Azure Data Factory (ADF), Power BI, SSIS (nice to have)

📩 How to Apply? Send your resume to hrd@fascave.com with the following details:
- Years of Experience
- Current CTC
- Expected CTC
- Earliest Joining Date

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Role: Senior Data Engineer
Experience: 10+ years
Location: Bangalore | Gurgaon
Notice Period: Immediate joiners only

Job Description – Data Engineer (Azure, ADF, Databricks, PySpark, SCD, Unity Catalog, SQL)

Required Skills & Qualifications: 6+ years of experience in Data Engineering with a focus on Azure technologies. Expertise in Azure Data Factory (ADF) and Azure Databricks for ETL/ELT workflows. Strong knowledge of Delta Tables and Unity Catalog for efficient data storage and management. Experience with Slowly Changing Dimensions (SCD2) implementation in Delta Lake. Proficiency in PySpark for large-scale data processing and transformation. Hands-on experience with SQL and performance tuning for data pipelines. Understanding of data governance, security, and compliance best practices in Azure. Knowledge of CI/CD and DevOps practices for data pipeline automation.

Preferred Qualifications: Experience with Azure Synapse Analytics, Data Lakes, and Power BI integration. Knowledge of Kafka or Event Hub for real-time data ingestion. Certifications in Microsoft Azure (DP-203, DP-900) or Databricks are a plus.
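The SCD2 pattern mentioned above keeps full history by closing the current dimension row and inserting a new version whenever a tracked attribute changes. In Delta Lake this is typically expressed as a MERGE on a Spark cluster, but the core logic can be sketched in plain Python (field names such as `customer_id` and `city` are illustrative only):

```python
# Pure-Python sketch of SCD2 (Slowly Changing Dimension, type 2) logic that a
# Delta Lake MERGE would express. Each dimension row carries effective_from,
# effective_to, and an is_current flag; a changed attribute closes the current
# row and appends a new version.
def apply_scd2(dimension, updates, today):
    """dimension/updates: lists of dicts keyed by 'customer_id'."""
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for upd in updates:
        old = current.get(upd["customer_id"])
        if old is not None and old["city"] == upd["city"]:
            continue  # no change: keep the current row as-is
        if old is not None:
            old["is_current"] = False        # close the old version
            old["effective_to"] = today
        dimension.append({                   # insert the new version
            "customer_id": upd["customer_id"],
            "city": upd["city"],
            "effective_from": today,
            "effective_to": None,
            "is_current": True,
        })
    return dimension

dim = [{"customer_id": 1, "city": "Pune",
        "effective_from": "2023-01-01", "effective_to": None,
        "is_current": True}]
apply_scd2(dim, [{"customer_id": 1, "city": "Bengaluru"}], "2024-06-01")
# dim now holds two rows: the closed Pune version and the current Bengaluru one.
```

In Databricks the same condition set maps onto `MERGE INTO ... WHEN MATCHED ... WHEN NOT MATCHED`, with Unity Catalog governing access to the dimension table.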

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Software Development Engineering

What does a great Software Development Engineer do? As a Software Development Engineer, your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for development, unit testing, and integration tasks within this highly visible, client-focused web services application. Development efforts will also include feature enhancements, client implementations, and bug fixes, as well as support of the production environment.

What You Will Do: Collaborate within a team environment in the development, testing, and support of software development project lifecycles. Develop web interfaces and underlying business logic. Prepare any necessary technical documentation. Track and report daily and weekly activities. Participate in code reviews and code remediation. Perform and develop proper unit tests and automation. Participate in a 24-hour on-call rotation to support previous releases of the product. Research problems discovered by QA or product support and develop solutions to the problems. Perform additional duties as determined by business needs and as directed by management.

What You Will Need To Have: Bachelor’s degree in Computer Science, Engineering, or Information Technology, or equivalent experience. 3-5 years of experience in developing scalable and secure J2EE applications. Excellent knowledge of Java-based technologies (Core Java, JSP, AJAX, JSF, EJB, and the Spring Framework), Oracle SQL/PLSQL, and app servers like WebLogic and JBoss. Excellent knowledge of SOAP and REST web service implementations. Knowledge of the UNIX environment is preferred. Experience with JSF UI component technologies (Oracle ADF and RichFaces) is preferred. Good analytical, organizational, and problem-solving abilities. Good at prioritizing tasks and committed to completing them. Strong team player with a customer service orientation. Demonstrated ability to work with both end users and technical staff. Ability to track progress against assigned tasks, report status, and proactively identify issues. Demonstrated ability to present information effectively in communications with peers and the project management team. Highly organized; works well in a fast-paced, fluid, and dynamic environment.

What Would Be Great To Have: Experience working in a Scrum development team. Banking and financial services experience. Java certifications.

Thank you for considering employment with Fiserv. Please apply using your legal name and complete the step-by-step profile, attaching your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv.
Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Introduction: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities: As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding on the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams and writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Preferred Education: Master's Degree

Required Technical And Professional Expertise: We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate will have Databricks, Data Lake, and Python programming skills, along with experience deploying to Databricks and familiarity with Azure Data Factory.

Preferred Technical And Professional Experience: Good communication skills. 3+ years of experience with ADF/Databricks/Data Lake. Ability to communicate results to technical and non-technical audiences.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Job Information Date Opened 06/19/2025 Job Type Full time Work Experience 5+ years Industry IT Services City Hyderabad State/Province Telangana Country India Zip/Postal Code 500032 Job Description As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance. Key Responsibilities: 1. Governance Strategy & Stakeholder Alignment Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs. 2. Microsoft Purview Administration & Implementation Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure. Optimize Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake. Schedule scans, set classification jobs, and maintain collection hierarchies. 3. Metadata & Lineage Management Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions. Ensure lineage mapping (ADF Synapse Power BI) and impact analysis. 4. Data Classification & Security Governance Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers. 5. Data Quality & Policy Management Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management. 6. 
Business Glossary & Stewardship Maintain business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality. 7. Automation & Integration Automate governance processes using PowerShell, Azure Functions, Logic Apps. Create pipelines for ingestion, lineage, glossary updates, tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc. 8. Monitoring, Auditing & Compliance Set up dashboards for audit logs, compliance reporting, metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation. Requirements 7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs. Skilled in bridging technical governance with business and compliance goals. Benefits 1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. 2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. 3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. 
Disability Insurance: Offers financial support in case of disability. 4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. 5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. 6. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
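The classification work described in this posting (defining rules that tag PII/PCI before sensitivity labels are applied) can be sketched outside Purview as a tiny rule engine. The patterns and label names below are simplified illustrative assumptions, not Purview's actual built-in classifiers:

```python
import re

# Toy classification rules mapping regex patterns to sensitivity labels.
# A real deployment would rely on Microsoft Purview's built-in and custom
# classifiers; these patterns are deliberately simplified stand-ins.
RULES = {
    "PII.Email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PII.Phone": re.compile(r"\+?\d[\d -]{8,}\d"),
    "PCI.CardNumber": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(sample_values):
    """Return the set of labels whose pattern matches any sampled value."""
    return {
        label
        for label, pattern in RULES.items()
        for value in sample_values
        if pattern.search(value)
    }

print(classify(["alice@example.com", "4111 1111 1111 1111"]))
```

Scanning a column's sampled values this way and attaching the resulting label set is the essence of the "schedule scans, set classification jobs" responsibility; the governance tool then enforces policy (DLP, retention) off those labels.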

Posted 1 week ago

Apply

7.0 years

3 - 6 Lacs

Hyderābād

On-site

GlassDoor logo

As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance. Key Responsibilities: 1. Governance Strategy & Stakeholder Alignment Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs. 2. Microsoft Purview Administration & Implementation Manage Microsoft Purview accounts, collections, and RBAC aligned to org structure. Optimize Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, Snowflake. Schedule scans, set classification jobs, and maintain collection hierarchies. 3. Metadata & Lineage Management Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis. 4. Data Classification & Security Governance Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers. 5. Data Quality & Policy Management Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management. 6. Business Glossary & Stewardship Maintain business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality. 7. 
Automation & Integration Automate governance processes using PowerShell, Azure Functions, Logic Apps. Create pipelines for ingestion, lineage, glossary updates, tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc. 8. Monitoring, Auditing & Compliance Set up dashboards for audit logs, compliance reporting, metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation. Requirements 7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX and related compliance needs. Skilled in bridging technical governance with business and compliance goals. Benefits 1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. 2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. 3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability. 4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. 
Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. 5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. 6. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.

Posted 1 week ago

Apply

15.0 years

3 - 8 Lacs

Hyderābād

On-site

GlassDoor logo

What You Will Do: As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include: 1. Strategy, Framework, and Governance Operating Model Develop and maintain enterprise-wide data governance strategies, standards, and policies. Align governance practices with business goals like regulatory compliance and analytics readiness. Define roles and responsibilities within the governance operating model. Drive governance maturity assessments and lead change management initiatives. 2. Stakeholder Alignment & Organizational Enablement Collaborate across IT, legal, business, and compliance teams to align governance priorities. Define stewardship models and create enablement, training, and communication programs. Conduct onboarding sessions and workshops to promote governance awareness. 3. Architecture Design for Data Governance Platforms Design scalable and modular data governance architecture. Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, Informatica. Ensure integration with metadata, privacy, quality, and policy systems. 4. Microsoft Purview Solution Architecture Lead end-to-end implementation and management of Microsoft Purview. Configure RBAC, collections, metadata scanning, business glossary, and classification rules. Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards. 5. Metadata, Lineage & Glossary Management Architect metadata repositories and ingestion workflows. Ensure end-to-end lineage (ADF → Synapse → Power BI). Define governance over business glossary and approval workflows. 6. Data Classification, Access & Policy Management Define and enforce rules for data classification, access, retention, and sharing. Align with GDPR, HIPAA, CCPA, SOX regulations. Use Microsoft Purview and MIP for policy enforcement automation. 7. 
Data Quality Governance Define KPIs, validation rules, and remediation workflows for enterprise data quality. Design scalable quality frameworks integrated into data pipelines. 8. Compliance, Risk, and Audit Oversight Identify risks and define standards for compliance reporting and audits. Configure usage analytics, alerts, and dashboards for policy enforcement. 9. Automation & Integration Automate governance processes using PowerShell, Azure Functions, Logic Apps, REST APIs. Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms. Requirements 15+ years in data governance and management. Expertise in Microsoft Purview, Informatica, and related platforms. Experience leading end-to-end governance initiatives. Strong understanding of metadata, lineage, policy management, and compliance regulations. Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture. Familiar with Agile methodologies and stakeholder communication. Benefits 1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. 2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. 3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability. 4. 
Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. 5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. 6. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.

Posted 1 week ago

Apply

0 years

6 - 10 Lacs

Bengaluru

On-site

GlassDoor logo

Analyze, design, develop, and debug software programs for commercial or end user applications. Writes code, completes programming and performs testing and debugging of applications. Preferred Qualifications: Oracle Applications Lab (OAL) has a central role within Oracle. Its role is to work with Product Development and Oracle internal business to deliver Oracle products for Oracle to use internally. OAL has a role of implementing Oracle applications, databases and middleware, supporting Oracle applications for Oracle internally and configuring Oracle applications to meet the specific needs of Oracle. OAL also provides a showcase for Oracle's products. The role will involve: Working as part of a global team to implement and support new business applications for HR and Payroll Debugging and solving sophisticated problems and working closely with Oracle Product Development and other groups to implement solutions Developing and implementing product extensions and customizations Testing new releases Providing critical production support Your skills should include: Experience in designing and supporting Oracle E-Business Suite and Fusion applications, preferably Oracle HRMS/Fusion HCM Strong Oracle technical skills: SQL, PL/SQL, Java, XML, ADF, SOA, etc. Communicating confidently with peers and management within technical and business teams Detailed Description and Job Requirements: Work with Oracle's world class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, help analyze existing complex programs and formulate logic for new complex internal systems. Prepare flowcharting, perform coding, and test/debug programs. Develop conversion and system implementation plans. Recommend changes to development, maintenance, and system standards. Job duties are varied and complex using independent judgment. May have project lead role. BS or equivalent experience in programming on enterprise or department servers or systems.

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 10 Lacs

Chennai

On-site

GlassDoor logo

Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key and whether you are an expert in a legacy software system or are fluent in a variety of coding languages you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients. Full-time Entry, Mid, Senior Yes (occasional), Minimal (if any) Responsibilities Requisition ID R-10363786 Date posted 06/20/2025 End Date 06/26/2025 City Chennai State/Region Tamil Nadu Country India Location Type Onsite Calling all innovators – find your future at Fiserv. We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv. Job Title Professional, Software Development Engineering What does a great Software Development Engineer do? As Software Development Engineer your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for developing, unit testing, and integration tasks working within this highly visible-client focused web services application. 
Development efforts will also include feature enhancements, client implementations, and bug fixes as well as support of the production environment. What you will do: Collaborate within a team environment in the development, testing, and support of software development project lifecycles. Develop web interfaces and underlying business logic. Prepare any necessary technical documentation. Track and report daily and weekly activities. Participate in code reviews and code remediation. Perform and develop proper unit tests and automation. Participate in a 24 hour on-call rotation to support previous releases of the product. Research problems discovered by QA or product support and develop solutions to the problems. Perform additional duties as determined by business needs and as directed by management. What you will need to have: Bachelor’s degree in Computer Science, Engineering or Information Technology, or equivalent experience. 3-5 years of experience in developing scalable and secured J2EE applications. Excellent knowledge in Java based technologies (Core Java, JSP, AJAX, JSF, EJB, and Spring Framework), Oracle SQL/PLSQL and App servers like WebLogic, JBOSS. Excellent knowledge in SOAP & REST web service implementations. Knowledge in UNIX environment is preferred. Experience in JSF UI components (Oracle ADF & Rich Faces) technology is preferred. Good analytical, organizational, and problem-solving abilities. Good at prioritizing the tasks and commitment to complete them. Strong team player / customer service orientation. Demonstrated ability to work with both end users and technical staff. Ability to track progress against assigned tasks, report status, and proactively identifies issues. Demonstrate the ability to present information effectively in communications with peers and project management team. Highly Organized and Works well in a fast paced, fluid and dynamic environment. 
What would be great to have: Experience working in a Scrum Development Team Banking and Financial Services experience Java Certifications. Thank you for considering employment with Fiserv. Please: Apply using your legal name Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

7.0 years

1 - 9 Lacs

Noida

On-site

GlassDoor logo

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Functions may include database architecture, engineering, design, optimization, security, and administration; as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning and other similar roles. Responsibilities may include Platform-as-a-Service and Cloud solution with a focus on data stores and associated eco systems. Duties may include management of design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments. Analyzes current business practices, processes and procedures as well as identifying future business opportunities for leveraging data storage and retrieval system capabilities. Manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition. May design schemas, write SQL or other data markup scripting and helps to support development of Analytics and Applications that build on top of data. Selects, develops and evaluates personnel to ensure the efficient operation of the function. 
Primary Responsibilities: Participate in scrum process and deliver stories/features according to the schedule Collaborate with team, architects and product stakeholders to understand the scope and design of a deliverable Participate in product support activities as needed by the team. Understand product architecture, features being built and come up with product improvement ideas and POCs Analyzes and investigates Provides explanations and interpretations within area of expertise Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: 7+ years of Implementation experience on time-critical production projects following key software development practices 5+ years of programming experience in Python or any programming language Tools/Technologies: Programming Languages: Python, PySpark Cloud Technologies: Azure (ADF, Databricks, WebApp, Key vault, SQL Server, function app, logic app, Synapse, Azure Machine Learning, DevOps) DevOps, implementation of Bigdata, Apache Spark and Azure Cloud Experience: Deep experience in Data Analysis, including source data analysis, data profiling and mapping Good experience in building data pipelines using ADF/Azure Databricks Proven hands-on experience with a large-scale data warehouse Hands-on data migration experience from legacy systems to new solutions, such as from on-premises clusters to Cloud Hands-on programming experience in Spark using Scala/Python Large scale 
data processing using PySpark on the Azure ecosystem Implementation of self-service analytics platform ETL framework using PySpark on Azure Expert skills in Azure data processing tools (Azure Data Factory, Azure Databricks) Solid proficiency in SQL and complex queries Ability to learn and adapt to new data technologies Proven problem-solving skills Proven communication skills Preferred Qualifications: Knowledge/Experience on Azure Synapse and Power BI Knowledge on US healthcare industry/Pharmacy data At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission. #Gen
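The "ETL framework" shape this posting describes (ordered, reusable transform steps between extract and load) can be sketched in plain Python; a PySpark/ADF job follows the same structure with DataFrames in place of lists. All step and field names here are illustrative assumptions:

```python
# Minimal ETL-pipeline sketch: composable transform steps applied in order,
# mirroring the extract -> transform -> load shape of an ADF/Databricks job.
class Pipeline:
    def __init__(self):
        self.steps = []

    def step(self, fn):
        """Register a transform; returns fn so it can be used as a decorator."""
        self.steps.append(fn)
        return fn

    def run(self, rows):
        for fn in self.steps:
            rows = fn(rows)
        return rows

etl = Pipeline()

@etl.step
def drop_nulls(rows):
    # Profiling/cleansing stage: discard rows with no amount.
    return [r for r in rows if r.get("amount") is not None]

@etl.step
def to_cents(rows):
    # Conformance stage: normalize currency to integer cents.
    return [{**r, "amount": int(r["amount"] * 100)} for r in rows]

data = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": None}]
print(etl.run(data))  # [{'id': 1, 'amount': 950}]
```

Registering transforms rather than hard-coding them is what makes the framework "self-service": analysts contribute steps without touching the runner.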

Posted 1 week ago

Apply

2.0 years

0 Lacs

India

On-site

Linkedin logo

Job Description: Full Stack Power BI / Data Analytics Trainer KSR Datavizon Pvt Ltd is hiring a Full Stack Power BI Trainer to lead our Data Analytics training program and mentor future business intelligence professionals. 🔹 Responsibilities: Deliver online or offline training sessions on Power BI, SQL, ADF (Azure Data Factory), and Microsoft Fabric. Teach both visualization and data engineering workflows relevant to end-to-end BI project delivery. Help students master data modeling, DAX, ETL concepts, and dashboard design best practices. Design assessments, case studies, and real-time use cases for practice. Keep training modules aligned with current BI industry needs. 🔹 Required Skills: Strong knowledge of Power BI Desktop and Service, DAX, and Power Query. Proficiency in SQL (joins, CTEs, functions). Hands-on experience with Azure Data Factory (ADF) and Microsoft Fabric (Lakehouse, Pipelines). Experience in building real-time dashboards and publishing Power BI reports. Excellent presentation and communication skills. Previous training or mentorship experience is a plus. 📍 Location: Hyderabad (On-site) 🕒 Type: Part-time (Flexible hours) 🧑‍💼 Experience: 2+ years in Data Analytics / BI or Training Join us to shape the careers of the next-gen Power BI professionals!
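The SQL skills this posting asks trainers to teach (joins and CTEs) fit in a small self-contained demo; SQLite stands in for a warehouse here, and the table and column names are invented for the example:

```python
import sqlite3

# Tiny demo of a CTE plus a join, the kind of query pattern a BI trainer
# would walk through. Schema and data are made up for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('North', 100), ('North', 50), ('South', 70);
    CREATE TABLE targets (region TEXT, target INTEGER);
    INSERT INTO targets VALUES ('North', 120), ('South', 90);
""")
query = """
    WITH totals AS (                -- CTE: pre-aggregate sales by region
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT t.region, t.total, g.target, t.total >= g.target AS met
    FROM totals AS t
    JOIN targets AS g ON g.region = t.region   -- join CTE to targets
    ORDER BY t.region
"""
for row in con.execute(query):
    print(row)  # (region, total, target, met)
```

The same totals-vs-targets comparison maps naturally onto a Power BI measure in DAX, which makes it a useful bridging exercise between the SQL and modeling modules.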

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Title: Data Governance Power BI Specialist Experience: 4 – 6 Years Location: Bangalore, Gurgaon, Pune Notice Period: Immediate to 15 Days Job Purpose: Evaluate the data governance framework and Power BI environment, provide recommendations for enhancing data quality and discoverability, and optimize Power BI performance. Key Responsibilities: Review PowerShell (PS), SSIS, Batch Scripts, and C# (.NET 3.0) codebases for data processes Assess complexity of trigger migration across Active Batch (AB), Synapse, ADF, and Azure Databricks (ADB) Define and propose transitions in the use of Azure SQL DW, SQL DB, and Data Lake (DL) Analyze data patterns for optimization, including raw-to-consumption loading and elimination of intermediate zones (e.g., staging/application zones) Understand and implement requirements for external tables (Lakehouse) Ensure the quality of deliverables within project timelines Develop understanding of equity market domain Collaborate with domain experts and stakeholders to define business rules and logic Maintain continuous communication with global stakeholders Troubleshoot complex issues across development, test, UAT, and production environments Coordinate end-to-end project delivery and manage client queries Ensure adherence to SLA/TAT and perform quality checks Work independently as well as collaboratively in cross-functional teams Required Skills and Experience: B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field 7+ years of experience in data and cloud architecture working with client stakeholders Strong knowledge of Power BI, Data Governance, Azure Data Factory, Azure Data Lake, Databricks Experience in reviewing PowerShell, SSIS, Batch Scripts, and .NET-based codebases Familiarity with data optimization patterns and architecture transitions in Azure Project management and team leadership experience within agile environments Strong organizational, analytical, and communication skills Ability to 
deliver high-quality results to internal and external stakeholders

Posted 1 week ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities We are looking for a technical resource for an Oracle Apps R12 financial modules-based application. Below are the main responsibilities of the role: Development activity of Oracle R12.2 release Interact with business users and BA/SA to understand the requirements Prepare the technical specification documents Develop new interfaces, conversions and reports Develop/Customize/personalize new/existing Oracle Forms and OAF pages Perform impact analysis on possible code changes Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications Bachelor’s Degree in Computer Science / Engineering 3+ years of Oracle EBS (Technical) experience with R12 release Development experience in the EBS environment in Reports, Interfaces, Conversions, Extensions, Workflow (RICEW) and Forms deliverables Experience in P2P, Oracle General Ledger (GL), Account Payables (AP), Receivables (AR), Cash Management (CM), Sub-ledger Accounting (SLA), and System Administrator modules Experience of end-user interaction for requirements gathering, understanding customer needs and working with multiple groups to coordinate and carry out technical activities which include new development, maintenance and production support activities Good knowledge of R12 financial table structure Good knowledge of Agile methodologies Good hands-on knowledge of SQL, PL/SQL, Oracle Reports, Oracle Forms, OAF/ADF, BI Publisher reports, shell scripting and Web Services (Integrated SOA Gateway) Oracle APEX knowledge Knowledge of Web Services using Integrated SOA Gateway Proven analytical, performance tuning and debugging skills. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Functions may include database architecture, engineering, design, optimization, security, and administration; as well as data modeling; big data development; Extract, Transform, and Load (ETL) development; storage engineering; data warehousing; data provisioning; and other similar roles. Responsibilities may include Platform-as-a-Service and cloud solutions with a focus on data stores and associated ecosystems. Duties may include management of design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments. The role analyzes current business practices, processes, and procedures, and identifies future business opportunities for leveraging data storage and retrieval system capabilities; manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition; may design schemas, write SQL or other data markup scripting, and help support development of analytics and applications that build on top of the data; and selects, develops, and evaluates personnel to ensure the efficient operation of the function.
Primary Responsibilities
Participate in the scrum process and deliver stories/features according to the schedule
Collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable
Participate in product support activities as needed by the team
Understand the product architecture and features being built, and propose product improvement ideas and POCs
Analyze and investigate issues; provide explanations and interpretations within the area of expertise
Required Qualifications
7+ years of implementation experience on time-critical production projects following key software development practices
5+ years of programming experience in Python or another programming language
Tools/Technologies:
Programming languages: Python, PySpark
Cloud technologies: Azure (ADF, Databricks, Web App, Key Vault, SQL Server, Function App, Logic App, Synapse, Azure Machine Learning, DevOps)
DevOps; implementation of big data, Apache Spark, and Azure cloud
Experience:
Deep experience in data analysis, including source data analysis, data profiling, and mapping
Good experience building data pipelines using ADF/Azure Databricks
Proven hands-on experience with a large-scale data warehouse
Hands-on data migration experience from legacy systems to new solutions, such as from on-premises clusters to the cloud
Hands-on programming experience in Spark using Scala/Python
Large-scale data processing using PySpark on the Azure ecosystem
Implementation of a self-service analytics platform ETL framework using PySpark on Azure
Expert skills in Azure data processing tools (Azure Data Factory, Azure Databricks)
Solid proficiency in SQL and complex queries
Ability to learn and adapt to new data technologies
Proven problem-solving skills
Proven communication skills
Preferred Qualifications
Knowledge of/experience with Azure Synapse and Power BI
Knowledge of the US healthcare industry/pharmacy data
#Gen
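The pipelines this posting describes typically load data incrementally using a watermark. A minimal sketch of that pattern, in plain Python rather than PySpark so it is self-contained; in an actual ADF/Databricks pipeline the filter would be pushed down to the source query, and all field names here are illustrative:

```python
from datetime import datetime

def incremental_batch(rows, watermark):
    """Return rows modified after the last watermark, plus the new watermark.

    `rows` are dicts with a 'modified' timestamp; only rows newer than the
    previous watermark are picked up, and the watermark advances to the
    latest timestamp seen in this batch.
    """
    fresh = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]
batch, wm = incremental_batch(rows, datetime(2024, 1, 2))
# batch contains rows 2 and 3; wm advances to 2024-01-09
```

Persisting `wm` between runs (e.g., in a control table) is what lets the pipeline re-run safely without reprocessing old data.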

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote


We are seeking a highly experienced Data Architect with a strong background in designing scalable data solutions and leading data engineering teams. The ideal candidate will have deep expertise in Microsoft Azure, ETL processes, and modern data architecture principles. This role involves close collaboration with stakeholders, engineering teams, and business units to design and implement robust data pipelines and architectures.
Responsibilities:
Assess existing data components, perform POCs, and consult with stakeholders
Propose end-to-end solutions to an enterprise's data-specific business problems, covering data collection, extraction, integration, cleansing, enrichment, and visualization
Design large data platforms that enable data engineers, analysts, and scientists
Apply strong exposure to different data architectures, data lakes, and data warehouses
Design and implement end-to-end data architecture solutions on the Azure cloud platform
Lead the design and development of scalable ETL/ELT pipelines using tools such as Azure Data Factory (ADF)
Architect data lakes using Azure Data Lake Storage (ADLS) and integrate with Azure Synapse Analytics for enterprise-scale analytics
Collaborate with business analysts, data scientists, and engineers to understand data needs and deliver high-performing solutions
Define data models, metadata standards, data quality rules, and security protocols
Define tools and technologies to develop automated data pipelines, write ETL processes, develop dashboards and reports, and create insights
Continually reassess the current state for alignment with architecture goals, best practices, and business needs
Handle DB modeling, decide on the best data storage, create data flow diagrams, and maintain related documentation
Account for performance, reliability, reusability, resilience, scalability, security, privacy, and data governance while designing a data architecture
Apply or recommend best practices in architecture, coding, API integration, and CI/CD pipelines
Coordinate with data scientists, analysts, and other stakeholders for data-related needs
Help the Data Science & Analytics Practice grow by mentoring junior Practice members, leading initiatives, and leading Data Practice offerings
Provide thought leadership by representing the Practice/Organization on internal and external platforms
Qualifications:
8+ years of experience in data architecture, data engineering, or related roles
Ability to translate business requirements into data requests, reports, and dashboards
Strong database and modeling concepts, with exposure to SQL and NoSQL databases
Expertise in designing and writing ETL processes in Python/PySpark
Strong grasp of data architecture patterns and principles, with the ability to design secure and scalable data lakes, data warehouses, data hubs, and other event-driven architectures
Proven expertise in Microsoft Azure data services, especially ADF, ADLS, and Synapse Analytics
Strong hands-on experience in designing and building ETL/ELT pipelines
Proficiency in data modeling, SQL, and performance tuning
Demonstrated leadership experience, with the ability to manage and mentor technical teams
Excellent communication and stakeholder management skills
Proficiency in data visualization tools such as Tableau, Power BI, or similar to create meaningful insights
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
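The cleansing and enrichment duties listed above can be sketched in plain Python. This is illustrative only; the field names, quality rule, and lookup table are hypothetical, not taken from the posting:

```python
def cleanse(records):
    """Drop records missing a business key and normalize name formatting."""
    out = []
    for r in records:
        if not r.get("customer_id"):
            continue  # quality rule (hypothetical): reject rows without a key
        out.append({**r, "name": r.get("name", "").strip().title()})
    return out

def enrich(records, region_lookup):
    """Attach a region attribute from a reference table (a dict lookup here;
    in a real pipeline this would be a join against a dimension)."""
    return [
        {**r, "region": region_lookup.get(r["customer_id"], "UNKNOWN")}
        for r in records
    ]

recs = cleanse([
    {"customer_id": "C1", "name": "  ada lovelace "},
    {"customer_id": None, "name": "dropped"},
])
enriched = enrich(recs, {"C1": "EMEA"})
```

The same extract-cleanse-enrich shape scales up directly to PySpark DataFrames, where each function becomes a chain of `filter`/`withColumn`/`join` transformations.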
Good to have: Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect). Experience with modern data platforms, data governance frameworks, and real-time data processing tools.
Benefits: Imagine a flexible work environment, whether it's the office, your home, or a blend of both. From interviews to onboarding, we embody a remote-first approach. You will be part of a global team, learning from top talent around the world and across cultures, speaking English every day. Our global workforce enables our team to leverage global resources to accomplish our work in efficient and effective teams. We're big on your well-being: as a company, we spend a whole trimester in our annual cycle focused on well-being. Whether it is taking advantage of fitness offerings, mental health plans (country-dependent), or simply leveraging generous time off, we want all of our team members operating at their best. Our professional services model enables us to accelerate career growth and development opportunities across projects, offerings, and industries. We are an equal opportunity employer. It goes without saying that we live by values like Intrinsic Dignity and Open Collaboration to create cutting-edge technology AND reinforce our commitment to diversity, globally and locally. Join us and be a part of a global tech community! 🌍💼 Check out our LinkedIn site and Careers page to learn more about what it's like to be part of our #oneteam!

Posted 1 week ago

Apply

4.0 - 8.0 years

27 - 32 Lacs

Bengaluru

Work from Office


Skill Set Required:
Strong expertise in DAX modeling (minimum of 6 years of relevant experience)
Hands-on experience with Power BI (reporting and modeling)
Data engineering exposure
Proficiency in SQL and ETL processes
Experience with data warehousing (working on terabyte-scale data)
Familiarity with Azure and related data management tools
Interested candidates can share their updated resume to rolly.martin@thompsonshr.com

Posted 1 week ago

Apply

75.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
About Advance Auto Parts
Founded in Roanoke, VA in 1932, Advance Auto Parts is a leading automotive aftermarket retail parts provider that serves both professional installer and do-it-yourself customers. As of July 13, 2019, Advance operated 4,912 stores and 150 Worldpac branches in the United States, Canada, Puerto Rico, and the U.S. Virgin Islands. The Company also serves 1,250 independently owned CARQUEST-branded stores across these locations, in addition to Mexico, the Bahamas, the Turks and Caicos Islands, and the British Virgin Islands. The company has a workforce of over 70,000 knowledgeable and experienced Team Members who are proud to provide outstanding service to their Customers, Communities, and each other every day.
About AAP Global Capability Centre
We are continually innovating and seeking to elevate the Customer experience at each of our stores. For an organization of our size and reach, it has become more critical than ever to identify synergies and build shared capabilities. The AAP Global Capability Center, located in Hyderabad, is a step in this strategic direction that enables us to access a larger talent pool, unlock operational efficiencies, and increase levels of collaboration.
About Information Technology
At Advance Auto Parts, the IT organization is embracing the digitization of retail and working to transform our organization into a leader in the modern age of retail. We are leading the way with DevOps, thinking cloud-first, and adopting modern approaches to infrastructure management. We realize Agile is more than a manifesto, and that applications need to be portable, event- and service-oriented, and support a data analytics and data-first culture of the modern business. We are taking action to transform a 75-year-old company into an industry leader in building a best-in-class omnichannel experience for its customers.
Software Developer, Level 9
Job Summary
As a seasoned Software Developer at Level 9, you will apply expertise in Azure cloud services, API services, and database management, together with a blend of data analytics skills, domain knowledge, and technical prowess, to translate data into actionable business insights. You will design, develop, and optimize cloud-based solutions, ensuring performance, scalability, and security, with an in-depth understanding of technical design and frameworks. In this capacity, you possess the technical skills required to build functionality, troubleshoot issues, and act as the primary point of contact for interactions with business stakeholders, external partners, and internal collaborators.
Essential Duties and Responsibilities
Azure Cloud Engineering:
Manage and administer Microsoft Azure services, including provisioning, performance monitoring, security, and governance
Design and implement data pipelines for ingesting, transforming, and integrating data from various sources (MS SQL, DB2, APIs, Kafka, external vendor files, etc.)
Ensure data integrity, identify inconsistencies, and oversee successful data releases in cloud environments
Develop strategies to optimize Azure cloud architecture, ensuring efficiency, security, and cost-effectiveness
API & Integration:
Build, consume, and maintain RESTful APIs and services using Postman and related tools
Work on microservices architectures, ensuring seamless data flow across integrated applications
Utilize Azure Linked Servers and other cloud-native database services
Develop and optimize MS SQL Server databases, including complex queries, stored procedures, data modeling, and tuning
Implement data warehousing principles (e.g., Slowly Changing Dimensions, facts vs. dimensions)
Maintain applications with a focus on scalability, operational efficiency, and troubleshooting production issues
Collaboration & Process Improvement:
Work closely with stakeholders, project managers, and cross-functional teams to understand business needs and deliver solutions
Identify and implement process improvements for data integration, governance, and cloud operations
Provide mentorship and technical guidance to junior developers
Investigate and resolve system issues across multiple platforms
Participate in an on-call rotation to support production systems as needed
Adapt to shifting priorities in a dynamic work environment
Required Qualifications
Technical Skills:
Bachelor's degree in Computer Science, Engineering, or a related field
7+ years of experience in Azure cloud engineering, data integration, and pipeline development
Strong expertise in Azure Data Factory (ADF), Databricks, Azure Pipelines, and related cloud services
Hands-on experience with REST APIs, Postman, and JSON-based integrations
Proficiency in MS SQL Server, database modeling, and performance optimization
Familiarity with CI/CD tools (Azure DevOps, Jenkins, Git, etc.)
Familiarity with Power BI, VS Code, and SQL Server permissions for ETL/reporting
Strong background in Agile/Scrum methodologies for project execution
Demonstrated knowledge in building, debugging, and maintaining enterprise cloud applications
Soft Skills:
Excellent problem-solving and analytical skills with attention to detail
Strong collaboration, communication, and stakeholder management abilities
Ability to work independently and lead cross-functional teams
Proven track record of meeting deadlines and adapting to dynamic priorities
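The duties above mention Slowly Changing Dimensions. A minimal, illustrative SCD Type 2 upsert in plain Python (in practice this would usually be a `MERGE` statement in SQL Server or Databricks); all field names here are made up:

```python
from datetime import date

def scd2_apply(dimension, incoming, today):
    """Apply SCD Type 2: expire changed rows, append new current versions.

    `dimension` rows look like {"key", "attr", "valid_from", "valid_to"},
    with valid_to None marking the current version of each key.
    """
    current = {r["key"]: r for r in dimension if r["valid_to"] is None}
    for row in incoming:
        old = current.get(row["key"])
        if old is None:
            # brand-new key: insert as the current version
            dimension.append({**row, "valid_from": today, "valid_to": None})
        elif old["attr"] != row["attr"]:
            # attribute changed: close out the old version, open a new one
            old["valid_to"] = today
            dimension.append({**row, "valid_from": today, "valid_to": None})
        # unchanged rows are left alone, preserving history
    return dimension

dim = [{"key": 1, "attr": "A", "valid_from": date(2023, 1, 1), "valid_to": None}]
dim = scd2_apply(dim, [{"key": 1, "attr": "B"}, {"key": 2, "attr": "C"}],
                 date(2024, 6, 1))
```

After the call, key 1 has two versions (one expired, one current) and key 2 has one, which is exactly the history-preserving behavior Type 2 is chosen for over Type 1 overwrites.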
California Residents Click Below For Privacy Notice https://jobs.advanceautoparts.com/us/en/disclosures We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, status as a veteran, disability, or any other federal, state or local protected class.

Posted 1 week ago

Apply

12.0 - 22.0 years

20 - 35 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role & Responsibilities
Lead (hands-on) a team of Data Engineers
Good communication and strong technical design decision-making
Strong experience in ETL/data warehousing
Strong experience in Databricks, Unity Catalog, Medallion Architecture, data lineage, PySpark, and CTEs
Good experience in data analysis
Strong experience in SQL queries, stored procedures, views, functions, and UDFs (user-defined functions)
Experience in Azure Cloud: ADF, Storage & Containers, Azure DB, Azure Data Lake
Experience in SQL Server
Experience in data migration and production support
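Since the role calls out CTEs (common table expressions), here is a minimal, self-contained CTE example using Python's built-in sqlite3; the table and figures are invented for illustration:

```python
import sqlite3

# In-memory database with a toy loans table (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE loans (branch TEXT, amount INTEGER);
INSERT INTO loans VALUES ('Pune', 100), ('Pune', 50), ('Chennai', 80);
""")

# The CTE names an intermediate aggregation so the outer query can
# filter on it, instead of nesting a subquery inline.
query = """
WITH branch_totals AS (
    SELECT branch, SUM(amount) AS total
    FROM loans
    GROUP BY branch
)
SELECT branch, total
FROM branch_totals
WHERE total > 75
ORDER BY branch;
"""
rows = conn.execute(query).fetchall()
```

The same `WITH ... AS` syntax carries over to SQL Server and Databricks SQL; CTEs mainly improve readability and let one intermediate result be referenced in several places.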

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Title: Senior Data Engineer
Location: Hyderabad, Chennai & Bangalore
Experience: 5+ Years
Job Summary
We are looking for a highly skilled and experienced Senior Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building and optimizing data pipelines and data architecture, as well as experience with big data technologies and Azure cloud services. You will work closely with cross-functional teams to ensure data is accessible, reliable, and ready for analytics and business insights.
Mandatory Skills
Advanced SQL
Python or Scala for data engineering
ETL pipeline development
Cloud platforms (AWS/GCP/Azure)
Azure first-party services (ADF, Azure Databricks, Synapse, etc.)
Big data tools (Spark, Hadoop)
Data warehousing (Redshift, Snowflake, BigQuery)
Workflow orchestration tools (Airflow, Prefect, or similar)
Key Responsibilities
Design, develop, and maintain scalable and reliable data pipelines
Demonstrate experience and leadership across two full project cycles using Azure Data Factory, Azure Databricks, and PySpark
Collaborate with data analysts, scientists, and software engineers to understand data needs
Implement data quality checks and monitoring systems
Optimize data delivery and processing across a wide range of sources and formats
Ensure security and governance policies are followed in all data handling processes
Evaluate and recommend tools and technologies to improve data engineering capabilities
Lead and mentor junior data engineers as needed
Work with cross-functional teams in a dynamic and fast-paced environment
Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
Certification in cloud platforms or big data technologies (preferred)
5+ years of data engineering experience
Technical Skills
Programming: Python, Scala, SQL
Big data: Spark, Hadoop, Hive
Stream processing: Storm, Spark Streaming
Data warehousing: Snowflake, BigQuery, Redshift
Cloud: AWS (S3, Lambda, Glue), GCP, Azure (ADF, Azure Databricks)
Orchestration: Apache Airflow, Prefect, Luigi
Databases: PostgreSQL, MySQL, NoSQL (MongoDB, Cassandra)
Tools: Git, Docker, Kubernetes (basic), CI/CD
Soft Skills
Strong problem-solving and analytical thinking
Excellent verbal and written communication
Ability to manage multiple tasks and deadlines
Collaborative mindset with a proactive attitude
Strong analytical skills related to working with unstructured datasets
Good to Have
Experience with real-time data processing (Kafka, Flink)
Knowledge of data governance and privacy regulations (GDPR, HIPAA)
Familiarity with ML model data pipeline integration
Work Experience
Minimum 5 years of relevant experience in data engineering roles
Experience with Azure first-party services across at least two full project lifecycles
Compensation & Benefits
Competitive salary and annual performance-based bonuses
Comprehensive health and optional parental insurance
Retirement savings and tax savings plans
Work-life balance: flexible work hours
Key Result Areas (KRAs)
Timely development and delivery of high-quality data pipelines
Implementation of scalable data architectures
Collaboration with cross-functional teams for data initiatives
Compliance with data security and governance standards
Key Performance Indicators (KPIs)
Uptime and performance of data pipelines
Reduction in data processing time
Number of critical bugs post-deployment
Stakeholder satisfaction scores
Successful data integrations and migrations
Contact: hr@bigtappanalytics.com
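The responsibilities above include implementing data quality checks. A tiny, illustrative gate function in plain Python (real pipelines would typically use a framework such as Great Expectations or Delta Lake constraints); the rule names and fields are hypothetical:

```python
def run_quality_checks(rows, required, unique_key):
    """Return a dict of rule name -> list of failing row indexes.

    Two sample rules: required fields must be present and non-empty,
    and the business key must be unique across the batch.
    """
    failures = {"missing_required": [], "duplicate_key": []}
    seen = set()
    for i, r in enumerate(rows):
        if any(r.get(field) in (None, "") for field in required):
            failures["missing_required"].append(i)
        key = r.get(unique_key)
        if key in seen:
            failures["duplicate_key"].append(i)
        seen.add(key)
    return failures

rows = [
    {"id": 1, "name": "a"},
    {"id": 1, "name": ""},   # fails both rules
    {"id": 2, "name": "b"},
]
report = run_quality_checks(rows, required=["name"], unique_key="id")
```

A pipeline stage would inspect `report` and either quarantine the failing rows or fail the run, which is the gate pattern the monitoring-systems bullet refers to.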

Posted 1 week ago

Apply