5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Information
Date Opened: 06/19/2025 | Job Type: Full time | Work Experience: 5+ years | Industry: IT Services | City: Hyderabad | State/Province: Telangana | Country: India | Zip/Postal Code: 500032

Job Description
As a Data Governance Lead at Kanerika, you will be responsible for defining, leading, and operationalizing the data governance framework, ensuring enterprise-wide alignment and regulatory compliance.

Key Responsibilities:
1. Governance Strategy & Stakeholder Alignment: Develop and maintain enterprise data governance strategies, policies, and standards. Align governance with business goals: compliance, analytics, and decision-making. Collaborate across business, IT, legal, and compliance teams for role alignment. Drive governance training, awareness, and change management programs.
2. Microsoft Purview Administration & Implementation: Manage Microsoft Purview accounts, collections, and RBAC aligned to the org structure. Optimize the Purview setup for large-scale environments (50TB+). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake. Schedule scans, set up classification jobs, and maintain collection hierarchies (a hedged scan-trigger sketch follows this posting).
3. Metadata & Lineage Management: Design metadata repositories and maintain business glossaries and data dictionaries. Implement ingestion workflows via ADF, REST APIs, PowerShell, and Azure Functions. Ensure lineage mapping (ADF → Synapse → Power BI) and impact analysis.
4. Data Classification & Security Governance: Define classification rules and sensitivity labels (PII, PCI, PHI). Integrate with MIP, DLP, Insider Risk Management, and Compliance Manager. Enforce records management, lifecycle policies, and information barriers.
5. Data Quality & Policy Management: Define KPIs and dashboards to monitor data quality across domains. Collaborate on rule design, remediation workflows, and exception handling. Ensure policy compliance (GDPR, HIPAA, CCPA, etc.) and risk management.
6. Business Glossary & Stewardship: Maintain the business glossary with domain owners and stewards in Purview. Enforce approval workflows, standard naming, and steward responsibilities. Conduct metadata audits for glossary and asset documentation quality.
7. Automation & Integration: Automate governance processes using PowerShell, Azure Functions, and Logic Apps. Create pipelines for ingestion, lineage, glossary updates, and tagging. Integrate with Power BI, Azure Monitor, Synapse Link, Collibra, BigID, etc.
8. Monitoring, Auditing & Compliance: Set up dashboards for audit logs, compliance reporting, and metadata coverage. Oversee data lifecycle management across its phases. Support internal and external audit readiness with proper documentation.

Requirements
7+ years of experience in data governance and data management. Proficient in Microsoft Purview and Informatica data governance tools. Strong in metadata management, lineage mapping, classification, and security. Experience with ADF, REST APIs, Talend, dbt, and automation via Azure tools. Knowledge of GDPR, CCPA, HIPAA, SOX, and related compliance needs. Skilled in bridging technical governance with business and compliance goals.

Benefits
1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits: L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
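As referenced in the Purview responsibilities above, here is a minimal, hedged Python sketch of triggering a Purview scan run. The account, data source, and scan names are placeholders, and the REST path and api-version are assumptions that should be verified against the current Purview scanning API reference before use.

```python
# Hedged sketch: trigger a Microsoft Purview scan run via the scanning REST API.
# Account, data source, scan name, and api-version are assumptions.
import uuid
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

ACCOUNT = "contoso-purview"      # hypothetical Purview account name
DATA_SOURCE = "adls-raw"         # hypothetical registered data source
SCAN_NAME = "weekly-full-scan"   # hypothetical scan definition

# Acquire a token for the Purview data plane.
credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

run_id = str(uuid.uuid4())
url = (
    f"https://{ACCOUNT}.purview.azure.com/scan/datasources/{DATA_SOURCE}"
    f"/scans/{SCAN_NAME}/runs/{run_id}?api-version=2022-02-01-preview"
)
resp = requests.put(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print("Scan run submitted:", resp.json())  # response shape may vary by API version
```

In practice such a script would be wrapped in an Azure Function or Logic App on a schedule, which is the automation pattern the posting describes.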
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Information
Date Opened: 06/19/2025 | Job Type: Full time | Work Experience: 5+ years | Industry: IT Services | City: Hyderabad | State/Province: Telangana | Country: India | Zip/Postal Code: 500032

Job Description
What You Will Do: As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:
1. Strategy, Framework, and Governance Operating Model: Develop and maintain enterprise-wide data governance strategies, standards, and policies. Align governance practices with business goals like regulatory compliance and analytics readiness. Define roles and responsibilities within the governance operating model. Drive governance maturity assessments and lead change management initiatives.
2. Stakeholder Alignment & Organizational Enablement: Collaborate across IT, legal, business, and compliance teams to align governance priorities. Define stewardship models and create enablement, training, and communication programs. Conduct onboarding sessions and workshops to promote governance awareness.
3. Architecture Design for Data Governance Platforms: Design scalable and modular data governance architecture. Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica. Ensure integration with metadata, privacy, quality, and policy systems.
4. Microsoft Purview Solution Architecture: Lead end-to-end implementation and management of Microsoft Purview. Configure RBAC, collections, metadata scanning, the business glossary, and classification rules. Implement sensitivity labels, insider risk controls, retention, the data map, and audit dashboards.
5. Metadata, Lineage & Glossary Management: Architect metadata repositories and ingestion workflows. Ensure end-to-end lineage (ADF → Synapse → Power BI). Define governance over the business glossary and approval workflows (a hedged glossary-import sketch follows this posting).
6. Data Classification, Access & Policy Management: Define and enforce rules for data classification, access, retention, and sharing. Align with GDPR, HIPAA, CCPA, and SOX regulations. Use Microsoft Purview and MIP for policy enforcement automation.
7. Data Quality Governance: Define KPIs, validation rules, and remediation workflows for enterprise data quality. Design scalable quality frameworks integrated into data pipelines.
8. Compliance, Risk, and Audit Oversight: Identify risks and define standards for compliance reporting and audits. Configure usage analytics, alerts, and dashboards for policy enforcement.
9. Automation & Integration: Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs. Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Requirements
15+ years in data governance and management. Expertise in Microsoft Purview, Informatica, and related platforms. Experience leading end-to-end governance initiatives. Strong understanding of metadata, lineage, policy management, and compliance regulations. Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture. Familiarity with Agile methodologies and stakeholder communication.

Benefits
1. Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits: L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
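As referenced in the glossary-management responsibilities above, this is a small, generic sketch (deliberately not tied to any specific Purview SDK call) that turns a stewardship team's glossary spreadsheet into the kind of JSON payload a bulk term-import API typically expects. The column names and output shape are assumptions for illustration.

```python
# Hedged sketch: build a bulk glossary-import payload from a stewards' CSV.
# Column names ("term", "definition", "status", "steward_email") are assumed.
import csv
import json

def build_glossary_payload(csv_path: str) -> list[dict]:
    """Read term rows and emit term dicts ready for a bulk-import call."""
    terms = []
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            terms.append({
                "name": row["term"].strip(),
                "definition": row["definition"].strip(),
                "status": row.get("status", "Draft"),
                "expert": row.get("steward_email", ""),
            })
    return terms

if __name__ == "__main__":
    payload = build_glossary_payload("glossary.csv")
    print(json.dumps(payload[:2], indent=2))  # preview the first two terms
```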
Posted 1 week ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Cloud Visual Builder
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Analyze requirements, determine the technical level of effort, and prepare technical designs and specifications.
- Conversant in deploying, troubleshooting, analyzing, and resolving technical problems.
- Hands-on SQL query writing.
- Conduct design reviews to provide guidance and quality assurance around best practices and frameworks.

Professional & Technical Skills:
- Overall 4+ years of experience in web app development (Oracle ADF).
- 2 to 3 years of experience in Oracle VBCS (Visual Builder Cloud Service).
- Good hands-on knowledge of JavaScript, CSS3, XML/JSON/WSDL, consuming web services (SOAP/REST), and testing tools (Postman/SoapUI/JMeter).
- Experience building different types of applications in VBCS using Business Objects and ORDS (a hedged REST-call sketch follows this posting).
- Knowledge and experience in integration with other Oracle PaaS services.
- Experience integrating VBCS applications with Oracle SaaS applications.
- Work experience developing SaaS extensions using VBCS.
- Experience with web service technologies such as WSDL/XML/SOAP/REST/JSON standards.
- Knowledge of Oracle Database and PL/SQL.
- Experience with GitHub, Oracle Developer Cloud, and UCD tools for build and deployment.
- Good communication and interpersonal skills; good analytical and debugging skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Cloud Visual Builder.
- A 15-year full-time education is required.
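As referenced above, here is a hedged sketch of reading rows from a VBCS business object through its auto-generated REST endpoint. The host, app name, and business object name are assumptions; the actual endpoint URL is visible in the VBCS designer under the business object's Endpoints tab.

```python
# Hedged sketch: query a VBCS business object's REST endpoint.
# URL pattern, app name, object name, and credentials are placeholders.
import requests

BASE = "https://myinstance.builder.ocp.oraclecloud.com"  # hypothetical host
URL = f"{BASE}/ic/builder/rt/hrapp/live/resources/data/Employee"  # assumed path

resp = requests.get(
    URL,
    auth=("integration.user", "app-password"),  # placeholder credentials
    params={"limit": 5, "fields": "id,name"},   # standard VBCS query params
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item["id"], item["name"])
```

The same endpoint pattern supports POST/PATCH for create and update, which is how SaaS extensions typically write back to business objects.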
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Internal Firm Services
Industry/Sector: Not Applicable
Specialism: Operations
Management Level: Associate

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a business application consulting generalist at PwC, you will provide consulting services for a wide range of business applications. You will leverage a broad understanding of various software solutions to assist clients in optimising operational efficiency through analysis, implementation, training, and support.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: We are seeking a highly skilled IT Technical Operations Specialist to join our BAO Technical Operations team in an L3 support role. The ideal candidate will be quick to learn the organization's processes, procedures, and policies, eager to expand their skill set, and energetic when dealing with new, complex technologies.

Responsibilities: The candidate will support a number of cloud-based applications and infrastructure in GCP as well as Azure. The main technologies expected to be utilized in this role include Google Cloud Platform, Postgres, MongoDB, and Azure DevOps pipelines. Experience with database query languages is strongly desired (SQL, Postgres, MongoDB); hands-on MongoDB administration knowledge would be a bonus (a hedged health-check sketch follows this posting). Experience with development methodologies and release lifecycles. Proficiency with scripting languages is preferred (PowerShell, Bash), along with hands-on CI/CD experience. Experience with Azure DevOps for release management is ideal. Knowledge of ADF, Power BI, and Power Apps is a good-to-have skill. Experience with agile development practices is a bonus. Experience with Kubernetes, containers, and container orchestration. Experience with Google Cloud Platform is a bonus. Strong communication skills and the ability to work with global team members on a rotation.
This role may participate in an on-call rotation that may include weekends and is primarily for an early-morning IST shift.

Mandatory skill sets:
- Support our L2 and customer users on issues, as well as our global teams on the health of the environment.
- Collaborate with cross-functional teams to design, implement, and maintain IT infrastructure solutions in alignment with business objectives.
- Develop procedures to help support the environments, building SOPs to leverage global teams and L2 resources.
- Enhance common processes and support procedures with scripts and automations, and guide development teams in finding resolutions when needed.
- Support releases of new versions as well as the building of new environments.
- Proactively work with the team and our monitoring solutions in order to stay ahead of issues and maintenance of the environments.
- Analyze logs and metrics using Splunk to identify and address potential issues.
- Stay up to date with emerging technologies and industry trends to drive innovation and enhance operational efficiency.

Preferred skill sets: MongoDB, SQL
Years of experience required: 3+ years
Education qualification: Bachelor's degree in Information Technology
Degrees/Field of Study required: Bachelor of Engineering. Degrees/Field of Study preferred: (not specified)
Certifications (if blank, certifications not specified)
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Communication, Documentation Development, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Optimism, Performance Assessment, Performance Management Software, Problem Solving, Product Management, Product Operations, Project Delivery {+ 11 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date: (not specified)
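As referenced above, a minimal sketch of the kind of routine MongoDB health check an L3 operations engineer might script. The connection string, database, collection, and field names are placeholders, and the alert threshold is purely illustrative.

```python
# Hedged sketch: count recent ERROR-level log documents in MongoDB.
# Connection string, names, and threshold are placeholders.
from datetime import datetime, timedelta, timezone
from pymongo import MongoClient  # pip install pymongo

client = MongoClient("mongodb://ops-user:secret@db-host:27017/")  # placeholder
db = client["appdb"]

# Count error-level log documents written in the last hour.
cutoff = datetime.now(timezone.utc) - timedelta(hours=1)
errors = db["app_logs"].count_documents(
    {"level": "ERROR", "ts": {"$gte": cutoff}}
)
print(f"ERROR log entries in the last hour: {errors}")
if errors > 100:  # arbitrary illustrative threshold
    print("Threshold exceeded - page the on-call engineer.")
```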
Posted 1 week ago
7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Consultant / Senior Consultant – Azure Data Engineering
Location: India – Gurgaon preferred
Industry: Insurance Analytics & AI Vertical

Role Overview: We are seeking a hands-on Consultant / Senior Consultant with strong expertise in Azure-based data engineering to support end-to-end development and delivery of data pipelines for our insurance clients. The ideal candidate will have a deep understanding of Azure Data Factory, ADLS, Databricks (preferably with DLT and Unity Catalog), SQL, and Python, and be comfortable working in a dynamic, client-facing environment. This is a key offshore role requiring both technical execution and solution-oriented thinking to support modern data platform initiatives. You will collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business requirements; demonstrate decision-making, analytical, and problem-solving abilities; bring strong verbal and written communication skills to manage client discussions; and be familiar with Agile methodologies (daily scrum, sprint planning, backlog refinement).

Key Responsibilities & Skillsets:
- Design and develop scalable and efficient data pipelines using Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).
- Build and maintain Databricks notebooks for data ingestion, transformation, and quality checks, using Python and SQL (a hedged notebook sketch follows this posting).
- Work with Delta Live Tables (DLT) and Unity Catalog (preferred) to improve pipeline automation, governance, and performance.
- Collaborate with data architects, analysts, and onshore teams to translate business requirements into technical specifications.
- Troubleshoot data issues, ensure data accuracy, and apply best practices in data engineering and DevOps.
- Support the migration of legacy SQL pipelines to modern Python-based frameworks.
- Ensure adherence to data security, compliance, and performance standards, especially within insurance domain constraints.
- Provide documentation, status updates, and technical insights to stakeholders as required.
- Excellent communication skills and stakeholder management.

Required Skills & Experience:
- 3–7 years of strong hands-on experience in data engineering with a focus on Azure cloud technologies.
- Proficient in Azure Data Factory, Databricks, and ADLS Gen2, with working knowledge of Unity Catalog.
- Strong programming skills in both SQL and Python, especially within Databricks notebooks. PySpark expertise is good to have.
- Experience with Delta Lake / Delta Live Tables (DLT) is a plus.
- Good understanding of ETL/ELT concepts, data modeling, and performance tuning.
- Exposure to insurance or financial services data projects is highly preferred.
- Strong communication and collaboration skills in an offshore delivery model.

Preferred Skills & Experience:
- Experience working in Agile/Scrum teams.
- Familiarity with Azure DevOps, Git, and CI/CD practices.
- Certifications in Azure Data Engineering (e.g., DP-203) or Databricks.
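As referenced above, a minimal Databricks-style sketch of an ingestion step with a simple data-quality gate. Paths, column names, the quality threshold, and the target table name are all assumptions; `spark` is the session object a Databricks notebook provides.

```python
# Hedged sketch: ingest raw JSON, apply a quality gate, append to a Delta table.
# Paths, columns, threshold, and table name are placeholders.
from pyspark.sql import functions as F

raw = (spark.read.format("json")   # `spark` is provided by the notebook runtime
       .load("abfss://raw@mylake.dfs.core.windows.net/claims/"))  # hypothetical path

clean = (raw
         .filter(F.col("claim_id").isNotNull())        # drop rows missing the key
         .withColumn("ingest_date", F.current_date())) # audit column

# Fail fast if the null-key ratio is suspiciously high.
total, kept = raw.count(), clean.count()
if total and kept / total < 0.95:  # illustrative 95% threshold
    raise ValueError(f"Data quality gate failed: kept {kept}/{total} rows")

(clean.write.format("delta")
      .mode("append")
      .saveAsTable("insurance.bronze_claims"))  # hypothetical Unity Catalog table
```

In a DLT pipeline the same gate would usually be expressed declaratively with an expectation rather than a manual count check.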
Posted 1 week ago
5.0 - 10.0 years
15 - 20 Lacs
Pune
Work from Office
AZURE DATA ENGINEER
Skills: Strong technical experience in Azure, SQL, Azure Data Factory, ETL, Databricks. Graduation is a must.
Experience: 5-10 years
CTC: Up to 14-20 LPA
21st June: F2F interview only (Pune)
Contact: 7742324144
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Before applying for a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the world's 500 largest companies. Explore innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job Description
Job Title: Senior Business Analyst
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time

About UPS: UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™: The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role: We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with an Azure SQL database. Additionally, the data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB. The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery.

Key Responsibilities
Requirements Elicitation and Analysis: Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis. Analyze complex data flows and business processes to define clear and concise requirements. Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components.
Business Process Design and Improvement: Define and document business processes, workflows, and data models. Identify areas for process optimization and automation within web and data solutions. Collaborate with stakeholders to design solutions that align with business objectives.
Stakeholder Communication and Collaboration: Serve as a liaison between business stakeholders, development teams, and data engineering teams.
Facilitate communication and collaboration to ensure stakeholder alignment and understanding. Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions.
Solution Validation and Quality Assurance: Ensure requirements traceability throughout the project lifecycle. Validate and test solutions to ensure they meet business needs and objectives. Collaborate with QA teams to define testing strategies and acceptance criteria.

Primary Skills
Business Analysis: Requirement gathering, process modeling, and gap analysis.
Documentation: User stories, functional specifications, and acceptance criteria.
Agile Methodologies: Experience in Agile/Scrum environments.
Stakeholder Management: Effective communication and collaboration with cross-functional teams.
Data Analysis: Ability to analyze and interpret complex data flows and business processes.

Secondary Skills
Cloud Platform: Familiarity with Microsoft Azure services.
Data Engineering: Understanding of data pipelines, ETL processes, and data modeling.
UX/UI Collaboration: Experience collaborating with UX/UI teams for optimal user experience.
Communication Skills: Excellent verbal and written communication for stakeholder engagement.

Soft Skills
Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.

Educational And Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications such as: Certified Business Analysis Professional (CBAP), PMI Professional in Business Analysis (PMI-PBA), Microsoft Certified: Azure Fundamentals. Experience in cloud-native solutions and microservices architecture. Familiarity with Angular and .NET frameworks for web applications.

About The Team
As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.

Contract Type: Permanent (CDI)
At UPS, equal opportunities, fair treatment, and an inclusive work environment are key values to which we are committed.
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Microsoft Sustainability Manager Senior Developer – Consulting
As a developer working in the GDS Consulting team within the Digital & Emerging team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale in the Microsoft Cloud for Sustainability industry cloud. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients.

Your Key Responsibilities
- Design and build Model Driven Apps for a variety of business needs, ensuring efficient data models, logical relationships, and optimized user interfaces.
- Design and develop Model Driven Apps (MDAs) focused on sustainability initiatives, such as carbon footprint tracking, resource management, and supply chain optimization.
- Configure and customize Microsoft Sustainability Manager (MSM) solutions to meet specific client needs and industry challenges.
- Design and build engaging dashboards and reports in Power BI to visualize sustainability data and track progress towards goals.
- Develop and maintain KPI models to measure and track key performance indicators for our sustainability initiatives.
- Collaborate with data analysts, scientists, and other stakeholders to understand complex data models and ensure accurate and reliable data visualization.
- Stay updated on the latest trends and technologies in sustainable software development and apply them to our solutions.
- Understanding of the Microsoft Cloud for Sustainability Common Data Model.

Skills And Attributes For Success
- Proven experience as a Microsoft Cloud for Sustainability industry cloud developer or in an equivalent development role, with a strong focus on Model Driven Apps within the Microsoft Power Platform and Azure.
- In-depth understanding of data modelling principles and experience designing efficient data models in Microsoft Dataverse (a hedged Dataverse Web API sketch follows this posting).
- Experience in Power Platform Core (Dataverse/CDS, Canvas Apps, Model Driven Apps, Custom Pages, Power Portals/Power Pages) and Dynamics CRM / 365.
- Strong coding experience in Model Driven App development, including plugin development, PCF components, ribbon customization, FetchXML, and XRM APIs.
- Strong, proven experience with Power Automate and an efficiency/performance-driven solution approach.
- Strong, proven experience in creating custom forms with validations using JavaScript.
- Experience in developing PCF components is an added advantage.
- Expertise in building user interfaces using the Model Driven App canvas and customizing forms, views, and dashboards.
- Proficiency in Power Automate for workflow automation and logic implementation.
- Experience in designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, and Data Lake.
- Experience with integration techniques, including connectors and custom APIs (Application Program Interfaces).
- Experience in Power BI, including advanced functions and DAX scripting, advanced Power Query, and data modelling on the CDM.
- Experience in Power Fx is an added advantage.
- Strong knowledge of Azure DevOps and CI/CD pipelines, including setup for automated build and release management.
- Experience leading teams to execute high-quality deliverables within stipulated timelines.
- Excellent written and communication skills.
- Ability to deliver technical demonstrations.
- Quick learner with a "can do" attitude.
- Demonstrating and applying strong project management skills, inspiring teamwork and responsibility with engagement team members.

To qualify for the role, you must have:
- A bachelor's or master's degree.
- A minimum of 7-10 years of experience, preferably with a background in a professional services firm.
- Excellent communication skills; consulting experience preferred.

Ideally, you will also have:
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- The ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
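As referenced above, a hedged sketch of reading records from Dataverse (the store behind Model Driven Apps and Sustainability Manager) via the Web API. Token acquisition is elided, and the table's entity set name (`msdyn_emissions`) is an assumption; check the actual logical name in the Dataverse table designer before use.

```python
# Hedged sketch: query a Dataverse table through the Web API.
# Environment URL, token, and entity set name are placeholders/assumptions.
import requests

ORG = "https://contoso.crm.dynamics.com"   # hypothetical environment URL
TOKEN = "<bearer token from MSAL>"          # acquire via the msal library in real code

resp = requests.get(
    f"{ORG}/api/data/v9.2/msdyn_emissions",  # assumed entity set name
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "OData-MaxVersion": "4.0",
        "Accept": "application/json",
    },
    params={"$top": "5"},  # standard OData paging parameter
    timeout=30,
)
resp.raise_for_status()
for rec in resp.json().get("value", []):
    print(rec)
```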
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Exp: 5-12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and dbt for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis (a hedged query sketch follows this posting).
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using dbt. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and dbt.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, dbt, pipelines, data warehousing, DWH
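As referenced above, a minimal sketch of running an analytical query against Snowflake from Python, the way a developer on this team might validate a pipeline output. Account, credentials, warehouse, and table names are placeholders.

```python
# Hedged sketch: run a validation query against Snowflake from Python.
# Connection details and the fact table name are placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # placeholder account locator
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="DWH",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT customer_id, COUNT(*) AS orders
        FROM fct_orders          -- hypothetical fact table
        WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
        GROUP BY customer_id
        ORDER BY orders DESC
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```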
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Exp: 5-12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and dbt for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using dbt. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and dbt.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, dbt, pipelines, data warehousing, DWH
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Title: Business Analyst and Business Intelligence Developer (Digital Solution Team) - Husky (India), Chennai
Id: 20036
Type: Full Time
Location: Chennai, India

Job Purpose
The DST Business Analyst and Business Intelligence Developer for Husky will be responsible for building the business intelligence system for the company, based on the internal and external data structures. Responsible for leading the design and support of enterprise-wide business intelligence applications and architecture. Works with enterprise-wide business and IT senior management to understand and prioritize data and information requirements. Solves complex technical problems. Optimizes the performance of enterprise business intelligence tools by defining data elements which contribute to data insights that add value to the user. Creates testing methodology and criteria. Designs and coordinates a curriculum for coaching and training customers in the use of business intelligence tools to enhance business decision-making capability. Develops standards, policies, and procedures for the form, structure, and attributes of the business intelligence tools and systems. Develops data/information quality metrics. Researches new technology and develops business cases to support enterprise-wide business intelligence solutions.

Key Responsibilities & Key Success Metrics
- Leading BI software development, deployment, and maintenance.
- Perform data profiling and data analysis activities to understand data sources.
- Report curation, template definition, and analytical data modeling.
- Work with cross-functional teams to gather and document reporting requirements.
- Translate business requirements into specifications that will be used to implement the required reports and dashboards, created from potentially multiple data sources.
- Identify and resolve data reporting issues in a timely fashion, while looking for continuous improvement opportunities.
- Build solutions that create value and resolve business problems.
- Provide technical guidance to designers and other stakeholders.
- Work effectively with members of the Digital Solutions Team.
- Troubleshoot analytics tool problems and tune for performance.
- Develop the semantic layer and analytics query objects for end users.
- Translate business questions and requirements into reports, views, and analytics query objects.
- Ensure that quality standards are met.
- Support the Master Data Management strategy.

Qualifications
- Understanding of ERP and operational systems databases; knowledge of database programming.
- Highly skilled at writing SQL queries against large-scale, complex datasets (a hedged profiling-query sketch follows this posting).
- Experience in data visualization and data storytelling.
- Experience designing, debugging, and deploying software in an ADO (Azure DevOps) development environment.
- Experience with the Microsoft BI stack: Power BI and SQL Server Analysis Services.
- Experience working in an international business environment.
- Experience with Azure Data Platform resources (ADLS, ADF, Azure Synapse, Power BI Service).
- Basic manufacturing and sales business process knowledge.
- Strong communication & presentation skills.
- Ability to moderate meetings and constructive design sessions for effective decision-making.
- English language skills are a requirement; German and French are considered an asset.
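As referenced above, a small sketch of the SQL data-profiling work described, run from Python against SQL Server. The connection string and table name are placeholders.

```python
# Hedged sketch: profile null rates in a key column of a SQL Server table.
# DSN details and the table are placeholders.
import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=erp-sql.internal;DATABASE=ERP;Trusted_Connection=yes;"  # placeholder
)
cur = conn.cursor()
# Simple data-profiling query: how many rows are missing the customer key?
cur.execute("""
    SELECT COUNT(*) AS total_rows,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_customers
    FROM dbo.SalesOrders   -- hypothetical table
""")
total, nulls = cur.fetchone()
print(f"{nulls}/{total} rows missing customer_id ({nulls / max(total, 1):.1%})")
conn.close()
```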
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Mumbai, Hyderabad, Chennai
Work from Office
Type: Contract | Duration: 6 Months
We are seeking an experienced Data Engineer to join our team for a 6-month contract assignment. The ideal candidate will work on data warehouse development, ETL pipelines, and analytics enablement using Snowflake, Azure Data Factory (ADF), dbt, and other tools. This role requires strong hands-on experience with data integration platforms, documentation, and pipeline optimization, especially in cloud environments such as Azure and AWS.

Key Responsibilities
- Build and maintain ETL pipelines using Fivetran, dbt, and Azure Data Factory.
- Monitor and support production ETL jobs (a hedged run-and-poll sketch follows this posting).
- Develop and maintain data lineage documentation for all systems.
- Design data mapping and documentation to aid QA/UAT testing.
- Evaluate and recommend modern data integration tools.
- Optimize shared data workflows and batch schedules.
- Collaborate with Data Quality Analysts to ensure accuracy and integrity of data flows.
- Participate in performance tuning and improvement recommendations.
- Support BI/MDM initiatives, including Data Vault and data lakes.

Required Skills
- 7+ years of experience in data engineering roles.
- Strong command of SQL, with 5+ years of hands-on development.
- Deep experience with Snowflake, Azure Data Factory, and dbt.
- Strong background with ETL tools (Informatica, Talend, ADF, dbt, etc.).
- Bachelor's degree in CS, Engineering, Math, or a related field.
- Experience in the healthcare domain (working with PHI/PII data).
- Familiarity with scripting/programming (Python, Perl, Java, Linux-based environments).
- Excellent communication and documentation skills.
- Experience with BI tools like Power BI, Cognos, etc.
- Organized self-starter with strong time-management and critical-thinking abilities.

Nice To Have
- Experience with data lakes and Data Vaults.
- QA & UAT alignment with clear development documentation.
- Multi-cloud experience (especially Azure and AWS).

Location Options: Hyderabad / Chennai (remote flexibility available)
Apply to: navaneeta@suzva.com | Contact: 9032956160
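As referenced above, a hedged sketch of triggering and polling an ADF pipeline run, a typical production-support task for this role. Resource names are placeholders; the method names follow the azure-mgmt-datafactory SDK but should be verified against its current documentation.

```python
# Hedged sketch: trigger an ADF pipeline run and poll until it completes.
# Subscription, resource group, factory, and pipeline names are placeholders.
import time
from azure.identity import DefaultAzureCredential   # pip install azure-identity
from azure.mgmt.datafactory import DataFactoryManagementClient  # pip install azure-mgmt-datafactory

SUB, RG, FACTORY = "<subscription-id>", "rg-data", "adf-prod"  # placeholders
adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

run = adf.pipelines.create_run(RG, FACTORY, "pl_daily_load", parameters={})
print("Started run:", run.run_id)

while True:
    status = adf.pipeline_runs.get(RG, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)  # poll every 30 seconds
print("Finished with status:", status)
```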
Posted 1 week ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Exp: 5-12 Yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Snowflake, SQL, DWH, Power BI, ETL, Informatica, Architect, Azure Data Factory

We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions.

Key Responsibilities
- Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies.
- Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and dbt for efficient ELT processes (Extract, Load, Transform) across various data sources.
- SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using dbt. Design and optimize high-performance data architectures.
- Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions.
- Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards.
- Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines.

Skills & Qualifications
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Experience using Power BI for data visualization and business intelligence reporting.
- Experience with Fivetran for automated ELT pipelines.
- Familiarity with Sigma Computing, Tableau, Oracle, and dbt.
- Strong data analysis, requirement gathering, and mapping skills.
- Familiarity with cloud services such as Azure (RDBMS, Databricks, ADF), AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (other languages like Java and Scala are a plus).

Education: Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Skills: Azure Data Factory, requirement gathering, data analysis, SQL, ETL, Snowflake, data modeling, Azure, Power BI, Python, business intelligence, Informatica, Fivetran, dbt, pipelines, data warehousing, DWH
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.
Career Level - IC4

Responsibilities
Education & Experience: BE, BTech, MCA, CA or equivalent preferred. Other qualifications with adequate experience may be considered. 5+ years of relevant working experience.

Functional/Technical Knowledge & Skills:
Must have a good understanding of Oracle Cloud Financials version 12+ capabilities. We are looking for a techno-functional person who has real-time hands-on functional/product and/or technical experience; and/or has worked with L2 or L3 level support; and/or has equivalent knowledge. We expect the candidate to have:
- Strong business process knowledge and concepts.
- Implementation/support experience in any of these areas: ERP - Cloud Financial modules like GL, AP, AR, FA, IBY, PA, CST, ZX and PSA; or HCM - Core HR, Benefits, Absence, T&L, Payroll, Compensation, Talent Management; or SCM - Inventory, OM, Procurement. The candidate must have hands-on experience in a minimum of any 5 modules across the above pillars.
- Ability to relate the product functionality to business processes, and thus offer implementation advice to customers on how to meet their various business scenarios using Oracle Cloud Financials.
- Technically strong, with expert skills in SQL, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, ADF Faces, BI Extract for FTP, Payment Integration, and Personalisation (a hedged query sketch follows this posting).
- Strong problem-solving skills.
- Strong customer interaction and service orientation, so you can understand customers' critical situations, provide the appropriate response, and mobilise organisational resources, while setting realistic expectations for customers.
- Strong operations management and innovation orientation, so you can continually improve the processes, methods, tools, and utilities.
- Strong team player, so you leverage each other's strengths; you will often collaborate with peers within and across teams.
- Strong learning orientation, so you keep abreast of emerging business models and processes, applications product solutions, product features, and technology features, and use this learning to deliver value to customers on a daily basis.
- High flexibility, so you remain agile in a fast-changing business and organisational environment.
- Create and maintain appropriate documentation for architecture, design, technical, implementation, support, and test activities.
Personal Attributes: Self-driven and result-oriented; strong problem-solving/analytical skills; strong customer support and relationship skills; effective communication (verbal and written); focus on relationships (internal and external); strong willingness to learn new things and share them with others; influencing/negotiating; team player; customer-focused; confident and decisive; values expertise (maintaining professional expertise in own discipline); enthusiasm; flexibility; organizational skills; values and enjoys coaching, knowledge transfer, and teaching technical courses.

Note: Shift working is mandatory. The candidate should be open to working evening and night shifts on a rotation basis.
Career Level - IC3/IC4/IC5

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
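As referenced in the skills list above, the SQL/PL/SQL skills this role calls for are typically exercised inside BI Publisher data models; this hedged local sketch, using the python-oracledb driver, shows the flavor of query involved. Connection details are placeholders, and the table reference assumes a standard AP schema.

```python
# Hedged sketch: a diagnostic AP invoices query via python-oracledb.
# User, password, DSN, and access path are placeholders for illustration.
import oracledb  # pip install oracledb

conn = oracledb.connect(user="apps_ro", password="***",
                        dsn="dbhost:1521/FINPDB")  # placeholder DSN
cur = conn.cursor()
cur.execute("""
    SELECT invoice_num, invoice_amount
    FROM   ap_invoices_all          -- standard AP invoices table
    WHERE  invoice_date >= TRUNC(SYSDATE) - 7
    ORDER  BY invoice_amount DESC
    FETCH FIRST 10 ROWS ONLY
""")
for invoice_num, amount in cur:
    print(invoice_num, amount)
conn.close()
```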
Posted 1 week ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role/Designation: Power BI Developer (Data Engineer)
Experience: 3+ years in Power BI
Location: Hyderabad / Ahmedabad

Role Objective
We are looking for a highly motivated and experienced Senior Power BI Engineer to join our team of data experts. The ideal candidate will have a strong background in designing, developing, and maintaining Power BI dashboards and reports. As a Power BI Engineer, you will work closely with the Lead Data Engineer and Data Architect to implement end-to-end data solutions, build and maintain data pipelines, and ensure the quality and integrity of our organization's data.

Roles & Responsibilities
- Study, analyze, and understand business requirements in the context of business intelligence.
- Design and map data models to turn raw data into meaningful insights.
- Utilize Power BI to build interactive and visually appealing dashboards and reports.
- Identify key performance indicators with apt objectives.
- Analyze previous and present data for better decision-making.
- Transform business requirements into technical publications.
- Build multi-dimensional data models.
- Develop strong data documentation covering algorithms, parameters, and models.
- Perform detailed analysis on tested and deployed Power BI scripts.
- Run DAX queries and functions in Power BI.
- Define and design new systems.
- Take care of data warehouse development.
- Build Analysis Services reporting models.
- Develop visual reports, KPI scorecards, and dashboards using Power BI Desktop.
- Connect data sources, import data, and transform data for business intelligence.
- Apply analytical thinking to translate data into informative reports and visuals.
- Implement row-level security on data, with an understanding of application security layer models in Power BI.
- Make essential technical and strategic changes to improve existing business intelligence systems.
- Identify requirements and develop custom charts accordingly.
- SQL querying for better results.

Skills & Experience Required
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- A minimum of 3 years of hands-on experience with both Power BI Desktop and Power BI Service (a hedged refresh-trigger sketch follows this posting).
- Preferred: PL-300 certification.
- Proficiency in DAX (Data Analysis Expressions).
- Familiarity with MS SQL Server BI stack tools and technologies, such as SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Well versed in Power Query.
- Knowledge of SQL (Structured Query Language).
- Good with data modelling and ETL operations.
- Experience with the MSBI (Microsoft Business Intelligence) stack: SSIS (SQL Server Integration Services), SSAS (SQL Server Analysis Services), SSRS (SQL Server Reporting Services).
- Experience working with Azure (ADF, Synapse, AAS).
- Expertise in Power BI Service management.
- Proficient in advanced-level computations on datasets.
- Excellent communication skills to successfully communicate needs with clients and internal teams.
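As referenced above, a hedged sketch of triggering a Power BI dataset refresh through the REST API, a common companion task to Power BI Service management. Workspace and dataset IDs are placeholders, and token acquisition (normally done via the msal library) is elided.

```python
# Hedged sketch: queue a Power BI dataset refresh via the REST API.
# GUIDs and the bearer token are placeholders.
import requests

GROUP_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"   # placeholder
TOKEN = "<bearer token with Dataset.ReadWrite.All scope>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```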
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Hybrid
- Strong experience as an AWS/Azure/GCP Data Engineer; AWS/Azure/GCP Databricks experience is a must.
- Expert proficiency in Spark (Scala), Python, ADF, and SQL.
- Design and develop applications on Databricks.
NP: Immediate
Email: sachin@assertivebs.com
Posted 1 week ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Role Description
We are seeking a highly skilled Data Architect to design, develop, and maintain end-to-end data architecture solutions, leveraging leading-edge platforms such as Snowflake, Azure, and Azure Data Factory (ADF). The role involves translating complex business requirements into scalable, secure, and high-performance data solutions while enabling analytics, business intelligence (BI), and machine learning (ML) initiatives.
Key Responsibilities
Data Architecture & Design:
Design and develop end-to-end data architectures for integration, storage, processing, and analytics using Snowflake and Azure services.
Build scalable, reliable, and high-performing data pipelines to handle large volumes of data, utilizing Azure Data Factory (ADF) and Snowflake.
Create and maintain data models (dimensional and relational) optimized for query performance and analytics using Azure Synapse Analytics and Azure Analysis Services (AAS).
Define and implement data governance standards, data quality processes, and security protocols across all data solutions.
Cloud Data Platform Management:
Architect and manage data solutions on Azure Cloud, ensuring seamless integration with services like Azure Blob Storage, Azure SQL, and Azure Synapse.
Leverage Snowflake for data warehousing to ensure high availability, scalability, and performance.
Design data lakes and data warehouses using Azure Synapse, creating architecture patterns for large-scale data storage and retrieval.
Data Integration & ETL Development:
Lead the design and development of ETL/ELT pipelines using Azure Data Factory (ADF) to integrate data from various sources into Snowflake and other Azure-based data stores.
Develop data transformation workflows using Python and ADF to process raw data into analytics-ready formats.
Design and implement efficient ETL strategies using a combination of Python, ADF, and Snowflake.
Analytics & Business Intelligence (BI):
Design and implement data models for BI and reporting solutions using Azure Analysis Services (AAS) and Power BI.
Create efficient data pipelines and aggregation strategies to support real-time and historical reporting across the organization.
Implement best practices for data modeling to support business decision-making with tools like Power BI, AAS, and Synapse.
Advanced Data Solutions (AI/ML Integration):
Collaborate with data scientists and engineers to integrate machine learning (ML) and AI models into the data pipeline architecture.
Ensure that the data architecture is optimized for AI-driven insights and large-scale, real-time analytics.
Collaboration & Stakeholder Engagement:
Work with cross-functional teams, including business analysts, data engineers, data scientists, and IT teams, to understand data requirements and align with business goals.
Provide technical leadership, guiding development teams and ensuring adherence to architectural standards and best practices.
Effectively communicate complex data architecture concepts to non-technical stakeholders, translating business needs into actionable solutions.
Performance & Optimization:
Continuously monitor and optimize data solutions, ensuring fast, scalable data queries, transformations, and reporting functions.
Troubleshoot and resolve performance bottlenecks in data pipelines and architecture, ensuring minimal downtime and high availability.
Implement strategies for data archiving, partitioning, and optimization in Snowflake and Azure Synapse environments.
Security & Compliance:
Design and implement robust security frameworks to protect sensitive data across Snowflake, Azure Synapse, and other cloud platforms.
Ensure data privacy and compliance with industry regulations (e.g., GDPR, CCPA) through necessary security controls and access policies.
Skills
Snowflake, Azure Databricks, Python
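For a concrete flavor of the ADF orchestration this role covers, here is a minimal, hedged sketch that triggers an ADF pipeline run from Python with the Azure management SDK; the subscription ID, resource group, factory, pipeline name, and parameter are all assumptions for illustration.

```python
# Sketch: trigger an Azure Data Factory pipeline run from Python.
# Resource group, factory, and pipeline names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",   # assumed names
    factory_name="adf-enterprise",
    pipeline_name="pl_ingest_to_snowflake",
    parameters={"load_date": "2025-06-19"},
)
print("Started pipeline run:", run.run_id)
```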
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Data Engineer
Job Summary
Data Engineers will be responsible for the design, development, testing, maintenance, and support of data assets, including Azure Data Lake and data warehouse development, modeling, package creation, SQL script creation, stored procedure development, and integration services support, among other responsibilities. Candidates must have at least 3-5 years of hands-on Azure experience as a Data Engineer, must be expert in SQL, and must have extensive expertise building data pipelines. The candidate will be accountable for meeting deliverable commitments, including schedule and quality compliance, and must be able to plan and schedule their own work activities and coordinate with other cross-functional team members to meet project goals.
Basic Understanding Of
Scheduling and workflow management; working experience in ADF, Informatica, Airflow, or similar.
Enterprise data modelling and semantic modelling; working experience in ERwin, ER/Studio, PowerDesigner, or similar.
Logical/physical modelling on big data sets or a modern data warehouse; working experience in ERwin, ER/Studio, PowerDesigner, or similar.
Agile process (Scrum cadences, roles, deliverables); basic understanding of Azure DevOps, JIRA, or similar.
Architecture and data modelling for a data lake on cloud; working experience in Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
Build and release management; working experience in Azure DevOps, AWS CodeCommit, or similar.
Strong In
Writing code in a programming language; working experience in Python, PySpark, Scala, or similar.
Big data frameworks; working experience in Spark, Hadoop, or Hive (including derivatives like PySpark (preferred), Spark Scala, or Spark SQL) or similar.
Data warehouse concepts and development using SQL on single (SQL Server, Oracle, or similar) and parallel platforms (Azure SQL Data Warehouse or Snowflake).
Code management; working experience in GitHub, Azure DevOps, or similar.
End-to-end architecture and ETL processes; working experience in an ETL tool or similar.
Reading data formats; working experience with JSON, XML, or similar.
Data integration processes (batch and real time); working experience in Informatica PowerCenter and/or Cloud, Microsoft SSIS, MuleSoft, DataStage, Sqoop, or similar.
Writing requirement, functional, and technical documentation; working experience with integration design documents, architecture documentation, data testing plans, or similar.
SQL queries; working experience with SQL code, stored procedures, functions, views, or similar.
Databases; working experience in MS SQL, Oracle, or similar.
Analytical problem-solving skills; working experience resolving complex problems.
Communication (read and write in English), collaboration, and presentation skills; working experience as a team player.
Good To Have
Stream processing; working experience in Databricks Streaming, Azure Stream Analytics, HDInsight, Kinesis Data Analytics, or similar.
Analytical warehouse; working experience in SQL Data Warehouse, Amazon Athena, AWS Redshift, BigQuery, or similar.
Real-time store; working experience in Azure Cosmos DB, Amazon DynamoDB, Cloud Bigtable, or similar.
Batch ingestion; working experience in Data Factory, Amazon Kinesis, Lambda, Cloud Pub/Sub, or similar.
Storage; working experience in Azure Data Lake Storage Gen1/Gen2, Amazon S3, Cloud Storage, or similar.
Batch data processing; working experience in Azure Databricks, HDInsight, Amazon EMR, AWS Glue, or similar.
Orchestration; working experience in Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar.
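To ground the "reading data formats" and batch-processing items above, here is a small, hedged PySpark sketch that ingests JSON files and lands them as a queryable table; the landing path, filter column, and table name are assumptions.

```python
# Sketch: batch-ingest JSON files with PySpark and land them as a
# queryable table -- one of the "reading data formats" items above.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json-batch-ingest").getOrCreate()

events = spark.read.option("multiLine", True).json("/landing/events/")  # assumed path
clean = (events
         .filter(F.col("event_type").isNotNull())       # assumed column
         .withColumn("ingested_at", F.current_timestamp()))
clean.write.mode("append").saveAsTable("staging.events")  # assumed table
```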
Posted 1 week ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary
We are seeking a highly skilled and experienced Databricks Developer to join our data engineering team. The ideal candidate will have over 4 years of experience working with Databricks on Azure, along with a strong background in data pipelines and performance optimization. The candidate will be responsible for developing scalable data processing solutions, ensuring data quality, and enabling advanced analytics initiatives.
Key Responsibilities
Design, develop, and optimize data pipelines and ETL processes using Azure Databricks (PySpark/Spark SQL).
Collaborate with data architects, analysts, and other developers to deliver data solutions aligned with business requirements.
Perform data wrangling, cleansing, transformation, and aggregation from multiple sources.
Implement and maintain data lake and data warehouse solutions using Azure services (ADLS, Synapse, Delta Lake).
Monitor pipeline performance, troubleshoot issues, and ensure data integrity and reliability.
Use Delta Lake for building robust and scalable data architectures.
Develop and maintain CI/CD pipelines for Databricks workflows and jobs.
Participate in code reviews, unit testing, and documentation of data processes.
Required Skills & Experience
4+ years of hands-on experience with Databricks on Azure.
Strong expertise in PySpark, Spark SQL, and Delta Lake.
Solid understanding of Azure data services: Azure Data Lake Storage (ADLS), Azure Data Factory (ADF).
Proficiency in Python for data processing tasks.
Experience with data ingestion from various sources (on-prem, cloud, APIs).
Knowledge of data modeling, data governance, and performance tuning.
Familiarity with CI/CD tools (Azure DevOps, Git) and job orchestration in Databricks.
Strong problem-solving skills and ability to work independently or as part of a team.
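Since Delta Lake is central to this role, a brief hedged sketch of the common upsert (merge) pattern may be useful; the table paths and key column are assumptions, not details from the posting.

```python
# Sketch: idempotent upsert into a Delta table, the pattern Delta Lake
# is typically used for in pipelines like those described above.
# Table paths and the key column are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("/mnt/staging/customers_delta")  # assumed path
target = DeltaTable.forPath(spark, "/mnt/curated/customers")  # assumed path

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```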
Posted 1 week ago
5.0 - 8.0 years
3 - 6 Lacs
Hyderābād
On-site
Snowflake Data Engineering (Snowflake, DBT & ADF) – Lead Programmer Analyst (Experience: 5 to 8 Years)
We are looking for a highly self-motivated Snowflake Data Engineering (Snowflake, DBT & ADF) Lead Programmer Analyst with:
At least 5 years of experience designing and developing data pipelines and assets.
At least 5 years of experience with at least one columnar MPP cloud data warehouse (Snowflake/Azure Synapse/Redshift).
4 years of experience in ETL tools like Azure Data Factory and Fivetran/DBT.
Experience with Git and Azure DevOps.
Experience in Agile, Jira, and Confluence.
Solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
Working knowledge of Azure architecture and Data Lake.
Willingness to contribute to documentation (e.g., mapping, defect logs).
Ability to generate functional specs for code migration, or to ask the right questions to produce them.
Hands-on programming skills with a thorough understanding of performance tuning techniques.
Experience handling large data volume transformations (on the order of 100 GBs monthly).
Ability to create solutions and data flows to suit requirements.
Ability to produce timely documentation (e.g., mapping, UTR, defect/KEDB logs).
Self-starter and learner, able to understand and probe for requirements.
Expected tech experience — Primary: Snowflake, DBT (development & testing). Secondary: Python, ETL, or any data processing tool. Nice to have: domain experience in healthcare.
Good oral and written communication; a good team player; proactive and adaptive.
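As an illustration of the Snowflake side of this stack, here is a hedged Python sketch using the official Snowflake connector; the account, credentials, warehouse, and table names are placeholders. In a DBT-based workflow like the one described, this transformation would normally live in a dbt SQL model instead; the connector version is only for flavor.

```python
# Sketch: connect to Snowflake from Python and run a transformation
# query. Account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # assumed account locator
    user="ETL_SVC",
    password="...",                 # in practice: key-pair or SSO auth
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    cur.execute("""
        CREATE OR REPLACE TABLE STAGING.ORDERS_DAILY AS
        SELECT ORDER_DATE, REGION, SUM(AMOUNT) AS TOTAL_AMOUNT
        FROM RAW.ORDERS
        GROUP BY ORDER_DATE, REGION
    """)
finally:
    cur.close()
    conn.close()
```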
Posted 1 week ago
3.0 - 5.0 years
4 - 6 Lacs
Hyderābād
On-site
Snowflake Data Engineering (Snowflake, DBT & ADF) – Senior Programmer Analyst (Experience: 3 to 5 Years)
We are looking for a highly self-motivated Snowflake Data Engineering (Snowflake, DBT & ADF) Senior Programmer Analyst with:
3 to 5 years of data engineering experience with Snowflake and DBT.
Experience (internships, academic projects, or entry-level roles) in designing and developing data pipelines.
Exposure to a cloud data warehouse (Snowflake) through coursework, projects, or training.
Basic understanding of ETL tools like Azure Data Factory, Fivetran, or DBT (hands-on experience in academic settings or internships preferred).
Familiarity with Git and Azure DevOps for version control and CI/CD processes.
Understanding of Agile methodologies, Jira, and Confluence.
Knowledge of SQL programming (views, functions, stored procedures); ability to write and optimize basic SQL queries.
Exposure to Azure architecture and Data Lake concepts.
Eagerness to learn, with a proactive approach to problem-solving.
Ability to understand business requirements and ask the right questions to clarify tasks.
Basic understanding of performance tuning and handling moderate-sized data transformations.
Ability to create simple data flows and assist in solution design under guidance.
Willingness to contribute to documentation (e.g., mapping, defect logs).
Bachelor's degree in Computer Science, Statistics, or a related field, with strong foundational knowledge of SQL for querying databases.
Primary: Snowflake, DBT (development & testing). Secondary: Python, ETL tools, or any data processing framework. Nice to have: basic understanding of healthcare data and domain-specific concepts.
Good oral and written communication; a good team player; proactive and adaptive.
Posted 1 week ago
3.0 years
0 Lacs
Hyderābād
On-site
Job Description
Job Location: Hyderabad
Job Duration: Full time
Hours: 9:00am to 5:00pm
We are seeking a hands-on Data Engineer with a strong focus on data ingestion to support the delivery of high-quality, reliable, and scalable data pipelines across our Data & AI ecosystem. This role is essential in enabling downstream analytics, machine learning, and business intelligence solutions by ensuring robust and automated data acquisition from various internal and external sources.
Key Responsibilities
Design, build, and maintain scalable and reusable data ingestion pipelines to onboard structured and semi-structured data from APIs, flat files, databases, and external systems.
Work with Azure-native services (e.g., Data Factory, Azure Data Lake, Event Hubs) and tools like Databricks or Apache Spark for data ingestion and transformation.
Develop and manage metadata-driven ingestion frameworks to support dynamic and automated onboarding of new sources (see the sketch after this posting).
Collaborate closely with source system owners, analysts, and data stewards to define data ingestion specifications and implement monitoring/alerting on ingestion jobs.
Ensure data quality, lineage, and governance principles are embedded into ingestion processes.
Optimize ingestion processes for performance, reliability, and cloud cost efficiency.
Support batch and real-time ingestion needs, including streaming data pipelines where applicable.
Technical Experience
3+ years of hands-on experience in data engineering (bonus: a specific focus on data ingestion or integration).
Hands-on experience with Azure data services (e.g., ADF, Databricks, Synapse, ADLS) or equivalent cloud-native tools.
Experience in Python (PySpark) for data processing tasks (bonus: SQL knowledge).
Experience with ETL frameworks, orchestration tools, and API-based data ingestion.
Familiarity with data quality and validation strategies, including schema enforcement and error handling.
Good understanding of CI/CD practices, version control, and infrastructure-as-code (e.g., Terraform, Git).
Bonus: experience with streaming ingestion (e.g., Kafka, Event Hubs, Spark Structured Streaming).
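The metadata-driven ingestion frameworks mentioned above usually boil down to config-driven loaders: each source is a config record and one generic routine handles them all. The hedged Python sketch below shows the shape of the idea; the sources, paths, and the stubbed loader are invented for illustration.

```python
# Sketch of the metadata-driven ingestion idea: each source is described
# by a config record, and one generic loader handles all of them.
# Sources, paths, and the loader internals are illustrative.
from dataclasses import dataclass

@dataclass
class SourceConfig:
    name: str
    kind: str         # "api" | "file" | "database"
    location: str     # URL, path, or connection string
    target_table: str

SOURCES = [
    SourceConfig("crm_contacts", "api", "https://crm.example.com/v1/contacts", "raw.crm_contacts"),
    SourceConfig("pos_sales", "file", "/landing/pos/*.csv", "raw.pos_sales"),
]

def ingest(cfg: SourceConfig) -> None:
    # A real implementation would dispatch to ADF/Databricks jobs and
    # emit lineage and data-quality metrics; here we just log the intent.
    print(f"Ingesting {cfg.name} ({cfg.kind}) from {cfg.location} -> {cfg.target_table}")

for cfg in SOURCES:
    ingest(cfg)
```

Onboarding a new source then means adding one config record rather than writing a new pipeline.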
Posted 1 week ago
6.0 years
0 Lacs
India
On-site
Company Description
Beyond Key specializes in driving Digital Transformation and Enterprise Modernization, leveraging deep technical expertise and AI capabilities. We serve industries such as Insurance, Non-Profit, Financial Services, Healthcare, and Manufacturing, focusing on customized growth and efficiency. Our commitment to delivering the right solutions has earned us prestigious awards, solidifying our position as a trusted technology partner. Recognized as a Great Place to Work, Beyond Key also boasts multiple awards for innovation, inclusivity, and excellence. We are dedicated to redefining possibilities with technology and innovation to help clients achieve their digital goals.
Experience: 6+ years preferred
Job Summary
We’re looking for a hands-on Azure DevOps & Data Engineer who can bridge the gap between platform automation and data engineering. You’ll work on automating and optimizing our Azure data pipelines and deployments using Azure DevOps, Logic Apps, Data Factory, and SQL-based solutions. The role requires strong command over T-SQL and experience managing workflows and releases in a modern Azure setup.
Key Responsibilities
Azure DevOps
- Build and maintain CI/CD pipelines for deploying ADF, SQL scripts, Logic Apps, and other data components.
- Manage Azure DevOps Repos, Pipelines, and Releases for consistent deployments.
- Set up deployment automation and rollback mechanisms across dev, test, and prod.
Azure Data Services
- Design and manage data pipelines using Azure Data Factory (ADF): linked services, triggers, and parameterized workflows.
- Develop and maintain Azure SQL Database and Azure SQL Managed Instance objects.
- Leverage Azure Logic Apps to orchestrate workflows, alerting, approvals, and integrations with other systems.
SQL
- Write and optimize complex SQL queries, stored procedures, and functions.
- Perform query tuning, indexing, and data integrity checks.
- Work with large datasets and troubleshoot performance issues.
Monitoring & Maintenance
- Set up monitoring and alerting using Azure Monitor, Log Analytics, or custom alerts in ADF and Logic Apps.
- Handle data job failures, pipeline errors, and CI/CD release troubleshooting.
Collaboration & Documentation
- Collaborate with data analysts, business users, and platform engineers.
- Maintain up-to-date documentation of pipeline workflows, release notes, and known issues.
Required Skills
- Solid experience with Azure DevOps (Pipelines, Repos, Releases).
- Hands-on expertise in Azure Data Factory, Azure Logic Apps, Azure SQL Database, and SQL Managed Instance.
- Strong command over SQL (SPs, UDFs, performance tuning, query plans).
- Good understanding of Git-based source control and branching models.
- Experience in troubleshooting integration flows and ETL/ELT processes.
Nice-to-Have (Not Mandatory)
- Exposure to Power BI, Data Lake.
- Basic scripting in PowerShell or Python.
- Understanding of RBAC, resource tagging, and cost monitoring in Azure.
Soft Skills
- Strong analytical and debugging skills.
- Proactive communicator and collaborator.
- Able to handle multiple deployments and shifting priorities.
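To illustrate the T-SQL side of the role, here is a hedged Python sketch that calls a stored procedure on Azure SQL over pyodbc; the server, database, credentials, procedure name, and parameter are all assumptions.

```python
# Sketch: call a T-SQL stored procedure from Python with pyodbc.
# Server, database, credentials, and the procedure name are assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql-platform.database.windows.net;"  # assumed Azure SQL server
    "DATABASE=dw;UID=etl_user;PWD=...;Encrypt=yes;"
)
cur = conn.cursor()
# Parameterized EXEC keeps the call safe from injection and plan-cache friendly.
cur.execute("EXEC dbo.usp_refresh_daily_sales @load_date = ?", "2025-06-19")
conn.commit()
cur.close()
conn.close()
```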
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Are you passionate about building resilient, scalable, and secure cloud platforms? Join the Platform Engineering team at Xebia, where we are transforming enterprise data landscapes with cutting-edge cloud-native architectures and DevOps-driven delivery. This role is ideal for engineers who thrive at the intersection of Python, Azure, Big Data, and DevOps, and are ready to lead by design and automation.
What You’ll Do:
Design, build, and automate robust cloud platforms on Azure
Enable data-driven architectures using Azure PaaS and the Cloudera stack
Ensure performance, security, and reliability across scalable systems
Drive infrastructure automation and deployment with modern DevOps tooling
Collaborate with cross-functional teams to deliver platform solutions at scale
Your Tech Superpowers:
We’re looking for engineers with hands-on expertise in:
🔹 Programming & Platform Services: Python; Azure PaaS: Event Hub, ADF, Azure Functions, Databricks, Synapse, Cosmos DB, ADLS Gen2
🔹 Cloud & Big Data: Microsoft Azure; Cloudera ecosystem
🔹 Data Tools & Visualization: Dataiku, Power BI, Tableau
🔹 DevOps & Infrastructure as Code (IaC): CI/CD, GitOps, Terraform; Docker & Kubernetes
🔹 Security & Networking: inter-service communication and resilient Azure architecture; Identity & Access Management (IAM)
🔹 Ways of Working: Agile mindset; clean code, automation-first, test-driven practices
Why Join Xebia?
Competitive compensation & world-class benefits
Work with global clients on modern engineering challenges
Upskill through structured learning, certifications & mentorship
A culture built on trust, innovation & ownership
Freedom to build, lead, and grow without limits
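As a small taste of the Event Hub work listed above, a hedged Python sketch publishing an event with the azure-eventhub SDK; the connection string and hub name are placeholders.

```python
# Sketch: publish events to Azure Event Hubs, one of the Azure PaaS
# services listed above. Connection string and hub name are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-connection-string>",
    eventhub_name="telemetry",  # assumed hub
)
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"sensor": "a1", "value": 42})))
    producer.send_batch(batch)
```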
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description
Job Title: Senior Business Analyst
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time
About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.
About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.
About The Role
We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with an Azure SQL database. Additionally, the data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB.
The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery.
Key Responsibilities
Requirements Elicitation and Analysis:
Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis.
Analyze complex data flows and business processes to define clear and concise requirements.
Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components.
Business Process Design and Improvement:
Define and document business processes, workflows, and data models.
Identify areas for process optimization and automation within web and data solutions.
Collaborate with stakeholders to design solutions that align with business objectives.
Stakeholder Communication and Collaboration:
Serve as a liaison between business stakeholders, development teams, and data engineering teams.
Facilitate communication and collaboration to ensure stakeholder alignment and understanding.
Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions.
Solution Validation and Quality Assurance:
Ensure requirements traceability throughout the project lifecycle.
Validate and test solutions to ensure they meet business needs and objectives.
Collaborate with QA teams to define testing strategies and acceptance criteria.
Primary Skills
Business Analysis: requirement gathering, process modeling, and gap analysis.
Documentation: user stories, functional specifications, and acceptance criteria.
Agile Methodologies: experience in Agile/Scrum environments.
Stakeholder Management: effective communication and collaboration with cross-functional teams.
Data Analysis: ability to analyze and interpret complex data flows and business processes.
Secondary Skills
Cloud Platform: familiarity with Microsoft Azure services.
Data Engineering: understanding of data pipelines, ETL processes, and data modeling.
UX/UI Collaboration: experience collaborating with UX/UI teams for optimal user experience.
Communication Skills: excellent verbal and written communication for stakeholder engagement.
Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication skills, both verbal and written.
Effective time management and organizational capabilities.
Ability to work independently and within a collaborative team environment.
Strong interpersonal skills to engage with cross-functional teams.
Educational And Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Relevant certifications such as: Certified Business Analysis Professional (CBAP), PMI Professional in Business Analysis (PMI-PBA), Microsoft Certified: Azure Fundamentals.
Experience in cloud-native solutions and microservices architecture.
Familiarity with Angular and .NET frameworks for web applications.
About The Team
As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
Employee Type
Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 1 week ago