12.0 - 15.0 years
0 Lacs
India
On-site
Data Analyst
Experience: 12 to 15 years
Location: Bangalore, Pune, Hyderabad, Chennai, Noida, Kolkata, Mumbai
Interview mode: 1st round virtual; the L2 or HR round will be F2F
Mandatory Skills:
Data analysis - minimum 10 yrs experience required
Data profiling - minimum 5 yrs experience required
SQL - minimum 8 yrs experience required
Data quality tools (IDMC or Alteryx) - minimum 5 yrs experience required
JD Primary Skills: Strong proficiency in SQL; data profiling and data quality tools (e.g., IDMC, Alteryx); excellent communication and collaboration skills
Secondary Skills: Experience with ADF, Snowflake, and Databricks; knowledge of Spark and any ETL tools (SSIS, Informatica, etc.); Power BI for data analysis and reporting; domain knowledge in Claims and Insurance
Job Responsibilities: Act as the primary point of contact for platform-related inquiries. Communicate platform updates and changes to relevant teams and stakeholders. Collaborate effectively with multiple stakeholders across teams. Facilitate coordination between development teams and other departments dependent on the platform. Work within Agile practices to ensure timely and efficient project delivery. Utilize data profiling and quality tools to ensure integrity and consistency of data.
Good to Have: Strong understanding of the Insurance domain and Claims processing
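The posting leaves the profiling tool open (IDMC or Alteryx). As a tool-agnostic illustration of what column-level profiling computes (null count, distinct count, most frequent value), here is a minimal plain-Python sketch; the claim records and field names are hypothetical:

```python
from collections import Counter

def profile_column(rows, column):
    """Compute basic profiling stats (nulls, cardinality, mode) for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    counts = Counter(non_null)
    return {
        "total": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),
        "top_value": counts.most_common(1)[0][0] if counts else None,
    }

# Hypothetical sample of insurance-claim records.
rows = [
    {"claim_id": 1, "status": "OPEN"},
    {"claim_id": 2, "status": "CLOSED"},
    {"claim_id": 3, "status": None},
    {"claim_id": 4, "status": "OPEN"},
]
print(profile_column(rows, "status"))
# {'total': 4, 'nulls': 1, 'distinct': 2, 'top_value': 'OPEN'}
```

Dedicated tools add type inference, pattern analysis, and cross-column rules on top of exactly these per-column aggregates.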
Posted 1 month ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Analyze, design, develop, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. Career Level - IC3
Responsibilities
Preferred Qualifications: Oracle Applications Lab (OAL) has a central role within Oracle. Its role is to work with Product Development and Oracle's internal business to deliver Oracle products for Oracle to use internally. OAL implements Oracle applications, databases, and middleware; supports Oracle applications for Oracle internally; and configures Oracle applications to meet Oracle's specific needs. OAL also provides a showcase for Oracle's products.
The role will involve: Working as part of a global team to implement and support new business applications for HR and Payroll. Debugging and solving sophisticated problems and working closely with Oracle Product Development and other groups to implement solutions. Developing and implementing product extensions and customizations. Testing new releases. Providing critical production support.
Your skills should include: Experience in designing and supporting Oracle E-Business Suite and Fusion applications, preferably Oracle HRMS/Fusion HCM. Strong Oracle technical skills: SQL, PL/SQL, Java, XML, ADF, SOA, etc. Communicating confidently with peers and management within technical and business teams.
Detailed Description and Job Requirements: Work with Oracle's world-class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, assist with the analysis of existing complex programs and formulate logic for new complex internal systems. Prepare flowcharts, perform coding, and test/debug programs. Develop conversion and system implementation plans. Recommend changes to development, maintenance, and system standards. Job duties are varied and complex, requiring independent judgment. May have a project lead role.
BS or equivalent experience in programming on enterprise or department servers or systems. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
12.0 years
0 Lacs
India
On-site
Job Title: Lead / Architect - Azure Data Factory
Location: Mumbai/Pune/Bangalore/Hyderabad/Chennai/Kolkata/Noida
Exp: 12 to 15 years
Interview Mode: 1 virtual round / 1 F2F round
Mandatory Skills: Azure Data Factory (8+ yrs), Data Migration & Integration (8+ yrs), Azure DevOps (5+ yrs), Lead or Architect experience (4+ yrs)
Job Description: We are looking for a seasoned data engineering professional with deep expertise in Azure Data Factory (ADF) and related Azure services. The ideal candidate will lead end-to-end data integration, orchestration, and migration initiatives across enterprise environments. This role demands strong experience in designing robust ADF pipelines, migrating data across tenants, and implementing best practices in CI/CD and data governance.
Primary Responsibilities:
Azure Data Factory (ADF) Expertise: Design, develop, and manage complex ADF pipelines for large-scale data integration and transformation. Orchestrate data flows across multiple sources and sinks using ADF. Optimize data movement and transformation for performance and cost-efficiency. Troubleshoot and monitor ADF pipelines to ensure data reliability and accuracy.
Data Migration & Integration: Plan and execute data migration across multiple Azure tenants using ADF and tools like AzCopy. Ensure data integrity, security, and minimal downtime during migrations. Implement logging and error-handling strategies for large-volume data transfers.
CI/CD and Azure DevOps: Design and implement CI/CD pipelines using Azure Pipelines to automate deployment of ADF and related components. Collaborate with development and QA teams to integrate CI/CD best practices. Manage version control and release strategies for ADF assets.
Data Architecture & Governance: Design and maintain scalable data models, databases, and warehouses in Azure. Align data architecture with business requirements and Azure best practices. Enforce data governance, compliance, and security standards.
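The logging and error-handling strategy called for in the migration bullet usually comes down to retries with backoff plus per-attempt logging. In practice ADF retry policies or AzCopy's built-in retries would handle this; purely as an illustration, the pattern in plain Python with a hypothetical copy step looks like:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("copy")

def copy_with_retry(copy_fn, attempts=3, backoff_s=1.0):
    """Run a copy step, retrying with exponential backoff and logging each failure."""
    for attempt in range(1, attempts + 1):
        try:
            result = copy_fn()
            log.info("copy succeeded on attempt %d", attempt)
            return result
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == attempts:
                raise  # exhausted retries: surface the error to the orchestrator
            time.sleep(backoff_s * 2 ** (attempt - 1))

# Hypothetical flaky transfer that fails twice, then succeeds.
calls = {"n": 0}
def flaky_copy():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient network error")
    return "42 GB copied"

print(copy_with_retry(flaky_copy, backoff_s=0))  # "42 GB copied" on the 3rd attempt
```

The key design point is that only transient failures are worth retrying; a real pipeline would also classify errors and route permanent failures to an alerting path.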
Posted 1 month ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
What You’ll Do
Handle data: pull, clean, and shape structured & unstructured data.
Manage pipelines: Airflow / Step Functions / ADF… your call.
Deploy models: build, tune, and push to production on SageMaker, Azure ML, or Vertex AI.
Scale: Spark / Databricks for the heavy lifting.
Automate processes: Docker, Kubernetes, CI/CD, MLflow, Seldon, Kubeflow.
Collaborate effectively: work with engineers, architects, and business professionals to solve real problems promptly.
What You Bring
3+ years hands-on MLOps (4-5 yrs total software experience).
Proven experience with one hyperscaler (AWS, Azure, or GCP).
Confidence with Databricks / Spark, Python, SQL, TensorFlow / PyTorch / Scikit-learn.
Extensive experience handling and troubleshooting Kubernetes and proficiency in Dockerfile management.
Prototyping with open-source tools, selecting the appropriate solution, and ensuring scalability.
Analytical thinker, team player, with a proactive attitude.
Nice-to-Haves
SageMaker, Azure ML, or Vertex AI in production.
Dedication to clean code, thorough documentation, and precise pull requests.
Skills: mlflow,ml ops,scikit-learn,airflow,mlops,sql,pytorch,adf,step functions,kubernetes,gcp,kubeflow,python,databricks,tensorflow,aws,azure,docker,seldon,spark
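Airflow, Step Functions, and ADF all reduce pipeline management to the same core operation: executing a DAG of tasks in dependency order. The idea can be shown with Python's standard-library graphlib; the task names below are invented:

```python
from graphlib import TopologicalSorter

# Hypothetical ML pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "clean": {"extract"},
    "features": {"clean"},
    "train": {"clean", "features"},
    "deploy": {"train"},
}

# static_order() yields a valid execution order respecting every dependency.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Orchestrators layer scheduling, retries, and parallel execution of independent tasks on top of this ordering (graphlib's `prepare()`/`get_ready()` API supports the parallel case too).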
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview Viraaj HR Solutions is a forward-thinking recruitment agency specializing in connecting talent with opportunities across various industries. Our mission is to empower individuals through meaningful employment, while fostering growth for businesses through innovative talent acquisition strategies. We value integrity, collaboration, and excellence in our operations. As part of our commitment to delivering exceptional HR solutions, we are currently seeking an Azure Data Engineer to join our client on-site in India. Role Responsibilities Design and implement data solutions on Microsoft Azure. Develop ETL processes to extract, transform, and load data efficiently. Perform data modeling and database design to support analytics and reporting. Collaborate with stakeholders to gather requirements and translate them into technical specifications. Optimize existing data pipelines for performance and reliability. Ensure data integrity and consistency through robust validation checks. Maintain and troubleshoot data integration processes. Implement data governance and security best practices. Work with big data technologies to manage large data sets. Document all technical processes and data architecture. Utilize Azure Data Factory and other Azure services for data management. Conduct performance tuning of SQL queries and data flows. Participate in design reviews and code reviews. Stay current with Azure updates and analyze their potential impact on existing solutions. Provide support and training to junior data engineers. Qualifications Bachelor's degree in Computer Science or a related field. 3+ years of experience in data engineering or a related role. Proficiency in Azure Data Factory and Azure SQL Database. Strong knowledge of SQL and relational databases. Experience with ETL tools and processes. Familiarity with data warehousing concepts. Hands-on experience with big data technologies like Hadoop or Spark. Knowledge of Python or other scripting languages. 
Understanding of data modeling concepts and techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Experience with data governance practices. Certifications in Azure or data engineering are a plus. Familiarity with Agile methodologies. Skills: big data technologies,azure data engineer,agile methodologies,relational databases,etl processes,sql server,python scripting,database design,azure databricks,sql,data modeling,microsoft azure,spark,azure data factory,data warehousing,data governance,python,hadoop,adf
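The "robust validation checks" responsibility above is typically a set of named rules applied per record. As a minimal sketch only (rule names and fields are hypothetical; production pipelines would use something like Great Expectations or ADF data-flow assertions):

```python
def validate_row(row, rules):
    """Return the names of all rules the record fails."""
    return [name for name, check in rules.items() if not check(row)]

# Hypothetical rule set for an incoming record feed.
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0,
}

good = {"id": 1, "amount": 10.5}
bad = {"id": None, "amount": -3}
print(validate_row(good, rules))  # []
print(validate_row(bad, rules))   # ['id_present', 'amount_non_negative']
```

Keeping the failed-rule names (rather than a bare pass/fail) is what makes quarantine reporting and data-quality dashboards possible downstream.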
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Data Lake and Lakehouse Implementation: Design, implement, and manage Data Lake and Lakehouse architectures. (Must have) Develop and maintain scalable data pipelines and workflows. (Must have) Utilize Azure Data Lake Services (ADLS) for data storage and management. (Must have) Knowledge of the Medallion architecture and the Delta format. (Must have)
Data Processing and Transformation: Use PySpark for data processing and transformations. (Must have) Implement Delta Live Tables for real-time data processing and analytics. (Good to have) Ensure data quality and consistency across all stages of the data lifecycle. (Must have)
Data Management and Governance: Employ Unity Catalog for data governance and metadata management. (Good to have) Ensure robust data security and compliance with industry standards. (Must have)
Data Integration: Extract, transform, and load (ETL) data from multiple sources (Must have), including SAP (Good to have), Dynamics 365 (Good to have), and other systems. Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration. (Must have) Performance optimization of jobs. (Must have)
Data Storage and Access: Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage. (Must have) Optimize data storage and retrieval processes for performance and cost-efficiency. (Must have)
Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements. (Must have) Provide technical guidance and mentorship to junior team members. (Good to have)
Continuous Improvement: Stay updated with the latest industry trends and technologies in data engineering and cloud computing. (Good to have) Continuously improve data processes and infrastructure for efficiency and scalability. (Must have)
Required Skills And Qualifications
Technical Skills: Proficient in PySpark and Python for data processing and analysis. Strong experience with Azure Data Lake Services (ADLS) and Data Lake architecture. Hands-on experience with Databricks for data engineering and analytics. Knowledge of Unity Catalog for data governance. Expertise in Delta Live Tables for real-time data processing. Familiarity with Azure Fabric for data integration and orchestration. Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing. Experience in pulling data from multiple sources such as SAP, Dynamics 365, and others.
Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Attention to detail and commitment to data accuracy and quality.
Certifications Required: Certification in Azure Data Engineering or relevant Azure certifications. DP-203 (Must have). Certification in Databricks: Databricks Certified Data Engineer Associate (Must have); Databricks Certified Data Engineer Professional (Good to have).
Mandatory skill sets: Azure DE, PySpark, Databricks
Preferred skill sets: Azure DE, PySpark, Databricks
Years of experience required: 5-10 years
Educational Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills: Databricks Platform, Microsoft Azure Optional Skills: Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
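The Medallion (bronze/silver/gold) layering named in the posting is normally built with PySpark and Delta tables. As a dependency-free sketch of the idea only, the same bronze-to-silver-to-gold flow over plain Python records looks like this (all field names and values are invented):

```python
# Bronze: raw records as ingested (duplicates, mixed casing, nulls allowed).
bronze = [
    {"policy": "p-1", "region": "EMEA", "premium": "100"},
    {"policy": "p-1", "region": "emea", "premium": "100"},   # duplicate on business key
    {"policy": "p-2", "region": "APAC", "premium": None},    # incomplete record
]

# Silver: cleaned, typed, deduplicated on the business key.
silver = {}
for rec in bronze:
    if rec["premium"] is None:
        continue  # a real pipeline would route incomplete rows to a quarantine table
    silver[rec["policy"]] = {
        "policy": rec["policy"],
        "region": rec["region"].upper(),
        "premium": float(rec["premium"]),
    }

# Gold: aggregated, analytics-ready view.
gold = {"total_premium": sum(r["premium"] for r in silver.values())}
print(gold)  # {'total_premium': 100.0}
```

The point of the layering is that each tier has a contract: bronze preserves the raw history, silver enforces types and keys, and gold serves consumers, so quality issues can be traced back a layer at a time.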
Posted 1 month ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Overview Viraaj HR Solutions is a forward-thinking recruitment agency specializing in connecting talent with opportunities across various industries. Our mission is to empower individuals through meaningful employment, while fostering growth for businesses through innovative talent acquisition strategies. We value integrity, collaboration, and excellence in our operations. As part of our commitment to delivering exceptional HR solutions, we are currently seeking an Azure Data Engineer to join our client on-site in India. Role Responsibilities Design and implement data solutions on Microsoft Azure. Develop ETL processes to extract, transform, and load data efficiently. Perform data modeling and database design to support analytics and reporting. Collaborate with stakeholders to gather requirements and translate them into technical specifications. Optimize existing data pipelines for performance and reliability. Ensure data integrity and consistency through robust validation checks. Maintain and troubleshoot data integration processes. Implement data governance and security best practices. Work with big data technologies to manage large data sets. Document all technical processes and data architecture. Utilize Azure Data Factory and other Azure services for data management. Conduct performance tuning of SQL queries and data flows. Participate in design reviews and code reviews. Stay current with Azure updates and analyze their potential impact on existing solutions. Provide support and training to junior data engineers. Qualifications Bachelor's degree in Computer Science or a related field. 3+ years of experience in data engineering or a related role. Proficiency in Azure Data Factory and Azure SQL Database. Strong knowledge of SQL and relational databases. Experience with ETL tools and processes. Familiarity with data warehousing concepts. Hands-on experience with big data technologies like Hadoop or Spark. Knowledge of Python or other scripting languages. 
Understanding of data modeling concepts and techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Experience with data governance practices. Certifications in Azure or data engineering are a plus. Familiarity with Agile methodologies. Skills: big data technologies,azure data engineer,agile methodologies,relational databases,etl processes,sql server,python scripting,database design,azure databricks,sql,data modeling,microsoft azure,spark,azure data factory,data warehousing,data governance,python,hadoop,adf
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring that the end-to-end Oracle Fusion technical landscape seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities:
Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) implementations.
Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC).
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 2+ years of Oracle Fusion experience
Educational Qualification: BE/BTech, MBA
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Chartered Accountant Diploma Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills: Oracle Fusion Applications Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning,
Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 month ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value, and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success. About The Role We are seeking a talented and experienced Data Architect to join our team. The Data Architect will be responsible for designing enterprise data management frameworks, ensuring data security and compliance, implementing data management processes, and building data models and strategies to support various business needs and initiatives. The ideal candidate will have a strong background in data modeling principles, database design, and data management best practices. Responsibilities Collaborate with solution architects, data engineers, business stakeholders, business analysts, and DQ testers to ensure the data management and data governance framework is defined as a critical component. Design and develop data models using industry-standard modeling techniques and tools. Perform data profiling, data lineage, and analysis to understand data quality, structure, and relationships. Optimize data models for performance, scalability, and usability by creating an optimal data storage layer. Define and enforce data modeling standards, best practices, and guidelines. Participate in data governance initiatives to ensure compliance with data management policies and standards. Work closely with database administrators and developers to implement data models in relational and non-relational database systems.
Conduct data model reviews and provide recommendations for improvements. Stay updated on emerging trends and technologies in data modeling and data management. Qualifications & Required Skills Full-time bachelor’s or master’s degree in engineering/technology, computer science, information technology, or related fields. 10+ years of total experience in data modeling and database design; experience in the Retail domain will be an added advantage. 8+ years of experience in data engineering development and support. 3+ years of experience in leading a technical team of data engineers and BI engineers. Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools. Strong knowledge of Azure cloud infrastructure and development using SQL/Python/PySpark with ADF, Synapse, and Databricks. Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure Functions. Strong communication, interpersonal, and collaboration skills along with leadership capabilities. Ability to work effectively in a fast-paced, dynamic environment as a cloud SME. Act as a single point of contact for all data-management-related queries to support data decisions. Design and manage centralized, end-to-end data architecture solutions, such as data model designs, database development standards, and the implementation and management of data warehouses and data analytics systems. Conduct continuous audits of data management system performance and refine where necessary. Identify bottlenecks, optimize queries, and implement caching mechanisms to enhance data processing speed. Work to integrate disparate data sources, including internal databases and external application programming interfaces (APIs), enabling organizations to derive insights from a holistic view of the data. Ensure data privacy measures comply with regulatory standards.
Preferred: Azure Data Factory (ADF) or Databricks certification is a plus. Data Architect or Azure Cloud Solution Architect certification is a plus. Technologies we use: Azure Data Factory, Databricks, Azure Synapse, Azure Tabular, Azure Functions, Logic Apps, Key Vault, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
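Integrating disparate sources while keeping processing speed up, as the responsibilities above describe, commonly relies on incremental ("watermark") loading: persist the maximum modified timestamp from the last run and extract only newer rows on the next one. This is a standard ADF copy pattern; here is a minimal standard-library sketch with hypothetical row shapes:

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return rows modified after the watermark, plus the advanced watermark."""
    fresh = [r for r in rows if r["modified"] > watermark]
    new_wm = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_wm

# Hypothetical source table with a last-modified column.
rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]

fresh, wm = incremental_extract(rows, datetime(2024, 1, 3))
print([r["id"] for r in fresh], wm)  # [2, 3] 2024-01-09 00:00:00
```

In ADF the watermark lives in a control table and is read/updated by Lookup and Stored Procedure activities around the Copy activity; the logic is the same.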
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities: Job title: Data Engineer About The Role: As a Junior/Senior Data Engineer, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Responsibilities: Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required Skills & Experience: Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements. Skills: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Mandatory Skill Sets: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc.
Preferred Skill Sets: Cloud: Azure/GCP/AWS; DE Technologies: ADF, BigQuery, AWS Glue, etc.; Data Lake: Snowflake, Databricks, etc. Years Of Experience Required: 2-4 years Education Qualification: BE/BTECH, ME/MTECH, MBA, MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills AWS Glue, Microsoft Azure Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation, Data Warehouse, Data Warehouse Indexing {+ 13 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
Posted 1 month ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Manager Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring that streamlined, end-to-end Oracle Fusion technical solutions adapt seamlessly to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer. Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion). Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC).
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 8 years of Oracle Fusion experience
Educational Qualification: BE/BTech MBA
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence
(BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism SAP Management Level Senior Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. 
To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within PwC's Oracle Services Practice will provide you with the opportunity to help organizations use enterprise technology to achieve their digital technology goals and capitalize on business opportunities. We help our clients implement and effectively use Oracle offerings to solve their business problems and fuel success in the areas of finance operations, human capital management, supply chain management, reporting and analytics, and governance, risk and compliance.
Responsibilities & Role: Experience in Implementation, Configuration, Roll-out and Application Maintenance & Support. Good functional knowledge and understanding of standard business processes across the Procure-to-Pay (P2P) & Order-to-Cash (O2C) modules of the track. Exposure in requirement gathering, gap analysis, running design workshops, producing proofs of concept, and providing functional solutions (working on fitment & workarounds) and out-of-the-box solutions. Gather localization requirements and conduct a feasibility analysis. Create TO-BE process flows and analyze impacts of changes from AS-IS flows. Ability to work with the client and onsite team to build a global solution for multi-country rollouts. Prepare Configuration Workbooks for modules, Functional Specifications for RICEF objects, Test Plans and detailed test scripts. Configure Oracle Cloud in different environments. Perform Unit / String / End-to-End / Regression testing for standard and custom features along with RICEF objects. Perform data conversion for all major data objects through FBDI / ADFDI / Web Service. Build OTBI reports as per project requirements.
Should be a very good team player, with the ability to work with the client and onsite team to build a global solution for multi-country rollouts. Excellent English communication skills in all forms.
Mandatory skill sets: Modules – SSP, Purchase Order, Order Management, GOP, Inventory, Sourcing, Procurement Contracts, Supplier Management and Supplier Qualification Management. Knowledge of BPM Approval Configuration.
Primary Skill: SSP, Purchase Order, Sourcing, Order Management, GOP, Procurement Contracts, Supplier Management and Supplier Qualification Management. Knowledge of BPM Approval Configuration.
Preferred skill sets: Secondary skill set in Finance modules (Expenses, Fixed Assets, Payables, Tax) is an added advantage.
Years of experience required: 4-7 years of experience
Education Qualification: BE/BTech/MBA/MCA/CA
Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Business Administration, Chartered Accountant Diploma, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Supply Chain Management (SCM) Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
Posted 1 month ago
6.0 - 9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Data Engineer (Azure Data Factory & PostgreSQL) Location: Bangalore Employment Type: Full-time Experience Level: 6-9 Years About the Role: We are seeking a skilled and detail-oriented Data Engineer with hands-on experience in Azure Data Factory , PostgreSQL , and other Azure ecosystem tools. The ideal candidate will be responsible for designing, developing, and maintaining data pipelines and models, ensuring high performance, scalability, and data integrity across the system. Key Responsibilities: Design, develop, and maintain robust data pipelines using Azure Data Factory . Create and manage data models in PostgreSQL , ensuring optimal data storage and retrieval. Optimize query performance and database efficiency in PostgreSQL through indexing, tuning, and performance monitoring. Map and transform data from diverse sources into coherent and efficient data models. Develop and maintain logging and monitoring mechanisms in Azure Data Factory to proactively identify and troubleshoot issues. Handle various file operations within ADF, including reading, writing, and transforming data across multiple file formats. Ensure secure and compliant operations using Azure Key Vault , Azure Data Lake , and other Azure services. Write complex SQL queries , capable of handling diverse scenarios and optimized for performance. Collaborate effectively with business stakeholders, product owners, and data architects to gather requirements and deliver scalable solutions. Implement data validation and quality checks to ensure data integrity and accuracy. (Preferred) Build and configure semantic models and reports in Power BI . Required Skills: Strong experience with Azure Data Factory , PostgreSQL , SQL , and Azure services (Key Vault, Data Lake). Solid understanding of data modeling techniques and ETL/ELT processes. Excellent problem-solving skills and ability to manage complex data scenarios. 
Strong communication and collaboration skills, especially working with cross-functional teams. Preferred Qualifications: Experience with Power BI for report creation and data visualization. Familiarity with DevOps practices and CI/CD in a data engineering context. Education: Bachelor’s or Master’s degree.
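A flavor of the file-operations work described in this role (reading one format, transforming, and emitting another, as ADF copy activities and data flows do at scale) can be sketched in plain Python. The function name, columns, and type map below are illustrative, not part of the role:

```python
import csv
import io
import json

def csv_to_json_records(csv_text, type_map=None):
    """Parse CSV text and return a list of JSON-ready dicts.

    type_map maps column names to casting callables (e.g. int, float);
    unmapped columns stay as strings, mirroring a simple schema mapping
    step in a transformation pipeline.
    """
    type_map = type_map or {}
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({k: type_map.get(k, str)(v) for k, v in row.items()})
    return records

raw = "order_id,amount\n1001,250.50\n1002,99.99\n"
records = csv_to_json_records(raw, type_map={"order_id": int, "amount": float})
print(json.dumps(records))
```

In a real pipeline the same mapping would be declared in an ADF data flow or a copy activity's schema mapping rather than hand-written code; the sketch only shows the shape of the transformation.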
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Dear Candidate, Greetings from TCS!!! TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there's nothing that can stop us from growing together.
Role: Cloud DevOps Engineer (Azure) Location: Chennai Experience Range: 8 to 12 years
Job Description: Good experience in Microsoft Fabric. Strong understanding of DevOps processes, procedures & tools. Data Lake, Data Analysis, Data Engineering, Power BI. Experience with Azure DevOps products (work items, Wiki, Git, repos, pipelines, release manager). Experience with application and infrastructure operation monitoring (such as AppDynamics, Splunk, Azure Portal) and change management (such as ServiceNow). Azure Cloud experience deploying and using PaaS resources, such as ASE, SQL MI, Cosmos DB, Storage Account, AKS, ADF, etc. Hands-on experience creating build & deployment automation with pipelines-as-code using YAML. Hands-on experience creating Azure Data Factory pipelines using YAML. Knowledge of Azure infrastructure automation using PowerShell, Runbooks, and Terraform. NuGet and NPM packaging, containers/Docker, repository manager. Good communication skills (written & verbal) and ability to present. Agile Scrum/Kanban experience. Core experience in Azure services. CI experience (Git, Jenkins, GitLab), Bash, PowerShell, build automation. Container experience in Docker. Azure DevOps. CKA and CKAD certifications. Azure Developer who has worked extensively on CI image building with both Linux and Windows containers. Should have best-practice knowledge of the CI image-building process for both Linux and Windows containers. Significant experience with SaaS and web-based technologies. Skilled with Continuous Integration and Continuous Deployment using Azure DevOps Services. Skilled with PowerShell for automation; Python or Bash is an added advantage.
Skilled with containerization platforms using Docker & Kubernetes. Familiar with architecture/design patterns and reusability concepts. Skilled with object-oriented analysis and design (OOA&D) methodology and microservices. Skilled in SOLID design principles and TDD. Familiar with application security via the OWASP Top 10 and common mitigation strategies. Very familiar with source control systems (Git) and Azure DevOps. Detailed knowledge of database design and object/relational database technology.
Azure DevOps Implementation: Lead the design and implementation of CI/CD pipelines using Azure DevOps. Configure and manage build agents, release pipelines, and deployment environments in Azure DevOps.
Continuous Integration: Establish and maintain robust CI processes to automate code builds, testing, and deployment. Integrate automated testing into CI pipelines for comprehensive code validation.
Infrastructure as Code (IaC): Utilize Infrastructure as Code principles to manage and provision infrastructure components on Azure. Implement and maintain IaC templates (e.g., ARM templates) for infrastructure provisioning.
Monitoring and Optimization: Implement monitoring and logging solutions to track the performance and reliability of CI/CD pipelines. Continuously optimize CI/CD processes for efficiency, speed, and resource utilization.
Security and Compliance: Implement security best practices within CI/CD pipelines. Ensure compliance with industry standards and regulatory requirements in CI/CD processes.
Troubleshooting and Support: Provide expert-level support for CI/CD-related issues. Work with product teams to manage Azure systems deployment and lifecycle maintenance, including handling requests, determining action plans, capacity planning, reporting, and advising the parties involved. Responsible for triaging and resolving service management system incidents and requests.
Responsible for application monitoring, data manipulation for widgets and generating reports, and problem identification and management. Responsible for system data manipulation: tuning agents and collectors to extract the required information. Occasionally consult with individuals inside and outside of the team and provide general customer support.
Azure infrastructure management: Create and manage check-in policies, along with installation, configuration, troubleshooting and maintenance. Produce scripts for automation and report generation using Terraform, Terragrunt, CloudFormation templates, Ansible, Git, PowerShell, Bash/shell and Python scripting, Linux and Windows operating system scripting, and Azure Visual Studio Team Services. Maintain the applications within EKS, AKS, Docker, Docker Hub, and Docker Registry. Manage networking protocols and network security in the cloud. Manage Cloudflare products as well as other equivalent tools.
Monitor infrastructure: virtual machines, virtual networks, autoscaling, storage, Key Vault, Network Security Groups, Load Balancer, Traffic Manager, route tables, storage accounts, EFS, FSx, NetApp NAS, Recovery Services Vaults, Azure Backup, Lambda, and serverless architecture components.
Required Skills: Azure Certified Solutions Architect or SysOps Administrator, or an equivalent Azure certification.
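The pipelines-as-code requirement in this posting can be illustrated with a minimal azure-pipelines.yml sketch. The stage names, `make` commands, and the `dev` environment are assumptions for illustration, not a prescribed setup:

```yaml
# Illustrative Azure Pipelines definition: build and test on every push
# to main, then deploy the artifact to a 'dev' environment.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

stages:
- stage: Build
  jobs:
  - job: BuildAndTest
    steps:
    - script: make build
      displayName: Build
    - script: make test
      displayName: Run tests

- stage: Deploy
  condition: succeeded()
  jobs:
  - deployment: DeployDev
    environment: dev        # environments enable approvals and checks
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy artifacts to dev"
            displayName: Deploy
```

Keeping the pipeline in the repository alongside the code is what lets branch policies, reviews, and history apply to the build process itself.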
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms . This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure) , and marketing technology stacks including Adobe Tag Management and Pixel Management . You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance. Key Responsibilities Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking. Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented. Architect and develop secure REST APIs in C# to support advanced attribution models and marketing analytics pipelines. Implement cryptographic hashing (e.g., SHA-256) Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems. Integrate with Azure Key Vaults to securely manage secrets and sensitive credentials. Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data. Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties. Collaborate with security teams to ensure all data-sharing and processing complies with Azure’s data security standards and enterprise privacy frameworks. 
Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows. Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools. Required Skills Strong hands-on experience with C# and building scalable APIs. Experience in implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs. Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations. Proficiency with Azure Cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices. Familiarity with OCI for hybrid-cloud integration scenarios. Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256). Experience with Adobe Tag Management, specifically in pixel governance and lifecycle. Proven ability to collaborate across functions, especially with marketing and analytics teams. Soft Skills Strong communication skills to explain technical concepts to non-technical stakeholders. Proven ability to collaborate across teams, especially with marketing, product, and data analytics. Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
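The secure-data-handling requirement above (hashing email addresses with SHA-256 before sending them server-side) can be sketched in a few lines of Python. Normalization rules vary by platform, so treat the trim-and-lowercase step as an assumption to verify against the target Conversions API's documentation:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address, then return its SHA-256 hex digest.

    Trimming whitespace and lowercasing before hashing is the commonly
    required normalization; without it, 'User@X.com' and 'user@x.com'
    would hash to different values and fail to match server-side.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

digest = hash_email("  User@Example.COM ")
print(digest)
```

Because SHA-256 is deterministic, the same normalized input always yields the same 64-character hex digest, which is what allows the advertising platform to match hashed identifiers without receiving the raw email.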
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms . This role requires strong hands-on experience in Azure cloud services, OCI (Oracle Cloud Infrastructure) , and marketing technology stacks including Adobe Tag Management and Pixel Management . You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance. Key Responsibilities Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking. Utilize OCI environments as needed for data integration and marketing intelligence workflows. Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected, and privacy-compliant logic is implemented. Implement cryptographic hashing (e.g., SHA-256) Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems. Integrate with Azure Key Vaults to securely manage secrets and sensitive credentials. Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data. Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties. Collaborate with security teams to ensure all data-sharing and processing complies with Azure’s data security standards and enterprise privacy frameworks. Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools. 
Required Skills Strong hands-on experience in Python and building scalable APIs. Experience in implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs. Proficiency with Azure Cloud technologies, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices. Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations. Familiarity with OCI for hybrid-cloud integration scenarios. Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256). Experience with Adobe Tag Management, specifically in pixel governance and lifecycle. Proven ability to collaborate across functions, especially with marketing and analytics teams. Soft Skills Strong communication skills to explain technical concepts to non-technical stakeholders. Proven ability to collaborate across teams, especially with marketing, product, and data analytics. Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities Key Responsibilities: Data Lake and Lakehouse Implementation: Design, implement, and manage Data Lake and Lakehouse architectures. (Must have) Develop and maintain scalable data pipelines and workflows. (Must have) Utilize Azure Data Lake Services (ADLS) for data storage and management. (Must have) Knowledge of Medallion Architecture and the Delta format. (Must have) Data Processing and Transformation: Use PySpark for data processing and transformations. (Must have) Implement Delta Live Tables for real-time data processing and analytics. (Good to have) Ensure data quality and consistency across all stages of the data lifecycle. (Must have) Data Management and Governance: Employ Unity Catalog for data governance and metadata management. (Good to have) Ensure robust data security and compliance with industry standards. (Must have) Data Integration: Extract, transform, and load (ETL) data from multiple sources (Must have) including SAP (Good to have), Dynamics 365 (Good to have), and other systems. Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration. (Must have) Performance optimization of the jobs. (Must have) Data Storage and Access: Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage. (Must have) Optimize data storage and retrieval processes for performance and cost-efficiency.
(Must have) Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements. (Must have) Provide technical guidance and mentorship to junior team members. (Good to have) Continuous Improvement: Stay updated with the latest industry trends and technologies in data engineering and cloud computing. (Good to have) Continuously improve data processes and infrastructure for efficiency and scalability. (Must have) Required Skills And Qualifications Technical Skills: Proficient in PySpark and Python for data processing and analysis. Strong experience with Azure Data Lake Services (ADLS) and Data Lake architecture. Hands-on experience with Databricks for data engineering and analytics. Knowledge of Unity Catalog for data governance. Expertise in Delta Live Tables for real-time data processing. Familiarity with Azure Fabric for data integration and orchestration. Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing. Experience in pulling data from multiple sources like SAP, Dynamics 365, and others. Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to work independently and as part of a team. Attention to detail and commitment to data accuracy and quality. Certifications Required Certification in Azure Data Engineering or relevant Azure certifications. DP203 (Must have) Certification in Databricks. 
Databricks Certified Data Engineer Associate (Must have) Databricks Certified Data Engineer Professional (Good to have) Mandatory skill sets: Azure DE, PySpark, Databricks Preferred skill sets: Azure DE, PySpark, Databricks Years of experience required: 5-10 Years Educational Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
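The bronze-to-silver promotion implied by the Medallion architecture in this posting can be illustrated with a small pure-Python sketch (a stand-in for the PySpark/Delta Live Tables tooling the role actually uses); the field names `id` and `ts` are hypothetical:

```python
import json

def bronze_to_silver(bronze_lines):
    """Promote raw JSON-lines events (bronze) to a cleaned, deduplicated
    silver set: drop malformed rows, require an 'id' key, and keep only
    the latest record per id based on its 'ts' timestamp."""
    latest = {}
    for line in bronze_lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # a real pipeline would quarantine malformed rows
        if "id" not in rec:
            continue  # enforce the minimal schema contract
        prev = latest.get(rec["id"])
        if prev is None or rec.get("ts", 0) >= prev.get("ts", 0):
            latest[rec["id"]] = rec
    return sorted(latest.values(), key=lambda r: r["id"])

bronze = [
    '{"id": 1, "ts": 10, "amount": 5}',
    'not-json',
    '{"id": 1, "ts": 20, "amount": 7}',
    '{"ts": 30}',
]
silver = bronze_to_silver(bronze)
print(silver)
```

In Databricks the same validation and dedup logic would typically be expressed as Delta Live Tables expectations and a `MERGE` into a Delta table, with the rejected rows tracked rather than silently dropped.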
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Extensive experience with SQL Server, including performance tuning, optimization, and complex query writing
- Proficient in Azure Data Factory (ADF) for data integration, ETL processes, and data flow management
- Proficient in T-SQL for writing advanced queries, stored procedures, and functions
- Knowledge of PL/SQL is a plus for handling Oracle database interactions
- Good to have: experience as a SQL DBA, managing database security, backups, and recovery strategies
- Ability to troubleshoot and resolve database and data pipeline issues
- Excellent communication skills, both verbal and written, with the ability to convey technical concepts to non-technical stakeholders
- Self-motivated and proactive, with a commitment to delivering high-quality solutions

Mandatory skill sets: SQL, PL/SQL, T-SQL, SQL DBA
Preferred skill sets: SQL, PL/SQL, T-SQL, SQL DBA
Years of experience required: 5-10 years
Education qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Bachelor Degree, Master of Engineering, Bachelor of Engineering
Required Skills: Structured Query Language (SQL)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
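The posting above emphasizes complex query writing and performance tuning. As a rough, hedged illustration of the set-based querying style it asks for (using Python's stdlib sqlite3 in place of SQL Server, so T-SQL-specific features like stored procedures are not shown, and the table and data are invented):

```python
import sqlite3

# SQLite stands in for SQL Server here; table name, columns, and the
# 100-unit threshold are all illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
    INSERT INTO orders (customer, amount) VALUES
        ('acme', 120.0), ('acme', 80.0), ('globex', 50.0);
""")

# A typical tuning exercise: replace a row-by-row check with one set-based
# aggregate that flags customers whose order total exceeds a threshold.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > 100
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 200.0)]
```

The same aggregate shape (GROUP BY + HAVING) carries over directly to T-SQL, where it would typically live inside a stored procedure.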
Posted 1 month ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Manager

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer
- Completed at least 2 full Oracle Cloud (Fusion) implementations
- Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion)
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 8 years of Oracle Fusion experience
Education qualification: BE/BTech, MBA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage, Relationship Building, Self-Awareness {+ 4 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 1 month ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

About the role: As a Junior/Senior Data Engineer, you'll take the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization.

Responsibilities:
- Architecting and designing complex data systems and pipelines
- Leading and mentoring junior data engineers and team members
- Collaborating with cross-functional teams to define data requirements
- Implementing advanced data quality checks and ensuring data integrity
- Optimizing data processes for efficiency and scalability
- Overseeing data security and compliance measures
- Evaluating and recommending new technologies to enhance data infrastructure
- Providing technical expertise and guidance for critical data projects

Required Skills & Experience:
- Proficiency in designing and building complex data pipelines and data processing systems
- Leadership and mentorship capabilities to guide junior data engineers and foster skill development
- Strong expertise in data modeling and database design for optimal performance
- Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness
- Knowledge of data governance principles, ensuring data quality, security, and compliance
- Familiarity with big data technologies like Hadoop, Spark, or NoSQL
- Expertise in implementing robust data security measures and access controls
- Effective communication and collaboration skills for cross-functional teamwork and defining data requirements

Mandatory and preferred skill sets:
- Cloud: Azure/GCP/AWS
- DE technologies: ADF, BigQuery, AWS Glue, etc.
- Data lake: Snowflake, Databricks, etc.

Years of experience required: 4-7 years
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 19 more}
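The "advanced data quality checks" responsibility above can be sketched as a small, framework-agnostic pattern; real pipelines would typically use a dedicated tool (e.g. Great Expectations or dbt tests), and the check names and record shape here are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str = ""

def run_quality_checks(records):
    """Run completeness and validity checks over a batch of records."""
    results = []
    # Completeness: every record must carry a non-empty id.
    missing_ids = [r for r in records if not r.get("id")]
    results.append(CheckResult("completeness:id", not missing_ids,
                               f"{len(missing_ids)} records missing id"))
    # Validity: amounts must be non-negative.
    bad_amounts = [r for r in records if r.get("amount", 0) < 0]
    results.append(CheckResult("validity:amount>=0", not bad_amounts,
                               f"{len(bad_amounts)} negative amounts"))
    return results

batch = [{"id": 1, "amount": 10.5}, {"id": None, "amount": -3.0}]
for res in run_quality_checks(batch):
    print(res.name, "PASS" if res.passed else "FAIL", res.detail)
```

In a production pipeline the same check results would be persisted and used to gate downstream loads rather than just printed.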
Posted 1 month ago
7.0 - 10.0 years
0 Lacs
Greater Chennai Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Manager

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Microsoft Dynamics CRM at PwC will specialise in analysing client requirements, implementing CRM software solutions, and providing training and support for seamless integration and utilisation of Microsoft CRM applications. Working in this area, you will enable clients to optimise operational efficiency and achieve their strategic objectives.

Role: D365 Azure Integration Developer and Design Architect

Mandatory skill set:
- Design, develop, and deploy integration solutions, whether API-based, file-based, OData-based, middleware-based, or via any other path appropriate for the target solution landscape
- Implement backend logic using .NET, leveraging various Azure services to enhance functionality and scalability
- Key skills: Azure Integration Services including Logic Apps, Function Apps, Service Bus, API Management, Azure SQL, storage accounts (AFS/Blob/Container), Key Vault, Azure Dashboards, app registrations, alert rules, etc.
- Key technologies: C#, .NET, data structures, SQL Server/Cosmos DB, and design patterns
- Ability to build relationships and become a trusted partner to the developer community, actively participating in and driving the community of practice
- Azure VM, VNET, Storage, subscriptions, and security
- ARM templates and Terraform for script-based deployment automation
- Project delivery methodologies such as Waterfall and Agile, using tools like Azure DevOps
- IT security and compliance requirements
- Configure, set up, and manage CI/CD Azure Pipelines for build and release (e.g., source control, build tool, CI server, gated check-in, artefact repository, etc.)
- Team player with strong communication skills and the ability to manage and lead the integration track in implementation projects
- Ability to communicate effectively with internal and external stakeholders to remove critical blockers, and to contribute to design and development activities to ensure delivery on time

Preferred skill set:
- Architectural knowledge of ERP platforms, preferably Microsoft Dynamics 365 Finance and Operations
- Azure Firewall configuration and security setup
- Familiarity with Azure Synapse Analytics, Apache Spark pools, Azure Databricks, and ADF for advanced data analytics and orchestration
- Architectural and capability-framework knowledge of other middleware such as MuleSoft, Microsoft BizTalk, or SQL Server Integration Services
- Advising on integration with third-party analytics and monitoring tools (e.g., Splunk, AppDynamics, SonarQube, Contrast Security, etc.)

Years of experience required: 7-10 years
Education qualification: BE/BTech
Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology
Required Skills: Microsoft Dynamics 365 Customer Relationship Management (CRM)
Optional Skills: Node.js
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
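Integration work of the kind described above (Logic Apps, Function Apps, Service Bus) leans heavily on retry-with-backoff for transient failures. A minimal language-neutral sketch of that pattern, in Python rather than C#/.NET, with every name invented for illustration and no Azure SDK involved:

```python
import time

def call_with_retry(operation, max_attempts=4, base_delay=0.01):
    """Retry a transient-failure-prone integration call with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(base_delay * 2 ** (attempt - 1))  # 0.01s, 0.02s, 0.04s, ...

# Hypothetical endpoint that fails twice before succeeding.
calls = {"n": 0}
def flaky_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(call_with_retry(flaky_endpoint))  # ok
```

In Azure itself this policy is usually configured declaratively (e.g. on a Logic App action or Service Bus subscription) rather than hand-coded, but the semantics are the same.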
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Key Responsibilities:

Data Lake and Lakehouse implementation:
- Design, implement, and manage Data Lake and Lakehouse architectures (must have)
- Develop and maintain scalable data pipelines and workflows (must have)
- Utilize Azure Data Lake Storage (ADLS) for data storage and management (must have)
- Knowledge of the Medallion architecture and the Delta format (must have)

Data processing and transformation:
- Use PySpark for data processing and transformations (must have)
- Implement Delta Live Tables for real-time data processing and analytics (good to have)
- Ensure data quality and consistency across all stages of the data lifecycle (must have)

Data management and governance:
- Employ Unity Catalog for data governance and metadata management (good to have)
- Ensure robust data security and compliance with industry standards (must have)

Data integration:
- Extract, transform, and load (ETL) data from multiple sources (must have), including SAP (good to have), Dynamics 365 (good to have), and other systems
- Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration (must have)
- Performance optimization of jobs (must have)

Data storage and access:
- Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage (must have)
- Optimize data storage and retrieval processes for performance and cost-efficiency (must have)

Collaboration and communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements (must have)
- Provide technical guidance and mentorship to junior team members (good to have)

Continuous improvement:
- Stay updated with the latest industry trends and technologies in data engineering and cloud computing (good to have)
- Continuously improve data processes and infrastructure for efficiency and scalability (must have)

Required Skills and Qualifications:

Technical skills:
- Proficient in PySpark and Python for data processing and analysis
- Strong experience with Azure Data Lake Storage (ADLS) and Data Lake architecture
- Hands-on experience with Databricks for data engineering and analytics
- Knowledge of Unity Catalog for data governance
- Expertise in Delta Live Tables for real-time data processing
- Familiarity with Azure Fabric for data integration and orchestration
- Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing
- Experience pulling data from multiple sources such as SAP, Dynamics 365, and others

Soft skills:
- Excellent problem-solving and analytical skills
- Strong communication and collaboration abilities
- Ability to work independently and as part of a team
- Attention to detail and commitment to data accuracy and quality

Certifications required:
- Certification in Azure Data Engineering or relevant Azure certifications: DP-203 (must have)
- Databricks Certified Data Engineer Associate (must have); Databricks Certified Data Engineer Professional (good to have)

Mandatory skill sets: Azure DE, PySpark, Databricks
Preferred skill sets: Azure DE, PySpark, Databricks
Years of experience required: 5-10 years
Educational qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering
Required Skills: PySpark
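The Medallion (bronze → silver → gold) layering named in the responsibilities above can be sketched in plain Python, with stdlib code standing in for PySpark/Delta so it runs anywhere; the record shapes and rules are invented purely for illustration:

```python
raw_events = [  # bronze: data as ingested, including duplicates and bad rows
    {"order_id": "1", "amount": "100"},
    {"order_id": "1", "amount": "100"},   # duplicate
    {"order_id": "2", "amount": "bad"},   # unparseable amount
    {"order_id": "3", "amount": "250"},
]

def to_silver(bronze):
    """Silver layer: deduplicate and enforce types, dropping rows that fail parsing."""
    seen, silver = set(), []
    for row in bronze:
        try:
            rec = {"order_id": int(row["order_id"]), "amount": float(row["amount"])}
        except ValueError:
            continue  # quarantine/drop unparseable rows
        if rec["order_id"] not in seen:
            seen.add(rec["order_id"])
            silver.append(rec)
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate ready for reporting."""
    return {"order_count": len(silver),
            "total_amount": sum(r["amount"] for r in silver)}

print(to_gold(to_silver(raw_events)))  # {'order_count': 2, 'total_amount': 350.0}
```

In Databricks the same shape would be three Delta tables, with the silver step as a PySpark job (e.g. `dropDuplicates` plus schema enforcement) and the gold step as an aggregate query.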
Posted 1 month ago
4.5 - 6.0 years
6 - 8 Lacs
Bengaluru
On-site
ROLES & RESPONSIBILITIES
Primary Skills: ADF, Databricks, Log Analytics
Secondary Skills: Data Warehouse, Logic Apps, Log Analytics, Datadog, Atlan, Ataccama
EXPERIENCE: 4.5-6 years
SKILLS
Primary Skill: Data Engineering
Sub Skill(s): Data Engineering
Additional Skill(s): Synapse, Databricks, Azure Data Lake, Azure Data Factory
ABOUT THE COMPANY
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. We engineer business outcomes for Fortune 500 companies and digital natives in the technology, healthcare, insurance, travel, telecom, and retail & CPG industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. We accelerate experience-led transformation in the delivery of digital platforms. Infogain is also a Microsoft (NASDAQ: MSFT) Gold Partner and Azure Expert Managed Services Provider (MSP). Infogain, an Apax Funds portfolio company, has offices in California, Washington, Texas, the UK, the UAE, and Singapore, with delivery centers in Seattle, Houston, Austin, Kraków, Noida, Gurgaon, Mumbai, Pune, and Bengaluru.
Posted 1 month ago
12.0 - 14.0 years
7 - 9 Lacs
Bengaluru
On-site
ROLES & RESPONSIBILITIES
Primary Skills: ADF/Databricks, Data Modelling
Secondary Skills: Retail dataset experience, data governance tools, JIRA
EXPERIENCE: 12-14 years
SKILLS
Primary Skill: DXP Architecture
Sub Skill(s): DXP Architecture
Additional Skill(s): Data Architecture, Databricks, DXP Architecture, AEM Architecture, Content Architecture
ABOUT THE COMPANY
Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley (see the company description in the previous Infogain listing).
Posted 1 month ago
4.0 years
10 Lacs
Noida
On-site
At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society.

Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description: In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.
QA Automation Engineer

As a QA Automation Engineer specializing in Data Warehousing, you will play a critical role in ensuring that our data solutions are of the highest quality. You will work closely with data engineers and analysts to develop, implement, and maintain automated testing frameworks for data validation, ETL processes, data quality, and integration. Your work will ensure that data is accurate, consistent, and performs optimally across our data warehouse systems.

Responsibilities:
- Develop and implement automation frameworks: design, build, and maintain scalable test automation frameworks tailored for data warehousing environments
- Test strategy and execution: define and execute automated test strategies for ETL processes, data pipelines, and database integration across a variety of data sources
- Data validation: implement automated tests to validate data consistency, accuracy, completeness, and transformation logic
- Performance testing: ensure that the data warehouse systems meet performance benchmarks through automation tools and load-testing strategies
- Collaborate with teams: work closely with data engineers, software developers, and data analysts to understand business requirements and design tests accordingly
- Continuous integration: integrate automated tests into the CI/CD pipelines, ensuring that testing is part of the deployment process
- Defect tracking and reporting: use defect-tracking tools (e.g., JIRA) to log and track issues found during automated testing, ensuring that defects are resolved in a timely manner
- Test data management: develop strategies for handling large volumes of test data while maintaining data security and privacy
- Tool and technology evaluation: stay current with emerging trends in automation testing for data warehousing and recommend tools, frameworks, and best practices

Requirements and skills:
- At least 4 years' experience
- Solid understanding of data warehousing concepts (ETL, OLAP, data marts, data vault, star/snowflake schemas, etc.)
- Proven experience in building and maintaining automation frameworks using tools like Python, Java, or similar, with a focus on database and ETL testing
- Strong knowledge of SQL for writing complex queries to validate data, test data pipelines, and check transformations
- Experience with ETL tools (e.g., Matillion, Qlik Replicate) and their testing processes
- Performance-testing experience
- Experience with version control systems like Git
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues
- Strong communication and collaboration skills
- Attention to detail and a passion for delivering high-quality solutions
- Ability to work in a fast-paced environment and manage multiple priorities
- Enthusiasm for learning new technologies and frameworks

Experience with the following tools and technologies is desired:
- Qlik Replicate
- Matillion ETL
- Snowflake
- Data Vault warehouse design
- Power BI
- Azure cloud, including Logic Apps, Azure Functions, ADF

Cotality's Diversity Commitment: Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement: Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace.
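The "SQL to validate data and check transformations" requirement above is commonly automated as source-to-target reconciliation. A hedged sketch using stdlib sqlite3 (table names, columns, and metrics are all invented; a real suite would target the actual warehouse and run under a framework like pytest):

```python
import sqlite3

# SQLite stands in for the warehouse; src/tgt model a table before and
# after an ETL hop.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")

def reconcile(conn, source, target):
    """Compare row count and an amount checksum; return a list of failures."""
    failures = []
    for metric, sql in [("row_count", "SELECT COUNT(*) FROM {}"),
                        ("amount_sum", "SELECT COALESCE(SUM(amount), 0) FROM {}")]:
        s = conn.execute(sql.format(source)).fetchone()[0]
        t = conn.execute(sql.format(target)).fetchone()[0]
        if s != t:
            failures.append(f"{metric}: source={s} target={t}")
    return failures

print(reconcile(conn, "src", "tgt"))  # []
```

Each failure string maps naturally to a logged defect, which is how checks like this feed the JIRA-based defect tracking mentioned above.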
Please apply on our website for consideration. By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.
Posted 1 month ago