0 years
0 Lacs
India
Remote
Role: Data Engineer
Years of experience: 5+ years
Location: Remote
Notice: Immediate joiners only

Requirements:
- Minimum 5 years of experience in a relevant field.
- Deep understanding of private and public cloud architectures.
- Experience with Azure services, including Azure Data Factory (ADF) for orchestrating complex workflows.
- Hands-on expertise with Databricks and PySpark for big data transformation and advanced analytics.
- Deep experience with relational databases (SQL Server, PostgreSQL, MySQL), with advanced SQL skills.
- Experience with Terraform, Kubernetes, and service mesh.
- Expertise with open-source stack technologies.
- Design and develop ETL pipelines using Java, Scala, or Python.
- Ingestion and transformation of data to/from RDBMS and NoSQL databases (e.g., Cassandra, PostgreSQL, YugabyteDB).
- Job orchestration using Apache Airflow or Oozie.
- Adept with the Agile software development lifecycle and DevOps principles.
- Prior experience with Informatica PowerCenter or other ETL tools.

Interested candidates can send their profile to anuritha@prosmcloudinc.com
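The ETL duties listed above (ingesting from an RDBMS, transforming, loading into a target table) can be sketched in plain Python as a minimal stand-in for an ADF/Databricks job. This uses the stdlib sqlite3 module in place of a production database; the table and column names are hypothetical:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> list[tuple]:
    # Extract: pull raw order rows from the source table.
    rows = conn.execute("SELECT id, amount FROM raw_orders").fetchall()
    # Transform: drop negative amounts, round to 2 decimal places.
    cleaned = [(i, round(amt, 2)) for i, amt in rows if amt >= 0]
    # Load: write the cleaned rows into the curated table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    return cleaned

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.5), (2, -3.0), (3, 7.25)])
result = run_etl(conn)
print(result)  # [(1, 10.5), (3, 7.25)]
```

In a real pipeline the same extract/transform/load split would be expressed as ADF activities or Databricks notebook cells rather than one function.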
Posted 1 month ago
0 years
0 Lacs
India
On-site
Avensys is a reputed global IT professional services company headquartered in Singapore. Our service spectrum includes enterprise solution consulting, business intelligence, business process automation and managed services. Given our decade of success, we have evolved to become one of the top trusted providers in Singapore, serving a client base across banking and financial services, insurance, information technology, healthcare, retail and supply chain.

Location: Singapore (onsite)
Note: Only short-notice candidates will be considered.

Job Description:
We are looking for an experienced Oracle IAM Consultant to join our dynamic team.

Design and Implementation:
- Design and implement IAM solutions using OIM/OIG.
- Develop custom connectors to integrate OIM with various applications and systems.
- Build and configure OIM workflows, approval policies, and entitlements.
- Develop custom UI components for OIM self-service pages.

Skills and Experience:
- Experienced in end-to-end integration of an IAM solution using Oracle Identity Governance.
- Prior experience with requirement gathering, analysis, design, development, maintenance, and upgrades across environments (DEV, QA, UAT, PROD).
- Experience with ICF-based framework connectors to integrate with target applications, perform CRUD operations, and manage roles on the target system.
- Extensive hands-on experience with custom code development such as event handlers, validation plugins, and scheduled tasks using the Java API.
- Experience with audit reports in OIM BI Publisher, including customizing the logo and header of UI screens and audit reports.
- Implement Oracle ADF customizations for user interfaces.
- Build custom Oracle SOA composites for workflows.

Java Experience:
- Best-practice-based secure Java development.
- Exposure and hands-on experience with REST APIs and web services.
- Ability to re-use existing code and extend frameworks.

Administration and Management:
- Administer and manage OIM environments.
- Ensure the IAM platform is secure, scalable, and supports business requirements.
- Monitor the performance and health of IAM systems.

Security and Compliance:
- Develop and enforce IAM policies and procedures.
- Collaborate with security teams to address vulnerabilities.

Support and Troubleshooting:
- Support end-users with access-related issues and requests.
- Troubleshoot and resolve technical issues related to the OIM implementation.

Good to Have:
- Hands-on experience with Oracle Access Manager.
- Good understanding of AS400 and relevant infrastructure.
- Unix scripting.
- Strong SQL knowledge.

WHAT’S ON OFFER
You will be remunerated with an excellent base salary and attractive company benefits. Additionally, you will enjoy a fun and collaborative work environment alongside strong career progression.

To submit your application, please apply online or email your updated CV to swathi@aven-sys.com. Your interest will be treated with strict confidentiality.

CONSULTANT DETAILS:
Consultant Name: Swathi
Avensys Consulting Pte Ltd
EA Licence 12C5759

Privacy Statement: Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and Avensys' privacy policy.
Posted 1 month ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Reporting Data Engineer

Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and data products.

Your Key Responsibilities
As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights and in the operations and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include estimating, designing, developing and delivering data products and services. You will contribute your creative solutions and knowledge to our data platform, which ingests 2 TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and helps enable our teams to build a better working world.

As a reporting engineer on the MARS team, you will be expected to:
- Collaborate closely with the product manager to align activities to timelines and deadlines.
- Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery.
- Provide input to the MARS roadmap and actively participate in bringing it to life.
- Collaborate with the Intune engineering team to gain a clear understanding of the mobile device lifecycle and its relationship to Intune data and reporting.
- Serve as the last level of support for all MARS data reporting questions and issues.

Participate and contribute in the following activities:
- Customer discussions and requirement-gathering sessions
- Application reports (daily, weekly, monthly, quarterly, annually)
- Custom reporting for manual reports, dashboards, exports, APIs, and semantic models
- Customer service engagements
- Daily team meetings
- Work estimates and daily status
- Data and dashboard monitoring and troubleshooting
- Automation
- Data management and classification
- Maintaining design documentation for data schemas, data models, the data catalogue, and related products/services
- Monitoring and integrating a variety of data sources
- Maintaining and developing custom data quality tools

Skills and Attributes for Success

General Skills
- Analytical ability: strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions.
- Communication skills: excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required; additional languages are a plus.
- Interpersonal skills: strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds.
- Creative problem-solving: ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services.
- Self-starter mentality: a proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently.
- Documentation skills: clear and concise documentation, ensuring that all processes, solutions, and communications are well-documented for future reference.
- Organizational skills: the ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle.
- Cross-cultural awareness: awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
- User experience focus: passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.

To qualify for the role, you must have:
- At least three years of experience in the technologies and methodologies below.
- Hands-on experience with Microsoft Intune data and mobile device and application management data (MSFT APIs, Graph and IDW).
- Proven experience in mobile platform engineering or a related field.
- A strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment.
- Experience with Microsoft Intune, including mobile device and application management.
- Proficiency in supporting Modern Workplace tools and resources.
- Experience with iOS and Android operating systems.
- Proficiency in PowerShell scripting for automation and management tasks.
- The ability to operate proactively and independently in a fast-paced environment.
- A solution-oriented mindset, with the capability to design and implement creative mobile solutions and to suggest and implement solutions that meet EY’s requirements.
- The ability to work UK working hours.

Specific technology skills include the following:

Technical Skills
- Power BI: semantic models, advanced dashboards, Power BI templates
- Intune reporting and Intune data: compliance, device, policy management, metrics, monitoring
- Splunk data and reporting
- Sentinel data and reporting
- HR data and reporting
- Mobile Defender data and reporting
- AAD (Azure Active Directory)
- Data quality and data assurance
- Databricks
- Web analytics and mobile analytics
- Azure Data Factory, Azure Pipelines/Synapse, Azure SQL DB/Server, ADF automation
- Azure Kubernetes Service (AKS), Key Vault management, Azure Monitoring
- App Proxy and Azure Front Door data exports
- API development
- Python, SQL, KQL, Power Apps
- MSFT Intune APIs (Export, App Install)
- Virtual machines
- SharePoint: general operations
- Data modeling, ETL and related technologies

Ideally, you’ll also have:
- Strong communication skills to effectively liaise with various stakeholders.
- A proactive approach to suggesting and implementing new ideas.
- Familiarity with the latest trends in mobile technology.
- The ability to explain very technical topics to non-technical stakeholders.
- Experience in managing and supporting large mobile environments.
- Testing and quality assurance: ensuring our mobile platform meets quality, performance and security standards.
- Implementation of new products and/or service offerings.
- Experience working in a large global environment.
- XML data formats.
- Agile delivery.
- Object-oriented design and programming.
- Software development.
- Mobile.

What we look for: a person who demonstrates a commitment to integrity, initiative, collaboration and efficiency, with three or more years in the field of data analytics and Intune data reporting.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: you’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: we’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: we’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: you’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion).
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: minimum 4 years of Oracle Fusion experience
Education qualification: BE/BTech, MBA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
We are looking for passionate and skilled software developers ready to take on the next big challenge: building AI-powered agentic applications for Oracle Fusion ERP. As part of the Fusion Financial Technology team, you will contribute to the design and development of next-generation enterprise agents such as Ledger Agent and Payment Agent, which leverage generative AI and autonomous decision-making to simplify financial operations for our customers.

Fusion Applications on Cloud is the next-generation applications offering in the Oracle portfolio. The Fusion Financials team builds world-class financial products, common components and tools using cutting-edge technologies from the Oracle Fusion middleware tech stack: SOA, SOAP and REST services, BIP, ESS, and ADF. We are looking for highly talented, dynamic professionals to be part of the one-of-its-kind "Fusion" project. Come and join the most vibrant common core team in Fusion Financials and make a difference.

Responsibilities:
- Lead the design and development of AI-powered agentic applications such as Ledger Agent and Payment Agent within the Oracle Fusion ERP suite.
- Architect and build scalable, cloud-native financial services using Java, SOA, REST/SOAP, ADF, and Oracle middleware technologies.
- Drive the integration of generative AI techniques to enable intelligent decision-making in enterprise workflows.
- Define and implement microservices-based solutions using Docker, Kubernetes, and CI/CD pipelines (Jenkins, Git).
- Collaborate with product managers, architects, and cross-functional teams to align technical design with business goals.
- Mentor and guide junior developers, perform code reviews, and ensure adherence to coding and design best practices.
- Own complex problem-solving efforts and troubleshoot production issues with a focus on root-cause resolution.
- Stay ahead of technology trends, especially in AI/generative AI, and drive innovation into existing and new application components.
- Champion agile methodologies and continuous improvement in development processes, tooling, and team collaboration.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
0 years
0 Lacs
Telangana, India
On-site
Job Description
We are looking for passionate and skilled software developers ready to take on the next big challenge: building AI-powered agentic applications for Oracle Fusion ERP. As part of the Fusion Financial Technology team, you will contribute to the design and development of next-generation enterprise agents such as Ledger Agent and Payment Agent, which leverage generative AI and autonomous decision-making to simplify financial operations for our customers.

Fusion Applications on Cloud is the next-generation applications offering in the Oracle portfolio. The Fusion Financials team builds world-class financial products, common components and tools using cutting-edge technologies from the Oracle Fusion middleware tech stack: SOA, SOAP and REST services, BIP, ESS, and ADF. We are looking for highly talented, dynamic professionals to be part of the one-of-its-kind "Fusion" project. Come and join the most vibrant common core team in Fusion Financials and make a difference.

Responsibilities:
- Lead the design and development of AI-powered agentic applications such as Ledger Agent and Payment Agent within the Oracle Fusion ERP suite.
- Architect and build scalable, cloud-native financial services using Java, SOA, REST/SOAP, ADF, and Oracle middleware technologies.
- Drive the integration of generative AI techniques to enable intelligent decision-making in enterprise workflows.
- Define and implement microservices-based solutions using Docker, Kubernetes, and CI/CD pipelines (Jenkins, Git).
- Collaborate with product managers, architects, and cross-functional teams to align technical design with business goals.
- Mentor and guide junior developers, perform code reviews, and ensure adherence to coding and design best practices.
- Own complex problem-solving efforts and troubleshoot production issues with a focus on root-cause resolution.
- Stay ahead of technology trends, especially in AI/generative AI, and drive innovation into existing and new application components.
- Champion agile methodologies and continuous improvement in development processes, tooling, and team collaboration.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Hi {fullName},

There is an opportunity for an Azure Data Engineer (Databricks, PySpark, Python, SQL) in Hyderabad, with a walk-in interview on 24th May '25 between 9:30 AM and 12:30 PM. Please share the details below to mamidi.p@tcs.com with the subject line "Azure Data Engineer 24th May 25" if you are interested:

Email id:
Contact no:
Total experience:
Preferred location:
Current CTC:
Expected CTC:
Notice period:
Current organization:
Highest full-time qualification:
Highest qualification university:
Any gap in education or employment:
If yes, how many years and reason for gap:
Are you available for the walk-in interview on 24th May '25 (yes/no):

We will send you a mail by tomorrow night if you are shortlisted.

Role: Azure Data Engineer

Desired Competencies (Technical/Behavioral Competency)

Must-Have:
- Minimum 4+ years of development experience in Azure.
- Data warehouse / data lake development experience.
- Azure Data Factory (ADF) and Azure SQL DB experience.
- Azure Databricks experience using Python, Spark, or Scala.
- Nice to have: data modelling and Azure Synapse experience.
- Passion for data quality, with the ability to integrate these capabilities into the deliverables.
- Prior use of big data components and the ability to rationalize and align their fit for a business case.
- Experience working with different data sources: flat files, XML, JSON, Avro files and databases.
- Experience in developing implementation plans and schedules and preparing documentation for jobs according to the business requirements.
- Proven experience and ability to work with people across the organization, skilled at managing cross-functional relationships and communicating with leadership across multiple organizations.
- Proven capability for strong written and oral communication, with the ability to synthesize, simplify and explain complex problems to different audiences.

Good-to-Have:
- Azure Data Engineer certifications.

Roles & Responsibilities:
- Ability to integrate into a project team environment and contribute to project planning activities.
- Lead ambiguous and complex situations to clear, measurable plans.
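The "different data sources" requirement above (flat files, XML, JSON) usually comes down to normalizing each format into one record schema before loading. A minimal stdlib-only sketch, with a hypothetical `id`/`city` schema:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize(source: str, payload: str) -> list[dict]:
    """Normalize records from CSV, JSON, or XML into one common schema."""
    if source == "csv":
        return [{"id": int(r["id"]), "city": r["city"]}
                for r in csv.DictReader(io.StringIO(payload))]
    if source == "json":
        return [{"id": int(r["id"]), "city": r["city"]}
                for r in json.loads(payload)]
    if source == "xml":
        root = ET.fromstring(payload)
        return [{"id": int(e.findtext("id")), "city": e.findtext("city")}
                for e in root.findall("record")]
    raise ValueError(f"unsupported source: {source}")

rows = (normalize("csv", "id,city\n1,Hyderabad\n")
        + normalize("json", '[{"id": 2, "city": "Pune"}]')
        + normalize("xml", "<data><record><id>3</id><city>Kochi</city>"
                           "</record></data>"))
print(rows)
```

In an Azure pipeline, the same per-format branching would typically live in a Databricks reader step, with Avro handled by a library reader rather than the stdlib.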
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Senior Associate

Job Description & Summary
We are seeking a talented and experienced Azure Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining data pipelines and solutions on the Microsoft Azure platform. You will play a pivotal role in ensuring the smooth operation and efficiency of our data infrastructure, enabling data-driven decision making across the organization.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Design, develop, and implement robust and scalable data pipelines using Azure services such as Azure Data Factory, Azure Databricks, Azure Cosmos DB, Azure SQL Database, and other relevant services.
- Develop and maintain data models, schemas, and data quality processes.
- Collaborate with stakeholders to define and understand data requirements.
- Develop and implement data governance policies and procedures.
- Troubleshoot and resolve data-related issues and performance bottlenecks.
- Stay up to date on the latest Azure technologies and best practices.
- Contribute to the development and maintenance of documentation and technical standards.
- Participate in code reviews and provide technical guidance to team members.

Mandatory Skill Sets: Spark framework, PySpark, Azure Databricks, Azure SQL Database, ADF, Storage Account, Azure Data Explorer
Preferred Skill Sets: Trino, Apache Airflow
Years of Experience Required: 4 to 8 years
Education Qualification: Bachelor's degree in computer science, engineering, or a related field.

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure Databricks, Microsoft Azure Data Explorer, PySpark
Optional Skills: Apache Web Server
Desired Languages (if blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
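The "data quality processes" responsibility above is often implemented as rule-based checks that run after each pipeline load. A minimal plain-Python sketch (the rules and field names are hypothetical; in Azure these checks would typically run inside a Databricks or ADF step):

```python
def quality_report(rows: list[dict]) -> dict:
    """Count rule violations: missing keys, duplicate keys, negative amounts."""
    issues = {"missing_id": 0, "duplicate_id": 0, "negative_amount": 0}
    seen = set()
    for row in rows:
        rid = row.get("id")
        if rid is None:
            issues["missing_id"] += 1
            continue  # remaining checks need a valid key
        if rid in seen:
            issues["duplicate_id"] += 1
        seen.add(rid)
        if row.get("amount", 0) < 0:
            issues["negative_amount"] += 1
    return issues

report = quality_report([
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},     # duplicate key
    {"id": None, "amount": 2.0},  # missing key
    {"id": 3, "amount": -4.0},    # negative measure
])
print(report)  # {'missing_id': 1, 'duplicate_id': 1, 'negative_amount': 1}
```

A production version would usually emit these counts as pipeline metrics and fail the run when a threshold is exceeded.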
Posted 1 month ago
6 - 11 years
6 - 16 Lacs
Bhopal, Hyderabad, Pune
Hybrid
Urgent opening for a Sr. / Lead Azure Data position!
Greetings from NewVision Software!

Experience: minimum 6 years
CTC: as per company norms
Notice period: maximum 15 days
Skills required: ADF, Databricks, SQL, Python

Job Description

Position Summary:
We are seeking a talented Sr. / Lead Data Engineer with a strong background in data engineering to join our team. You will play a key role in designing, building, and maintaining data pipelines using a variety of technologies, with a focus on the Microsoft Azure cloud platform.

Responsibilities:
- Design, develop, and implement data pipelines using Azure Data Factory (ADF) or other orchestration tools.
- Write efficient SQL queries to extract, transform, and load (ETL) data from various sources into Azure Synapse Analytics.
- Utilize PySpark and Python for complex data processing tasks on large datasets within Azure Databricks.
- Collaborate with data analysts to understand data requirements and ensure data quality.
- Hands-on design and development of data lakes and warehouses.
- Implement data governance practices to ensure data security and compliance.
- Monitor and maintain data pipelines for optimal performance and troubleshoot any issues.
- Develop and maintain unit tests for data pipeline code.
- Work collaboratively with other engineers and data professionals in an Agile development environment.

Preferred Skills & Experience:
- Good knowledge of PySpark and working knowledge of Python.
- Full-stack Azure data engineering skills (Azure Data Factory, Databricks and Synapse Analytics).
- Experience with large dataset handling.
- Hands-on experience in designing and developing data lakes and warehouses.
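Unit-testing data pipeline code, as the responsibilities above call for, is easiest when each transformation is a pure function that can be checked without a live pipeline run. A minimal sketch (the function and its fields are hypothetical):

```python
def dedupe_latest(rows: list[dict]) -> list[dict]:
    """Keep only the newest row per key, a common warehouse-staging step."""
    latest: dict[int, dict] = {}
    for row in rows:
        key = row["id"]
        # ISO-format date strings compare correctly as plain strings.
        if key not in latest or row["updated"] > latest[key]["updated"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["id"])

# Unit-test style check, runnable with plain assert or under pytest.
rows = [
    {"id": 1, "updated": "2025-01-01", "value": "old"},
    {"id": 1, "updated": "2025-02-01", "value": "new"},
    {"id": 2, "updated": "2025-01-15", "value": "only"},
]
result = dedupe_latest(rows)
print([r["value"] for r in result])  # ['new', 'only']
```

The same function body could be applied per-partition in PySpark, while the unit test continues to exercise it on small in-memory fixtures.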
Posted 1 month ago
2 - 3 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title - Data Engineer Sr. Analyst ACS SONG
Management Level: Level 10 Sr. Analyst
Location: Kochi, Coimbatore, Trivandrum
Must have skills: Python/Scala, PySpark/PyTorch
Good to have skills: Redshift
Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries.
Roles and Responsibilities:
Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals
Solving complex data problems to deliver insights that help the business achieve its goals
Sourcing data (structured and unstructured) from various touchpoints, and formatting and organizing it into an analyzable format
Creating data products for analytics team members to improve productivity
Calling AI services such as vision and translation to generate outcomes that can be used in further steps along the pipeline
Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions
Preparing data to create a unified database and building tracking solutions that ensure data quality
Creating production-grade analytical assets deployed using the guiding principles of CI/CD
Professional and Technical Skills:
Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript
Extensive experience in data analysis in big data (Apache Spark) environments, data libraries (e.g., Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience working with these technologies
Experience in one of the many BI tools such as Tableau, Power BI, or Looker
Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and the corresponding infrastructure needs
Worked extensively with Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and the Snowflake cloud data warehouse.
Additional Information:
Experience working in cloud data warehouses like Redshift or Synapse
Certification in any one of the following or equivalent:
AWS: AWS Certified Data Analytics - Specialty
Azure: Microsoft Certified Azure Data Scientist Associate
Snowflake: SnowPro Core - Data Engineer
Databricks: Data Engineering
About Our Company | Accenture
Experience: 3.5-5 years of experience is required
Educational Qualification: Graduation
Posted 1 month ago
6 - 10 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena and there's nothing that can stop us from growing together. We are hiring for Azure Data Engineer/ADF Developer. We are delighted to invite you for a discussion to get to know more about you and your professional experience. The interview will be in person.
Venue details
Date: 24-May-2025
Registration Time: 9.30 AM to 12.30 PM
Location: TCS, Assotech Business Cresterra, Yamuna Tower, VI, Plot No. 22, Noida-Greater Noida Expy, Sector 135, Noida-201301
Job Description
Role: Azure Data Engineer/ADF Developer
Experience: 6 to 10 years
Location: Noida
Required Technical Skill Set: Azure Data Factory
Mandatory Technical Skill Set:
· Azure Data Factory
· SQL - Advanced
· SSIS
· Databricks
· SDFS
· Azure Data Factory (ADF) pipelines and PolyBase
· Good knowledge of Azure Storage, including Blob Storage, Data Lake, Azure SQL, Data Factory V2, and Databricks
· Can work on streaming analytics and various data services in Azure, such as Data Flow
· Ability to develop extensible, testable, and maintainable code
· Good understanding of the challenges of enterprise software development
· Track record of delivering high-volume, low-latency distributed software solutions
· Experience working in Agile teams
· Experience of the full software development lifecycle, including analysis, design, implementation, testing, and support
· Experience mentoring more junior developers and directing/organizing the work of a team
Good-to-have skills:
· Experience with data warehouse applications
· Experience in TTH domain projects
· Knowledge of Azure DevOps is desirable
· Knowledge of CI/CD and DevOps practices is an advantage
Desired Competencies (Technical/Behavioral Competency)
Must-Have:
· Good know-how of SDFS, Azure Data Factory (ADF) pipelines, and PolyBase
· Good knowledge of Azure Storage, including Blob Storage, Data Lake, Azure SQL, Data Factory V2, and Databricks
· Can work on streaming analytics and various data services in Azure, such as Data Flow
· Client-facing technical role; assertive, with strong team-member skills
· Good communication skills, both written and spoken
Good-to-Have:
· Experience with data warehouse applications
· Experience in TTH domain projects
· Knowledge of Azure DevOps is desirable
· Knowledge of CI/CD and DevOps practices is an advantage
Responsibility of / Expectations from the Role:
Ensure the accuracy of deliverables through quality assurance practices
6+ years of experience in related fields with a strong development background using Azure data engineering and Azure PaaS services.
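The "SQL - Advanced" requirement above usually means comfort with window functions and subqueries. A minimal sketch of the idea, with SQLite standing in for Azure SQL and a schema invented for illustration:

```python
import sqlite3

# Rank each employee's salary within their department and keep the top
# earner per department - a typical window-function exercise.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER);
INSERT INTO emp VALUES
  ('Asha', 'data', 90), ('Ravi', 'data', 70),
  ('Meera', 'infra', 80), ('Karan', 'infra', 95);
""")
top_per_dept = conn.execute("""
    SELECT name, dept FROM (
        SELECT name, dept,
               RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
        FROM emp
    ) WHERE rnk = 1
    ORDER BY dept
""").fetchall()
```

The same `RANK() OVER (PARTITION BY ...)` syntax works on SQL Server, Synapse, and Databricks SQL.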
Posted 1 month ago
8 - 10 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 8-10 years
Location: Pune, Mumbai, Bangalore, Noida, Chennai, Coimbatore, Hyderabad
JD: Databricks with Data Scientist experience
· 4 years of relevant work experience as a data scientist
· Minimum 2 years of experience in Azure Cloud using Databricks services, PySpark, Natural Language API, and MLflow
· Experience designing and building statistical forecasting models
· Experience ingesting data from APIs and databases
· Experience in data transformation using PySpark and SQL
· Experience designing and building machine learning models
· Experience designing and building optimization models, including expertise with statistical data analysis
· Experience articulating and translating business questions and using statistical techniques to arrive at an answer using available data
· Demonstrated skills in selecting the right statistical tools given a data analysis problem; effective written and verbal communication skills
· Skillset: Python, PySpark, Databricks, MLflow, ADF
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role Overview
We are seeking a highly skilled Sr. ADF Data Engineer with strong expertise in Azure Data Factory (ADF), data warehousing, and SQL. The ideal candidate will bring robust ETL experience, a deep understanding of data warehousing concepts, and a knack for database performance optimization. A proactive, problem-solving mindset and excellent client communication skills are critical to succeed in this role.
Role Responsibilities
Design, develop, and maintain ETL processes using Azure Data Factory
Collaborate with cross-functional teams to gather requirements and deliver efficient data solutions
Optimize and tune data workflows for performance and scalability
Create and maintain comprehensive documentation for all data engineering processes
Assist in troubleshooting and resolving data-related issues
Ensure data quality and integrity across various data sources
Required Technical Skill Set
ADF (Azure Data Factory): Strong hands-on experience with pipelines, data flows, and integrations
SQL: Expertise in writing and optimizing queries, stored procedures, and packages
ETL: Comprehensive experience in Extract, Transform, Load processes across multiple sources (SQL Server, Synapse, etc.)
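"Ensure data quality and integrity" in practice usually means automated checks after each load. A minimal, stdlib-only sketch of such checks; the field names and rules are invented for illustration, and a real pipeline would enforce equivalent rules in ADF data flows or post-load stored procedures:

```python
def quality_report(rows, required, types):
    """Run simple post-load data-quality checks: flag missing
    required fields and values of the wrong type."""
    issues = []
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append((i, field, "missing"))
        for field, expected in types.items():
            value = row.get(field)
            if value is not None and not isinstance(value, expected):
                issues.append((i, field, "bad type"))
    return issues

rows = [
    {"id": 1, "amount": 10.5},
    {"id": None, "amount": 3.0},
    {"id": 3, "amount": "oops"},
]
issues = quality_report(rows, required=["id"], types={"amount": float})
```

Each issue records the row index, the offending field, and the rule it broke, which is enough to route bad rows to a quarantine table.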
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
Remote
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hire Data Warehouse Professionals in the following areas : Job Description Senior Data Engineer As a Senior Data Engineer, you will support the European World Area using the Windows & Azure suite of Analytics & Data platforms. The focus of the role is on the technical aspects and implementation of data gathering, integration and database design. We look forward to seeing your application! In This Role, Your Responsibilities Will Be Data Ingestion and Integration: Collaborate with Product Owners and analysts to understand data requirements & design, develop, and maintain data pipelines for ingesting, transforming, and integrating data from various sources into Azure Data Services. Migration of existing ETL packages: Migrate existing SSIS packages to Synapse pipelines Data Modelling: Assist in designing and implementing data models, data warehouses, and databases in Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Data Transformation: Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or other relevant tools to prepare data for analysis and reporting. Data Quality and Governance: Implement data quality checks and data governance practices to ensure the accuracy, consistency, and security of data assets. Monitoring and Optimization: Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency. 
Documentation: Maintain comprehensive documentation of processes, including data lineage, data dictionaries, and pipeline schedules. Collaboration: Work closely with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data needs and deliver solutions accordingly. Azure Services: Stay updated on Azure data services and best practices to recommend and implement improvements in our data architecture and processes. For This Role, You Will Need 3-5 years of experience in Data Warehousing with On-Premises or Cloud technologies Strong practical experience of Synapse pipelines / ADF. Strong practical experience of developing ETL packages using SSIS. Strong practical experience with T-SQL or any variant from other RDBMS. Graduate degree educated in computer science or a relevant subject. Strong analytical and problem-solving skills. Strong communication skills in dealing with internal customers from a range of functional areas. Willingness to work flexible working hours according to project requirements. Technical documentation skills. Fluent in English. Preferred Qualifications That Set You Apart Oracle PL/SQL. Experience in working on Azure Services like Azure Synapse Analytics, Azure Data Lake. Working experience with Azure DevOps paired with knowledge of Agile and/or Scrum methods of delivery. Languages: French, Italian, or Spanish would be an advantage. Agile certification. Who You Are You understand the importance and interdependence of internal customer relationships. You seek out experts and innovators to learn about the impact emerging technologies might have on your business. You focus on priorities and set stretch goals. Our Offer to You We understand the importance of work-life balance and are dedicated to supporting our employees' personal and professional needs. 
From competitive benefits plans and comprehensive medical care to equitable opportunities for growth and development we strive to create a workplace that is supportive and rewarding. Depending on location, our flexible work from home policy allows you to make the best of your time, by combining quiet home office days with collaborative experiences in the office so that you can personalize your work-life mix. Moreover, our global volunteer employee resource groups will empower you to connect with peers that share the same interest, promote diversity and inclusion, and positively contribute to communities around us. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles Flexible work arrangements, Free spirit, and emotional positivity Agile self-determination, trust, transparency, and open collaboration All Support needed for the realization of business goals, Stable employment with a great atmosphere and ethical corporate culture Show more Show less
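The SSIS-to-Synapse migration this posting describes typically preserves the incremental-load pattern the old packages used: pull only rows modified since the last successful run, then advance a watermark. A stripped-down sketch in plain Python (the record shapes are invented; Synapse pipelines express the same idea with a watermark table and a lookup activity):

```python
def incremental_extract(source_rows, watermark):
    """Watermark-based incremental extract: select rows modified after
    the last successful load and compute the new high-water mark."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified": 10},
    {"id": 2, "modified": 25},
    {"id": 3, "modified": 40},
]
batch, wm = incremental_extract(source, watermark=20)
```

Persisting `wm` only after the batch loads successfully makes the step safe to re-run.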
Posted 1 month ago
0 years
0 Lacs
Greater Hyderabad Area
On-site
Centroid is looking for a member to join our team. As a member, you will be coordinating with clients. An ideal candidate should possess knowledge of various technologies related to Oracle Applications, with solid practical experience on the technical path. You will be responsible for working along with the internal team members in the India office.
Job Description:
Min 10 years of experience in Oracle Apps technical roles
Creating test plans, test cases, and test scripts, and performing functional testing
Work closely with various business partners to deliver high-quality application solutions
Write detailed technical design documents
Must have upgrade experience
Interacting with Oracle via their formal Metalink SR process in order to secure assistance and solutions for problems
Conduct and/or participate in requirement/analysis sessions
Must be able to work with third-party systems and perform modifications, including E-Business Suite changes and the maintenance of various application interfaces
Should have EBS upgrade knowledge as well as CEMLI object development, including integration and conversion experience in Finance, Manufacturing, Supply Chain, and HR
Ability to work independently
Required Skillset:
Must have strong technical experience in SQL, PL/SQL, OTBI/BI and XML Publisher reports, Workflow, Unix, and Oracle Applications Framework (OAF and ADF)
SOAP & REST APIs; data conversion using FBDI
Development using Sandbox, FRS reporting, integrations with third parties, Security Console management/SSO
Effective team player with excellent organizational and communication skills (written and verbal)
Must be able to work independently with business users and external users, and be responsible for design, development, testing support, production deployment, and production support
Preferred Skillset:
A Java skillset is an advantage
Candidates with a 1-month notice period are preferred.
Candidates willing to join immediately are preferred.
Education Requirement: Bachelor's degree in Computer Science, Business Information Systems, or Computer Information Systems, or equivalent work experience
Posted 1 month ago
0 years
0 Lacs
Thiruvananthapuram, Kerala, India
On-site
What you'll do...
Manual and automation testing
Handle calls with onsite counterparts
Attend scrum ceremonies
Support production during the peak season
Coordinate with business on quality
Collaborate with integrated system teams
What you'll bring to the team...
Strong knowledge of database testing along with Playwright automation
Good knowledge of the Agile framework
Good communication skills
Must-have skills: Playwright, SQL, ADF, Azure DevOps
Good-to-have skills: Selenium, Agile framework
Why work for us
At H&R Block, we understand that passion and creativity are the keys to true innovation. We provide an environment that is both fun and challenging, allowing our employees to grow and reach their full potential. We are a Great Place to Work certified company, among the top 10 best workplaces for women in India, and the No. 1 best workplace in India among mid-size companies (certified by GPTW, Jul 2024). H&R Block has always remained committed to being guided by a mission that is understood, embraced, and pursued by the entire organization: to help our clients achieve their financial objectives by serving as their tax and financial partner. By combining the knowledge of highly trained professionals with cognitive computing technology, we're offering our clients the most personalized tax experience ever. H&R Block offers both retail tax and online tax services. The Global Technology Center of H&R Block India came into existence in October 2017 in Trivandrum, Kerala. We started as a small company with 5 employees, and four years down the line we are a proud 1000+ member workforce innovating and reinventing the software industry culture. Follow our LinkedIn page for the latest updates and news.
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary: The Metrics Insights and Analytics team is responsible for building dashboards and analytical solutions using AI/ML based on requirements from the business. It provides predictive and prescriptive analytics using various delivery execution parameters, gives actionable insights to users, and automates processes using new-age machine learning algorithms.
Key Roles and Responsibilities:
Conceptualize, maintain, and automate dashboards as per requirements
Automate existing processes to improve productivity and time to market
Enable decision making and action-plan identification through metrics analytics
Conduct trainings and presentations
Connect with various stakeholders to understand business problems and provide solutions
Bring new-age solutions and techniques into the way of working
Skills:
Minimum 3-7 years of work experience with Power BI dashboards/Tableau and Python
Minimum 3-7 years of work experience in AI/ML development
Strong analytical skills, adept in solutioning and problem solving, inclination towards numbers
Experience working on text analytics and NLP
Experienced in data cleansing, pre-processing, and exploratory data analysis
Knowledge of Azure ADF, Excel macros, and RPA will be an advantage
Able to perform feature engineering, normalize data, and build correlation maps
Proficient in SQL
Hands-on experience in model operationalization and pipeline management
Capable of working with global teams
Good presentation and training skills
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world.
Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen & Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit https://www.ltimindtree.com/.
DEI Statement: LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability, or any other characteristic protected by applicable law.
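The data cleansing, pre-processing, and "normalize data" skills this posting lists can be illustrated with two small stdlib-only helpers; the sample inputs below are invented for illustration:

```python
import re

def min_max_normalize(values):
    """Scale numeric features to [0, 1]; a guard handles the
    degenerate constant-column case."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def clean_text(text):
    """Basic text cleansing for analytics: lowercase, strip
    punctuation, collapse whitespace."""
    text = re.sub(r"[^a-z0-9\s]", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

scaled = min_max_normalize([10, 20, 30])
tokens = clean_text("  Great product!!  5/5, would buy AGAIN. ").split()
```

Real pipelines would lean on Pandas or PySpark for the same steps, but the transformations themselves are this simple at heart.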
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: PySpark Developer
Location: Hyderabad/Chennai
Duration: Full time
Job Description:
Key Skills: PySpark, Spark, Python, Hive, SQL
- 6+ years' experience as a Data Engineer, with experience building ETL/data pipelines with cloud technologies
- 4+ years' extensive experience in designing, developing, and implementing scalable data pipelines using Databricks to ingest, transform, and store structured and unstructured data
- 4+ years' experience with programming languages such as Python/PySpark and query languages like SQL
- 4+ years' experience in building metadata-driven data ingestion pipelines using ADF
- 4+ years' experience in analysing, optimizing, and tuning existing data pipelines for performance, reliability, and efficiency
- 2+ years' experience in implementing MLOps practices to streamline the deployment and management of machine learning models
- 2+ years' experience in utilizing Apache Airflow for job orchestration and workflow management
- Familiarity with CI/CD tools and practices for automating the deployment of data engineering solutions
- Experience in collaborating with data scientists, analysts, and other stakeholders to understand business requirements and translate them into technical solutions
- Knowledge of and experience in implementing security measures and standard processes to ensure data privacy and compliance with regulatory standards
- In-depth knowledge of data engineering concepts, ETL processes, and data architecture principles
Thanks, Siva
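"Metadata-driven data ingestion" above means one generic pipeline whose behaviour is steered by per-source config records rather than per-source code. A stripped-down, stdlib-only sketch of the idea; in ADF this would be a parameterised pipeline iterating over a control table, and the config entries and payloads here are invented:

```python
import json

# Each source is described by a config record; one generic routine
# dispatches on the metadata instead of hard-coding each source.
PIPELINE_CONFIG = [
    {"name": "orders",    "format": "csv",  "delimiter": ","},
    {"name": "customers", "format": "json", "delimiter": None},
]

def ingest(entry, payload):
    """Parse a raw payload according to its config entry."""
    if entry["format"] == "csv":
        return [line.split(entry["delimiter"]) for line in payload.splitlines()]
    if entry["format"] == "json":
        return json.loads(payload)
    raise ValueError(f"unsupported format: {entry['format']}")

orders = ingest(PIPELINE_CONFIG[0], "1,99.5\n2,45.0")
customers = ingest(PIPELINE_CONFIG[1], '[{"id": 1, "name": "Asha"}]')
```

Adding a new source then means adding a config row, not writing a new pipeline.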
Posted 1 month ago
2 - 8 years
0 Lacs
Bengaluru, Karnataka
Work from Office
Experience: 8-10 years
Location: Pune, Mumbai, Bangalore, Noida, Chennai, Coimbatore, Hyderabad
JD:
· 4 years of relevant work experience as a data scientist
· Minimum 2 years of experience in Azure Cloud using Databricks services, PySpark, Natural Language API, and MLflow
· Experience designing and building statistical forecasting models
· Experience ingesting data from APIs and databases
· Experience in data transformation using PySpark and SQL
· Experience designing and building machine learning models
· Experience designing and building optimization models, including expertise with statistical data analysis
· Experience articulating and translating business questions and using statistical techniques to arrive at an answer using available data
· Demonstrated skills in selecting the right statistical tools given a data analysis problem; effective written and verbal communication skills
· Skillset: Python, PySpark, Databricks, MLflow, ADF
Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Fixed shift
Application Question(s):
How soon can you join?
What is your current CTC?
What is your expected CTC?
What is your current location?
How many years of experience do you have in Databricks?
How many years of experience do you have in Python and PySpark?
How many years of experience do you have as a Data Scientist?
Experience: total: 8 years (Required)
Work Location: In person
Posted 1 month ago
3 years
0 Lacs
Pune, Maharashtra, India
On-site
Exp: 6-14 yrs
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Python, PySpark, Azure Data Factory, Snowflake, Snowpipe, SnowSQL, Snowsight, Snowpark, ETL, and SQL. SnowPro certification is a plus. Architect experience mandatory.
Primary Roles and Responsibilities:
Developing modern data warehouse solutions using Snowflake, Databricks, and ADF
Ability to provide forward-thinking solutions in the data engineering and analytics space
Collaborate with DW/BI leads to understand new ETL pipeline development requirements
Triage issues to find gaps in existing pipelines and fix them
Work with the business to understand reporting-layer needs and develop data models to fulfill them
Help junior team members resolve issues and technical challenges
Drive technical discussions with client architects and team members
Orchestrate the data pipelines via the Airflow scheduler
Skills and Qualifications:
Bachelor's and/or master's degree in computer science or equivalent experience
Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects
Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors
Deep understanding of star and snowflake dimensional modeling
Strong knowledge of data management principles
Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
Should have hands-on experience in SQL and Spark (PySpark)
Experience in building ETL/data warehouse transformation processes
Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
Experience working with structured and unstructured data, including imaging and geospatial data
Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting, and query optimization
Databricks Certified Data Engineer Associate/Professional certification (desirable)
Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
Should have experience working in Agile methodology
Strong verbal and written communication skills
Strong analytical and problem-solving skills with high attention to detail
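The "star and snowflake dimensional modeling" asked for above boils down to fact tables joined to conformed dimensions. A toy star-schema query, with SQLite standing in for a Snowflake/Databricks warehouse and an invented schema:

```python
import sqlite3

# One fact table keyed to one dimension: the minimal star schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO fact_sales VALUES (1, 12.0), (1, 8.0), (2, 30.0);
""")
# Aggregate facts by a dimension attribute - the canonical BI query shape.
by_category = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
```

A snowflake schema further normalizes the dimensions into sub-dimensions; the query pattern stays the same with extra joins.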
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Reporting Data Engineer Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and develop data products. Your Key Responsibilities As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights, the operations and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include estimating, designing, developing and delivery of data products and services. You will contribute your creative solutions and knowledge to our data platform which features 2TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and help enable our teams to build a better working world. As reporting engineer with the MARS team, the following activities are expected: Collaborate closely with the product manager to align activities to timelines and deadlines Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery Provide input to the MARS roadmap and actively participate to bring it to life Collaborate with the Intune engineering team to get a clear understanding of the mobile device lifecycle and the relationship to Intune data and reporting Serve as the last level of support for all MARS data reporting questions and issues. 
Participate and contribute in the below activities: Customer discussions and requirement gathering sessions Application reports (daily, weekly, monthly, quarterly, annually) Custom reporting for manual reports, dashboards, exports, APIs, and semantic models Customer Service engagements Daily team meetings Work estimates and daily status Data & Dashboard monitoring & troubleshooting Automation Data management and classification Maintaining design documentation for Data schema, data models, data catalogue, and related products/services. Monitoring and integrating a variety of data sources Maintain and develop custom data quality tools General Skills Skills and attributes for success Analytical Ability: Strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions. Communication Skills: Excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required, with additional languages being a plus. Interpersonal Skills: Strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds. Creative Problem-Solving: Ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services. Self-Starter Mentality: A proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently. Documentation Skills: Clear and concise documentation skills, ensuring that all processes, solutions, and communications are well-documented for future reference. Organizational skills: The ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle. 
Cross-Cultural Awareness: Awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
User Experience Focus: Passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.
To qualify for the role, you must have at least three years of experience in the following technologies and methodologies:
Hands-on experience with Microsoft Intune data and mobile device and application management data (MSFT APIs, Graph, and IDW)
Proven experience in mobile platform engineering or a related field
Strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment
Experience with Microsoft Intune, including mobile device and application management
Proficient in supporting Modern Workplace tools and resources
Experience with iOS and Android operating systems
Proficient in PowerShell scripting for automation and management tasks
Ability to operate proactively and independently in a fast-paced environment
Solution oriented mindset with the capability to design and implement creative Mobile solutions and the ability to suggest and implement solutions that meet EY’s requirements Ability to work in UK working hours Specific technology skills include the following: Technical Skills Power BI - semantic models, Advanced Dashboards Power Bi Templates Intune Reporting and Intune Data Intune Compliance Intune Device Intune Policy management Intune Metrics Intune Monitoring SPLUNK data and reporting Sentinel data and reporting HR data and reporting Mobile Defender data and reporting AAD-Active Directory Data quality & data assurance Data Bricks Web Analytics Mobile Analytics Azure Data Factory Azure pipelines/synapses Azure SQL DB/Server ADF Automation Azure Kubernetes Service (KaaS) Key Vault management Azure Monitoring App Proxy & Azure Front Door data exports API Development Python, SQL, KQL, Power Apps MSFT Intune APIs, (Export, App Install) Virtual Machines SharePoint - General operations Data modeling ETL and related technologies Ideally, you’ll also have the following: Strong communication skills to effectively liaise with various stakeholders. A proactive approach to suggesting and implementing new ideas. Familiarity with the latest trends in mobile technology. Ability to explain very technical topics to non-technical stakeholders Experience in managing and supporting large mobile environments. Testing and Quality Assurance – ensure our mobile platform meets quality, performance and security standards. Implementation of new products and/or service offerings. Experience with working in a large global environment XML data formats Agile delivery Object-oriented design and programming Software development Mobile What we look for: A person that demonstrates a commitment to integrity, initiative, collaboration, efficiency and three or more years in the field of data analytics, and Intune data reporting. 
What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today. Show more Show less
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Data Engineer

Department: Engineering
Type of Position: Full Time

About us: Arrise Solutions India Pvt. Ltd. is a leading content provider to the iGaming and Betting industry, offering a multi-product portfolio that is innovative, regulated and mobile-focused. We strive to create the most engaging and evocative experience for customers globally across a range of products, including slots, live casino, sports betting, virtual sports and bingo.

We are seeking a talented and experienced Data Engineer who will work in a global team of Data Scientists, delivering their pipelines into production as efficiently as possible. You will also implement monitoring and alerting systems. The ideal candidate has in-depth knowledge of and experience with Python, Azure/AWS, data storage, and data pipelines.

Key Responsibilities
- Create and manage ETL workflows using Python and relevant libraries (e.g., Pandas, NumPy) for high-volume data processing.
- Monitor and optimize data workflows to reduce latency, maximize throughput, and ensure high-quality data availability.
- Develop REST API integrations and Python scripts to automate data exchanges with internal systems and BI dashboards.
- Implement validation processes and address anomalies or performance bottlenecks in real time.
- Design, develop and maintain data engineering pipelines for machine learning projects, ensuring high reliability and scalability.
- Collaborate with cross-functional teams across Data Science and Engineering to solve complex problems.
- Automate existing workflows within the Data Science team.

Required Skills and Qualifications
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Advanced Python proficiency with data libraries (Pandas, NumPy, etc.).
- Deep understanding of ETL, reporting, cloud (Azure), and data science technologies.
- 3–5 years of professional experience in data engineering, ETL development, or similar roles.
- Experience with Azure Data Factory, Databricks, Azure Data Lake and Azure SQL Server, including configuration and deployment of ADF packages.
- Experience working with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL solutions (e.g., MongoDB).
- Experience with version control (Git) and continuous integration practices.
- Prior experience handling very large datasets across different business functions.
- Excellent problem-solving, analytical, and communication skills.

Preferred Qualifications
- Extensive experience with the Azure ecosystem, particularly Azure data engineering and machine learning.
- Experience developing computer vision, text, audio, and/or tabular data models.
- Strong proficiency in GitLab CI, Jenkins, Grafana, Docker.
- Excellent software engineering skills in API design and development, and concurrency design.

If you are a skilled Data Engineer with a passion for working in a fast-paced environment, an eye for detail, and a readiness to experiment, we encourage you to apply and be part of our dynamic and innovative team and organization.

What We Offer
- Competitive compensation depending on experience
- Opportunities for professional and personal development
- Opportunities to progress within a dynamic team
- Close and collaborative colleagues
- Comprehensive health coverage

Our Values
- PERSISTENCE: We never give up and are determined to be the best at what we do.
- RESPECT: We value and respect our clients, their players, and our team members, promoting professionalism, integrity and fairness without compromise.
- OWNERSHIP: We take ownership of our work and consistently deliver in a reliable manner, always providing the highest level of quality.
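As an illustration of the first responsibility above — an ETL workflow with validation — here is a minimal, hedged sketch using only the Python standard library (the posting mentions Pandas/NumPy, but the extract–validate–load pattern is the same; every file, table, and column name here is hypothetical):

```python
import csv
import io
import sqlite3

# Extract: read raw rows (an in-memory CSV stands in for a real source file).
RAW = "device_id,events\nA1,10\nA2,\nA3,7\n"
rows = list(csv.DictReader(io.StringIO(RAW)))

# Transform + validate: coerce types, route bad records to a reject list.
clean, rejected = [], []
for r in rows:
    try:
        clean.append((r["device_id"], int(r["events"])))
    except (ValueError, TypeError):
        rejected.append(r)  # e.g. missing or non-numeric 'events' value

# Load: upsert into a SQLite table standing in for the warehouse target.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (device_id TEXT PRIMARY KEY, events INTEGER)")
con.executemany("INSERT OR REPLACE INTO events VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(events) FROM events").fetchone()[0]
print(total, len(rejected))  # 17 1
```

In a production pipeline the reject list would typically be persisted for review rather than silently dropped, which is what "address anomalies in real time" implies.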
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Position

The Business Analyst - Operations Performance is part of the Technical Services organization within the Chevron ENGINE Center and is responsible for delivering data solutions that meet the needs of Chevron's Asset Management, Production Accounting and Operational workflows. This role oversees the development of data products from start to finish to ensure they meet customer expectations, serving as the face of the Data team and the voice of the customer for the development teams.

Key Responsibilities

Product Vision and Strategy
- Develop and communicate a clear product vision and roadmap.
- Define product goals and key performance indicators (KPIs).
- Translate business objectives into actionable product requirements.

Product Backlog Management
- Create and maintain a prioritized product backlog.
- Define user stories, epics, and acceptance criteria for backlog items.
- Prioritize features based on business value and customer needs.

Data Product Development
- Ensure alignment with Chevron's architectural guidelines by leveraging architecture guidance.
- Ensure the team adheres to the best-practice data product development lifecycle.

Collaboration and Communication
- Facilitate communication between the development team, stakeholders, and customers.
- Conduct user research and gather customer feedback.
- Present product updates and the roadmap to stakeholders.

Agile Development Process
- Participate in sprint planning, reviews, and retrospectives.
- Collaborate with the development team to ensure sprint goals are met.
- Make timely decisions to address issues and adapt to changing priorities.

Quality Assurance
- Ensure product quality by reviewing deliverables against acceptance criteria.
- Identify and address potential risks and issues.

Learning & Development Opportunities
- Exposure to functional workflows in Operations, Production Engineering, and Facilities Engineering.

Required Qualifications
- Bachelor’s degree in a related engineering discipline (mechanical, chemical, etc.) (B.E./B.Tech.) or Computer Science from a deemed/recognized (AICTE) university.
- Experience as a liaison between technical teams and business stakeholders.
- Critical thinking and practical problem solving.
- Understanding of data management, data storage, and data infrastructure.
- Demonstrable experience in SQL querying and modern data warehousing.

Preferred Qualifications
- 5+ years of experience as a Technical Product Owner or Project Manager in the data management space.
- 3+ years of experience developing data products in a cloud environment.
- Good understanding of the O&G business and its workflows.
- Azure cloud and Databricks experience.
- Outcome-focused attitude.
- High degree of technical acumen in SQL, Spark, ADF, Databricks, and Power BI.

Chevron ENGINE supports global operations, supporting business requirements across the world. Accordingly, work hours for employees are aligned to support business requirements. The standard work week is Monday to Friday, with working hours of 8:00am to 5:00pm or 1:30pm to 10:30pm. Chevron participates in E-Verify in certain locations as required by law.
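The "SQL querying and modern data warehousing" requirement above typically means comfort with aggregate and window-function queries over fact tables. A minimal, hedged sketch using Python's built-in SQLite driver (the table and column names are invented for illustration; real warehouse platforms such as Databricks SQL use the same query shapes):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# A tiny fact table standing in for a warehouse table of daily production volumes.
con.execute("CREATE TABLE production (asset TEXT, day TEXT, barrels REAL)")
con.executemany(
    "INSERT INTO production VALUES (?, ?, ?)",
    [("A", "2024-01-01", 100.0), ("A", "2024-01-02", 120.0),
     ("B", "2024-01-01", 80.0), ("B", "2024-01-02", 90.0)],
)

# A typical warehouse-style query: per-asset totals plus a window-function rank.
query = """
SELECT asset,
       SUM(barrels) AS total,
       RANK() OVER (ORDER BY SUM(barrels) DESC) AS rnk
FROM production
GROUP BY asset
ORDER BY rnk
"""
result = con.execute(query).fetchall()
print(result)  # [('A', 220.0, 1), ('B', 170.0, 2)]
```

Window functions require SQLite 3.25 or later, which ships with modern Python builds.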
Posted 1 month ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Reporting Data Engineer

Join EY as a MARS Data Engineer and be at the forefront of providing and implementing innovative data insights, data products, and data services. MARS is a data platform providing custom data insights, DaaS and DaaP for a variety of EY departments and staff. We leverage software development practices to develop intricate data insights and data products.

Your Key Responsibilities

As a member of the MARS team, you will play a critical role in our mission of providing innovative data insights and in the operations and support of the MARS data platform. This includes supporting customers, internal team members, and management. Operations and support include estimating, designing, developing and delivering data products and services. You will contribute your creative solutions and knowledge to our data platform, which ingests 2TB of mobile device data daily (300K+ devices). Our platform empowers our product managers and helps enable our teams to build a better working world.

As a reporting engineer on the MARS team, you are expected to:
- Collaborate closely with the product manager to align activities to timelines and deadlines.
- Proactively suggest new ideas and solutions, driving them to implementation with minimal guidance on technical delivery.
- Provide input to the MARS roadmap and actively participate in bringing it to life.
- Collaborate with the Intune engineering team to gain a clear understanding of the mobile device lifecycle and its relationship to Intune data and reporting.
- Serve as the last level of support for all MARS data reporting questions and issues.

Participate and contribute in the following activities:
- Customer discussions and requirement gathering sessions
- Application reports (daily, weekly, monthly, quarterly, annually)
- Custom reporting for manual reports, dashboards, exports, APIs, and semantic models
- Customer service engagements
- Daily team meetings
- Work estimates and daily status
- Data and dashboard monitoring and troubleshooting
- Automation
- Data management and classification
- Maintaining design documentation for data schemas, data models, the data catalogue, and related products/services
- Monitoring and integrating a variety of data sources
- Maintaining and developing custom data quality tools

Skills and Attributes for Success
- Analytical ability: Strong analytical skills in supporting core technologies, particularly in managing large user bases, to effectively troubleshoot and optimize data solutions.
- Communication skills: Excellent written and verbal communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical stakeholders. Proficiency in English is required; additional languages are a plus.
- Interpersonal skills: Strong interpersonal skills, sound judgment, and tact to foster collaboration with colleagues and customers across diverse cultural backgrounds.
- Creative problem-solving: Ability to conceptualize innovative solutions that add value to end users, particularly in the context of mobile applications and services.
- Self-starter mentality: A proactive and self-motivated approach to work, with the ability to take initiative and drive projects forward independently.
- Documentation skills: Clear and concise documentation skills, ensuring that all processes, solutions, and communications are well-documented for future reference.
- Organizational skills: The ability to define project plans, execute them, and manage ongoing risks and communications throughout the project lifecycle.
- Cross-cultural awareness: Awareness of and sensitivity to cross-cultural dynamics, enabling effective collaboration with global teams and clients.
- User experience focus: Passionate about improving user experience, with an understanding of how to measure, monitor, and enhance user satisfaction through feedback and analytics.

To qualify for the role, you must have at least three years of experience in the following technologies and methodologies:
- Hands-on experience with Microsoft Intune data and Mobile Device and Application Management data (MSFT APIs, Graph and IDW)
- Proven experience in mobile platform engineering or a related field
- Strong understanding of mobile technologies and security protocols, particularly within an Intune-based environment
- Experience with Microsoft Intune, including mobile device and application management
- Proficiency in supporting Modern Workplace tools and resources
- Experience with the iOS and Android operating systems
- Proficiency in PowerShell scripting for automation and management tasks
- Ability to operate proactively and independently in a fast-paced environment
- A solution-oriented mindset, with the capability to design and implement creative mobile solutions that meet EY’s requirements
- Ability to work UK working hours

Specific technology skills include the following:

Technical Skills
- Power BI: semantic models, advanced dashboards, Power BI templates
- Intune reporting and Intune data: compliance, device, policy management, metrics, monitoring
- Splunk data and reporting
- Sentinel data and reporting
- HR data and reporting
- Mobile Defender data and reporting
- AAD (Azure Active Directory)
- Data quality and data assurance
- Databricks
- Web analytics and mobile analytics
- Azure Data Factory and ADF automation
- Azure Pipelines/Synapse
- Azure SQL DB/Server
- Azure Kubernetes Service (AKS)
- Key Vault management
- Azure Monitoring
- App Proxy and Azure Front Door data exports
- API development
- Python, SQL, KQL, Power Apps
- MSFT Intune APIs (Export, App Install)
- Virtual machines
- SharePoint general operations
- Data modeling, ETL and related technologies

Ideally, you’ll also have the following:
- Strong communication skills to effectively liaise with various stakeholders
- A proactive approach to suggesting and implementing new ideas
- Familiarity with the latest trends in mobile technology
- Ability to explain very technical topics to non-technical stakeholders
- Experience in managing and supporting large mobile environments
- Testing and quality assurance: ensuring our mobile platform meets quality, performance and security standards
- Implementation of new products and/or service offerings
- Experience working in a large global environment
- XML data formats
- Agile delivery
- Object-oriented design and programming
- Software development (mobile)

What we look for: A person who demonstrates a commitment to integrity, initiative, collaboration and efficiency, with three or more years in the field of data analytics and Intune data reporting.
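The "custom data quality tools" responsibility described above usually amounts to rule-based checks over incoming device records. A minimal, hedged Python sketch of the idea (the rule names and record fields are invented; real MARS checks would run against Intune or Graph exports):

```python
from datetime import datetime

def _parses(ts):
    """True if ts looks like an ISO date, e.g. a last-checkin timestamp."""
    try:
        datetime.strptime(ts, "%Y-%m-%d")
        return True
    except (ValueError, TypeError):
        return False

# Each rule maps a name to a predicate over one device record (fields hypothetical).
RULES = {
    "has_device_id": lambda r: bool(r.get("device_id")),
    "known_os": lambda r: r.get("os") in {"iOS", "Android"},
    "checkin_parses": lambda r: _parses(r.get("last_checkin")),
}

def audit(records):
    """Return a per-rule failure count — the shape a daily quality report needs."""
    failures = {name: 0 for name in RULES}
    for rec in records:
        for name, rule in RULES.items():
            if not rule(rec):
                failures[name] += 1
    return failures

records = [
    {"device_id": "d1", "os": "iOS", "last_checkin": "2024-05-01"},
    {"device_id": "", "os": "Windows", "last_checkin": "yesterday"},
]
report = audit(records)
print(report)  # {'has_device_id': 1, 'known_os': 1, 'checkin_parses': 1}
```

Keeping rules as data (a dict of predicates) makes it cheap to add checks as new data sources are integrated.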
What We Offer

EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

- Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next.
- Success as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way.
- Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs.
- Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description

About the role: As a Senior Data Engineer, you will be responsible for building and supporting large-scale data architectures that provide information to downstream systems and business users. We are seeking an innovative and experienced individual who can aggregate and organize data from multiple sources to streamline business decision-making. You will collaborate closely with Data Engineer Leads and partners to establish and maintain data platforms that support front-end analytics. Your contributions will inform Takeda’s dashboards and reporting, providing insights to stakeholders throughout the business.

In this role, you will be part of the Digital Insights and Analytics team. This team drives business insights through IT data analytics techniques such as pattern recognition, AI/ML, and data modelling to analyse and interpret the organization’s data, with the purpose of drawing conclusions about information and trends. The role works closely with the Tech Delivery Lead and a junior Data Engineer, both located in India, aligns to the Data & Analytics chapter of the ICC, and will be part of the PDT Business Intelligence pod, reporting to the Data Engineering Lead.

How you will contribute:
- Develop and maintain scalable data pipelines, in line with ETL principles, and build out new integrations using AWS/Azure native technologies to support continuing increases in data source, volume, and complexity.
- Define data requirements, gather and mine data, and validate the efficiency of data tools in the big data environment.
- Lead the evaluation, implementation and deployment of emerging tools and processes to improve productivity.
- Implement processes and systems to provide accurate and available data to key stakeholders, downstream systems, and business processes.
- Partner with Business Analysts and Solution Architects to develop technical architectures for strategic enterprise projects and initiatives.
- Coordinate with Data Scientists to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modelling.
- Mentor and coach junior Data Engineers on data standards and practices, promoting the values of learning and growth.
- Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions.

Minimum Requirements/Qualifications:
- Bachelor's degree in Engineering, Computer Science, Data Science, or a related field.
- 5-9 years of experience in software development, data science, data engineering, ETL, and analytics reporting development.
- Experience building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines.
- Experience designing and developing ETL pipelines using tools such as IICS, DataStage, Ab Initio, or Talend.
- Proven track record of designing and implementing complex data solutions.
- Demonstrated understanding of and experience with:
  - Data engineering programming languages (i.e., Python, SQL)
  - Distributed data frameworks (e.g., Spark)
  - Cloud platform services (AWS/Azure preferred)
  - Relational databases
  - DevOps and continuous integration
  - AWS services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
  - Azure services such as ADF, ADLS, etc.
  - Data lakes and data warehouses
  - Databricks/Delta Lakehouse architecture
  - Code management platforms such as GitHub, GitLab, etc.
  - Database architecture, data modelling concepts, and administration
- Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines.
- Uses the principles of continuous integration and delivery to automate the deployment of code changes to elevated environments, fostering enhanced code quality, test coverage, and automation of resilient test cases.
- Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architecture/pipelines that fit business goals.
- Strong organizational skills, with the ability to work on multiple projects simultaneously and operate as a leading member across globally distributed teams to deliver high-quality services and solutions.
- Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
- Strong problem solving and troubleshooting skills.
- Ability to work in a fast-paced environment and adapt to changing business priorities.

Preferred requirements:
- Master's degree in Computer Science, Data Science, or a related engineering field.
- Knowledge of CDK.
- Experience with the IICS data integration tool.
- Job orchestration tools such as Tidal, Airflow, or similar.
- Knowledge of NoSQL.
- Proficiency in leveraging the Databricks Unity Catalog for effective data governance and implementing robust access control mechanisms.
- Databricks Certified Data Engineer Associate.
- AWS/Azure Certified Data Engineer.
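The Spark Structured Streaming requirement above boils down to incremental, stateful aggregation over unbounded input. Here is a dependency-free Python sketch of the micro-batch idea only — Spark's actual API (`readStream`/`writeStream`) differs, and the keys and counts below are invented for illustration:

```python
from collections import defaultdict

# Running state: per-key event counts carried across micro-batches,
# mimicking what a streaming aggregation keeps in its state store.
state = defaultdict(int)

def process_batch(batch):
    """Fold one micro-batch of (key, count) events into the running state and
    return only the rows that changed, like Structured Streaming's 'update' mode."""
    updated = set()
    for key, n in batch:
        state[key] += n
        updated.add(key)
    return {k: state[k] for k in sorted(updated)}

# Two micro-batches arriving over time from a hypothetical source.
out1 = process_batch([("site_a", 3), ("site_b", 1)])
out2 = process_batch([("site_a", 2)])
print(out1)  # {'site_a': 3, 'site_b': 1}
print(out2)  # {'site_a': 5}
```

The second batch emits only `site_a` because only its aggregate changed — the same reason Structured Streaming's update output mode keeps downstream writes small.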
Posted 1 month ago