2.0 - 5.0 years
3 - 7 Lacs
Tirodi, Mumbai
Work from Office
Job Description

Overview - At Shaadi, we always put our users first. We start by looking at things from the user's perspective and end by evaluating how the solution has impacted the user. We are looking for people who continuously adapt to new technologies and are excited to work on products that influence millions of people every day. The Shaadi.com Android and iOS mobile applications are used by millions of people around the world and are some of India's best-known and most-loved applications, and we're looking for someone to lead the engineering teams that build these apps.

Role - We are looking for a Software Engineer, iOS. This is a front-end role, but not limited to it. You will learn a lot about core iOS development along with other mobile technologies. We also believe in extreme ownership! And to be honest, everyone loves working with kind and smart people. We are building a kick-ass team of humble and empathetic talent.

What you will do in this role
- Write performant code with end-to-end tests following TDD methodology.
- Ship projects continuously and on time.
- Apply well-suited design patterns.
- Work on SwiftUI, Extensions & Widgets.
- Deliver animations & motion effects for a greater UX.
- Build in-app subscription services.
- Understand the specifications from product, design, and QA; draft a solution, followed by a team discussion on feasibility, architecture, design, etc. before implementation.

What you should have
- 2 to 5 years of development experience on consumer products, with hands-on experience in designing, developing, and testing applications.
- Experience in Swift, Auto Layout, and TDD, and a willingness to learn more.
- Well versed in Core Data, architecture & design patterns, data structures and algorithms, etc.
- Passion for finding and sharing best practices and driving discipline for superior code quality.
- Working knowledge of Xcode & code signing.

Education: BE (Comp/IT), ME (Comp/IT), MCA, M.Tech, B.Tech
Posted 1 month ago
8.0 - 15.0 years
14 - 19 Lacs
Pune
Work from Office
Responsibilities
- Ensure compliance with architectural principles and development standards.
- Ensure solution designs address performance, availability, security, and supportability challenges, as well as business functional requirements.
- Work with colleagues from partner teams globally to translate business and technical requirements into solutions.
- Ensure DevSecOps automation strategies for all solutions.
- Ensure successful delivery of solutions into the production environment.
- Provide support for live IT services.
- Carry out activities that are large in scope, cross-functional, and technically difficult.
- Take an active role in the mentoring and development of more junior resources.
- Drive engineering excellence through non-functional aspects.
- Develop data integration interfaces, APIs, and microservices.
- Develop test automation suites for API testing.
- Design detailed solutions based on the tech stack detailed below.
- Work with tech stakeholders across multiple systems and regions.

Requirements
- 8 to 15 years of strong experience with the technical stack: Java, microservices, APIs, and advanced SQL for data analytics.
- Experience with GCP BigQuery, Pub/Sub, and monitoring tools; experience with Airflow implementation is good to have.
- Proven experience in IT service delivery using Agile methodology along with an automation testing framework.
- Experience in the finance domain, preferably in Custody or Asset Servicing.
- Strong interpersonal capabilities and a team player.
- Excellent written and verbal communication in English, conflict management, and problem-solving skills.
- Experience of the architecture, change, and operational aspects of technology.
- Proven ability to work across regions whilst maintaining a global perspective.
- Strong understanding of technology and IT applications; up to date with the latest technology trends and ideas in the wider market.
- Exposure to Java / data is an advantage.
- Able to understand, build, and present business cases and technical solutions/designs to senior stakeholders, business sponsors, or clients.

Sector functional requirements: Custody and Asset Servicing
- Significant experience in the Custody and/or Asset Servicing domain, preferably on BaNCS or similar vendor products.
- Preferably experience of Securities Services, Custody, or Securities Operations Technology in a bank.
- Proven implementation of client journeys and design thinking; used to translating stakeholder aspirations into technical design.
- Familiarity with financial markets and related asset classes will be an advantage.
- Understanding and awareness of appropriate corporate and regulatory policies.
- Understanding and awareness of cutting-edge technologies, including AI/ML.

Sector nice to have
- Experience with large-scale data architecture across multiple or hybrid cloud platforms and 3rd-party applications.
- Hands-on experience with Kubernetes (ConfigMaps / Secrets / HashiCorp Vault / Helm charts).
- Experience with Kafka (including Kafka Avro and the concept of partitioning).
- Experience with databases: Oracle, NoSQL (MongoDB, Postgres document model).
- Commercial acumen and good risk management and mitigation skills.
- Positive, proactive, can-do attitude.
- Experience managing change and/or technology in a global investment banking environment will be an advantage.
- Experience integrating vendor platforms on complex business lines/functions will be an advantage.
Posted 1 month ago
12.0 - 16.0 years
39 - 45 Lacs
Bengaluru
Work from Office
Seeking a Solution Architect - Data Engineering to lead the design & implementation of scalable, robust & secure data solutions. You will play a pivotal role in defining architecture standards, designing modern data platforms, and executing data strategies.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
The primary responsibility of this role is to evolve the WCS IT landscape using a data-driven approach. You will be tasked with defining, designing, and driving the data architecture roadmap for WCS IT, including enterprise data architecture. Collaborating closely with Product Owners and business & IT stakeholders across the WCS IT landscape will be essential to understand requirements and provide solutions tailored to their needs. You will also be responsible for creating DFDs and data lineage diagrams to understand the current state, and for working with stakeholders to align with the future roadmap while adhering to HSBC-defined standards.

Understanding system integrations within and outside WCS IT applications, API integration, and various data formats such as JSON and XML will be a crucial part of the role. Analyzing business requirements related to data flows across applications/systems, and collaborating with Product Owners and POD teams on development aligned with the data architecture and designs, will also be key. Additionally, you will design and develop tools for automation, and scripting for data extraction, transformation, and loading as per business requirements. Collaborating with Program, Department, and Enterprise architects at HSBC to drive data architecture and deliver expected business outcomes by designing, developing, and implementing solutions is vital. You will follow the DevOps model in day-to-day delivery and foster innovation within the team by encouraging new ideas, PoCs, exploration of new technologies, participation in hackathons, and attendance at forums and sessions. Ensuring adherence to best practices, guidelines, and data security standards in the BFS industry is crucial.

The ideal candidate must have vast experience in data architecture, defining and driving enterprise data architecture. Proficiency in understanding microservices architecture and REST APIs; reading and parsing JSON, XML, and other data structures; and developing an understanding of data attributes and values between systems is required. Experience with data modeling, DFDs, data lineage tools such as Visio, and reverse-engineering tools is essential. Previous experience in the banking and financial services industry, specifically with an MNC bank of the size of HSBC, is mandatory. Moreover, the candidate must have experience analyzing large datasets, JSON, XML, and other formats, and writing scripts for data extraction, transformation, and loading using existing or self-developed automation tools. Working experience with databases such as PostgreSQL, Oracle, and MongoDB is necessary. Familiarity with Agile and DevSecOps methodologies is a must. An inclination to explore new technologies and innovate beyond project work, along with excellent written and verbal communication skills, are prerequisites for this role.

About the Company: Purview is a leading Digital Cloud & Data Engineering company headquartered in Edinburgh, United Kingdom, with a presence in 14 countries, including India, Poland, Germany, Finland, Netherlands, Ireland, USA, UAE, Oman, Singapore, Hong Kong, Malaysia, and Australia. The company has a strong presence in the UK, Europe, and APAC regions, providing services to captive clients such as HSBC, NatWest, Northern Trust, IDFC First Bank, and Nordea Bank, among others. Purview also supports various top-tier IT organizations with solution delivery and workforce/resources.

Company Info:
3rd Floor, Sonthalia Mind Space, Near Westin Hotel, Gafoor Nagar, Hitechcity, Hyderabad. Phone: +91 40 48549120 / +91 8790177967
Gyleview House, 3 Redheughs Rigg, South Gyle, Edinburgh, EH12 9DQ. Phone: +44 7590230910
Email: careers@purviewservices.com
Posted 1 month ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

As a Lead Data Engineer at EY, you will play a crucial role in leading large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers and collaborate with the visualization team on data quality and troubleshooting needs.

Your key responsibilities will include implementing data processes for the data warehouse and internal systems; leading a team of junior and senior data engineers in executing those processes; managing data architecture; designing ETL processes; and cleaning, aggregating, and organizing data from various sources and transferring it to data warehouses. You will lead the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets; create, maintain, and support the data platform and infrastructure; and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling.

To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or a related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development. You should have expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architecture and frameworks such as data mesh, data fabric, and data product design. Other essential skills include proficiency in data engineering programming languages such as Python; distributed data technologies such as PySpark; cloud platforms and tools such as Kubernetes and AWS services; relational SQL databases; DevOps; and continuous integration. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities.

Desired skills include a Master's degree in Engineering, Computer Science, Data Science, or a related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally.

Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
lucknow, uttar pradesh
On-site
About Agoda
Agoda is an online travel booking platform that offers accommodations, flights, and more to travelers worldwide. With a global network of 4.7M hotels and holiday properties, as well as flights, activities, and more, we are dedicated to connecting travelers with seamless travel experiences. As part of Booking Holdings and based in Asia, our team of 7,100+ employees from 95+ nationalities across 27 markets creates a work environment that thrives on diversity, creativity, and collaboration. At Agoda, we foster a culture of innovation through experimentation and ownership, allowing our customers to explore and enjoy the world.

Our Purpose: Bridging the World Through Travel
We believe that travel enriches lives by providing opportunities to learn, experience, and appreciate the beauty of our world. By bringing people and cultures closer together, travel promotes empathy, understanding, and happiness.

The Data Team at Agoda
The Data department at Agoda is responsible for overseeing all data-related requirements within the company. Our primary objective is to enhance the utilization of data through innovative approaches and the implementation of robust resources such as operational and analytical databases, queue systems, BI tools, and data science technology. We recruit talented individuals from diverse backgrounds globally to tackle this challenge, providing them with the knowledge and tools for personal growth and success while upholding our company's values of diversity and experimentation. The Data team at Agoda plays a crucial role in supporting business users, product managers, engineers, and others in their decision-making processes. We are committed to improving the search experience for our customers by delivering faster results and ensuring protection against fraudulent activities. The abundance of data available to us presents both a challenge and a reward, driving our passion for excellence within the Data department.
The Opportunity
As a Senior Data Pipeline Engineer at Agoda, you will work on distributed systems that span multiple data centers, thousands of servers, and hundreds of billions of messages processed daily. Ensuring data quality, integrity, and accuracy is fundamental to our operations. You will design scalable systems to handle the increasing volume of data, including auditing and monitoring functionalities. This role gives you the opportunity to lead projects with a small team, enhancing your ownership and leadership skills. You will tackle complex problems in managing and interpreting large datasets, such as schema registry, real-time data ingestion, cross-data-center replication, data enrichment, storage, and analytics.

In This Role, You'll Get to
- Build, administer, and scale data pipelines processing hundreds of billions of messages daily across multiple data centers
- Develop and enhance existing frameworks used by teams throughout Agoda to contribute messages to the data pipeline
- Manage data ingestion into various systems (Hadoop, Elasticsearch, other distributed systems)
- Create tools to monitor high data-accuracy SLAs for the data pipeline
- Explore new technologies to improve data quality, processes, and flow
- Develop high-quality software through design reviews, code reviews, and test-driven development

What You'll Need To Succeed
- Bachelor's degree in Computer Science, Information Systems, Computer Engineering, or a related field
- 8+ years of industry experience, preferably at a tech company
- Strong knowledge of data architecture principles
- Experience debugging production issues
- Proficiency in coding and building purpose-driven, scalable, well-tested, and maintainable systems
- Detail-oriented, with a focus on considering all outcomes of a decision
- Excellent communication skills in technical English, both verbal and written
- Proficiency in multiple programming languages (e.g., Golang, Java, Scala, Python, C#)
- Good understanding of Kafka and experience as a Kafka administrator
- Experience with data ingestion from Kafka into Hadoop, Elasticsearch, and other distributed systems
- Strong systems administration skills in Linux
- Previous involvement in or contribution to open-source projects

Equal Opportunity Employer
Agoda is an equal opportunity employer. We value diversity and welcome applications from individuals with a variety of backgrounds and experiences. We will retain your application for future vacancies and allow you to request the removal of your details if desired. For more information, please refer to our privacy policy.

Note: Agoda does not accept third-party resumes. Kindly refrain from sending resumes to our jobs alias, Agoda employees, or any other organizational location. Agoda will not be liable for any fees associated with unsolicited resumes.
Posted 1 month ago
15.0 - 21.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions such as data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered based on data requirements. In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations such as GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders, and evaluating data-related tools and technologies, are also integral parts of the role.

To excel in this position, you should have at least 15 years of experience in data architecture, data engineering, or database development. Strong experience architecting data solutions on major cloud platforms (AWS, Azure, or GCP) is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms/tools such as Snowflake, Databricks, and Apache Spark is required. Familiarity with programming languages such as Python, SQL, or Java, as well as real-time data processing frameworks such as Kafka, Kinesis, or Azure Event Hubs, will be beneficial. Moreover, experience implementing data governance, data cataloging, and data quality frameworks is important. Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary for this role. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications such as Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment, with competitive compensation and benefits, while contributing to experience-led transformation for our clients across various industries.
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
haryana
On-site
As the Financial Services (FSO) division of Ernst & Young, we offer you the unique opportunity to be part of a professional services organization dedicated exclusively to the financial services marketplace. Joining our multi-disciplinary teams from around the world, you will play a crucial role in delivering a global perspective. Aligned with key industry groups such as asset management, banking and capital markets, insurance, and private equity, we offer integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning opportunities, and individually tailored coaching, you will undergo continuous professional development. Our focus is on developing exceptional leaders who collaborate effectively to fulfill our commitments to all stakeholders, thereby contributing significantly to building a better working world for our people, clients, and communities. Excited to be a part of this journey? This is just the beginning, as the exceptional EY experience will stay with you for a lifetime.

As a future FSO Technology Consultant at EY, you will be part of a team that helps clients navigate complex industry challenges and leverage technology to enhance business operations. Your role will involve addressing business and strategic challenges such as business and solution architecture, digital transformation, project management, and the design of digital operating models. Additionally, you will work on technical matters including data science, advanced analytics, IoT, data governance, blockchain, artificial intelligence, and robotic process automation. Joining our team means working on critical projects within the financial services landscape, with opportunities to transition between teams as both you and our dynamic business continue to grow and evolve. Your contributions will be instrumental in propelling EY to new heights.

We are currently seeking individuals for the following positions:
- Cybersecurity
- Digital
- Platform
- Data & Analytics

To qualify for a role in our team, you must have:
- A Bachelor's or Master's degree in (Business) Engineering, Computer Science, Information Systems Management, Mathematics, (Applied) Economics, or a related field, with an interest in cutting-edge technologies
- Strong analytical skills
- Knowledge of project management methodologies, including agile, traditional, and hybrid approaches
- Proficiency in English at an advanced level
- Experience in team leadership
- Exceptional oral and written communication abilities

If you believe you meet the above criteria, we encourage you to apply at your earliest convenience. The exceptional EY experience awaits you, ready for you to shape and build upon.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a Lead Software Engineer at JPMorgan Chase within Asset and Wealth Management, you play a crucial role in an agile team dedicated to enhancing, building, and delivering cutting-edge technology products in a secure, stable, and scalable manner. Your primary responsibility lies in developing innovative technology solutions across various technical domains to support the firm's business objectives effectively.

Your key responsibilities include creating, managing, and updating accurate current-state and target-state architectures, and target-state roadmaps, for your application portfolio. You are expected to leverage your expertise as a business domain expert to align technical capabilities with the business strategy, ensuring the realization of desired business outcomes. Additionally, you will collaborate with product owners and application teams to establish and maintain business process flows for the portfolio. You will also take ownership of data domains, data products, and data models in coordination with product owners, data owners, and application teams. Furthermore, you will actively participate in data & domain architecture governance bodies, evaluate new technologies, and provide valuable feedback. Your role involves devising creative data architecture solutions, conducting design and development activities, and troubleshooting technical issues with a forward-thinking mindset. You will identify opportunities for process automation to enhance the operational stability of software applications and systems. Moreover, you will lead evaluation sessions with external vendors, startups, and internal teams to assess data architectural designs and their applicability within the existing systems and information architecture. Additionally, you will spearhead data architecture communities of practice to promote the adoption of modern data architecture technologies.

To excel in this role, you must have formal training or certification in software engineering concepts along with at least 5 years of practical experience. Ideal candidates will have prior experience in Wealth Management technology, encompassing Wealth Planning & Advice, Investing, Lending, and Banking, with proficiency across various asset classes such as Fixed Income, Equities, and Alternatives. A degree in Computer Science, Engineering, or a related field is preferred. Your skillset should include a strong command of software development methodologies, architecture frameworks, design patterns, testing practices, and operational stability. Effective leadership, communication, and problem-solving capabilities are essential, as is the ability to establish robust engineering communities and guilds. Demonstrated experience influencing cross-functional teams to deliver modern architecture solutions is highly valued.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
haryana
On-site
As a Lead Platform Engineer at our organization, you will play a pivotal role in designing and constructing cloud-based distributed systems that tackle intricate business problems for some of the largest companies globally. Leveraging your deep expertise in software engineering, cloud engineering, and DevOps, you will craft technology stacks and platform components that empower cross-functional AI engineering teams to develop robust, observable, and scalable solutions. As part of a diverse, globally dispersed engineering team, you will engage in the complete engineering lifecycle: the design, development, optimization, and deployment of solutions and infrastructure at a scale that caters to the needs of the world's leading corporations.

Your core responsibilities will include:
- Crafting cloud solution and distributed systems architecture for full-stack AI software and data solutions
- Implementing, testing, and managing Infrastructure as Code (IaC) for cloud-based solutions, including CI/CD, data integrations, APIs, web and mobile apps, and AI solutions
- Defining and executing scalable, observable, manageable, and self-healing cloud-based solutions across AWS, Google Cloud, and Azure
- Collaborating with diverse teams, including product managers, data scientists, and fellow engineers, to define and implement analytics and AI features that align with business requirements and user needs
- Harnessing Kubernetes and containerization technologies to deploy, manage, and scale analytics applications in cloud environments, ensuring optimal performance and availability
- Developing and maintaining APIs and microservices that expose analytics functionality to internal and external consumers, adhering to best practices for API design and documentation
- Implementing robust security protocols to safeguard sensitive data and uphold compliance with data privacy regulations and organizational policies
- Continuously monitoring and troubleshooting application performance, identifying and resolving issues that impact system reliability, latency, and user experience
- Participating in code reviews and contributing to the establishment and enforcement of coding standards and best practices to ensure high-quality, maintainable code
- Staying abreast of emerging trends and technologies in cloud computing, data analytics, and software engineering, and proactively identifying opportunities to enhance the capabilities of the analytics platform
- Collaborating closely with, and influencing, business consulting staff and leaders as part of multi-disciplinary teams to assess opportunities and develop analytics solutions for Bain clients across various sectors
- Influencing, educating, and directly supporting the analytics application engineering capabilities of our clients

To be successful in this role, you should have:
- A Master's degree in Computer Science, Engineering, or a related technical field
- 6+ years of experience, with at least 3+ years at the Staff level or equivalent
- Proven expertise as a cloud engineer and software engineer in a product engineering or professional services organization
- Experience designing and delivering cloud-based distributed solutions; GCP, AWS, or Azure certifications are advantageous
- Deep familiarity with the nuances of the software development lifecycle
- Proficiency in one or more configuration management tools such as Ansible, Salt, Puppet, or Chef
- Proficiency in one or more monitoring and analytics platforms such as Grafana, Prometheus, Splunk, SumoLogic, New Relic, Datadog, CloudWatch, or Nagios/Icinga
- Experience with CI/CD deployment pipelines (e.g., GitHub Actions, Jenkins, Travis CI, GitLab CI, CircleCI)
- Experience building backend APIs, services, and/or integrations using Python
- Practical experience with Kubernetes through services such as GKE, EKS, or AKS (beneficial)
- Ability to collaborate effectively with internal and client teams and stakeholders
- Proficiency in Git for versioning and collaboration
- Exposure to LLMs, prompt engineering, and LangChain (a plus)
- Experience with workflow orchestration tools such as dbt, Beam, Airflow, Luigi, Metaflow, or Kubeflow
- Experience implementing large-scale structured or unstructured databases, orchestration, and container technologies such as Docker or Kubernetes
- Strong interpersonal and communication skills, enabling you to explain and discuss complex engineering technicalities with colleagues and clients from different disciplines at their level of understanding
- Curiosity, proactivity, and critical thinking
- Sound knowledge of computer science fundamentals: data structures, algorithms, automated testing, object-oriented programming, performance complexity, and the implications of computer architecture on software performance
- Strong understanding of API interface design
- Knowledge of data architecture, database schema design, and database scalability
- Familiarity with Agile development methodologies

At Bain & Company, a global consultancy dedicated to assisting the world's most ambitious change-makers in shaping the future, we operate across 65 cities in 40 countries. Collaborating closely with our clients, we work as one team with a shared objective of achieving exceptional results, outperforming competitors, and redefining industries. Our commitment to investing more than $1 billion in pro bono services over the next decade reflects our dedication to supporting organizations addressing pressing challenges in education, racial equity, social justice, economic development, and the environment. With a platinum rating from EcoVadis, a prominent platform for environmental, social, and ethical performance ratings for global supply chains, we stand in the top 1% of all companies.
Since our inception in 1973, we have gauged our success by the success of our clients and proudly maintain the highest level of client advocacy in the industry.
Posted 1 month ago
10.0 - 18.0 years
2 - 3 Lacs
Hyderabad
Work from Office
Experience needed: 12-18 years Type: Full-Time Mode: WFO Shift: General Shift IST Location: Hyderabad NP: Immediate Joiner - 30 days Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms. Key Responsibilities: Architect and design Modern Data Platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers. Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI). Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models. Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions. Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure SQL, and Azure Event Hubs. Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization. Guide implementation teams on architectural decisions and technical best practices across the data lifecycle. Develop reference architectures and reusable frameworks for accelerating data platform implementations. Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance data strategy. Assist in developing RFPs, architecture assessments, and solution proposals.
Requirements Required Skills & Qualifications: Proven 12-18 years of experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure. Deep understanding of Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration. Expertise in Azure Data Services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc. Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling. Experience in data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks. Strong leadership and communication skills to influence both technical and non-technical stakeholders. Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus. Preferred Qualifications: Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric Certification. Experience with real-time data streaming, IoT, or machine learning pipelines in Azure. Familiarity with multi-cloud data strategies or hybrid deployments is an advantage.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Principal Data Engineer at Skillsoft, you will play a crucial role in driving the advancement of Enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational usage. You will be responsible for developing a Knowledge Management strategy to support Skillsoft's analytical objectives across various business areas. Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. Furthermore, you will evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives. To qualify for this role, you should possess a degree in Data Engineering, Information Technology, CIS, CS, or related field, along with 7+ years of experience in Data Engineering/Data Management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture, data modeling, and experience with technology stacks for Metadata Management, Data Governance, and Data Quality are essential. Additionally, experience in working cross-functionally across an enterprise organization and an Agile methodology environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role. 
Experience with app and web analytics data, CRM, and ERP systems data is a plus. Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If you find this opportunity intriguing, we encourage you to apply and be a part of our team dedicated to leadership, learning, and success at Skillsoft. Thank you for considering this role.
Posted 1 month ago
7.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Lead Data Engineer, you will be responsible for leveraging your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role. Your role will require you to have experience with configuration management and version control apps like Git, along with familiarity working within a CI/CD framework. If you have experience in building frameworks, it will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial for this role. Your hands-on experience in AWS Cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position. If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team. Regards, Omkar
Posted 1 month ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
This is a full-time position with D Square Consulting Services Pvt Ltd. As a Senior Data Architect / Modeler, you will collaborate closely with various business partners, product owners, data strategy, data platform, data science, and machine learning teams to innovate data products for end users. Your role will involve shaping the overall solution architecture and defining models for data products using best-in-class engineering practices. By working with stakeholders, you will comprehend business requirements and design/build data models that support acquisition, ingestion processes, and critical reporting and insight needs. To be successful in this role, you must have a minimum of 12 years of experience, with at least 7 years in Data & Analytics initiatives. You should possess a deep understanding of how data architecture and modeling facilitate data pipelines, data management, and analytics. Additionally, you need 5+ years of experience in data architecture & modeling within Consumer/Healthcare Goods industries and hands-on experience in Cloud Architecture (Azure, GCP, AWS) and related databases like Synapse, Databricks, Snowflake, and Redshift. Proficiency in SQL and familiarity with data modeling tools like Erwin or ER Studio is crucial. Your responsibilities will include leading data architecture and modeling efforts in collaboration with engineering and platform teams to develop next-generation product capabilities that drive business growth. You will focus on delivering reliable, high-quality data products to maximize business value and work within the DevSecOps framework to enhance data & analytics capabilities. Collaborating with Business Analytics leaders, you will translate business needs into optimal architecture designs and design scalable and reusable models for various functional areas of data products while adhering to FAIR principles. 
In this role, you will also collaborate with data engineers, solution architects, and other stakeholders to maintain and optimize data models. You will establish trusted partnerships with Data Engineering, Platforms, and Data Science teams to create business-relevant data models and ensure the maintenance of Metadata Rules, Data Dictionaries, and associated lineage details. Additionally, staying updated with emerging technologies and mentoring other data modelers in the team will be a part of your responsibilities. Qualifications for this position include an undergraduate degree in Technology, Computer Science, Applied Data Sciences, or related fields, with an advanced degree being preferred. Experience in NoSQL and graph databases, as well as hands-on experience with data catalogs like Alation, Collibra, or similar tools, is beneficial. You should have a strong ability to challenge existing technologies and architecture while effectively influencing across the organization. Lastly, experience in a diverse company culture and a commitment to inclusion and equal-opportunity employment are desired traits for this role.
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Architect / Modeler, you will collaborate closely with various business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) teams to drive the development of innovative data products for end users. Your role will involve contributing to the development of overall solution architecture and defining models for data products by employing best-in-class engineering practices. By working with different stakeholders, you will gain insights into business needs and develop data models that support acquisition, ingestion processes, and critical reporting and insight requirements. With a minimum of 8 years of experience, including 5+ years of progressive experience in Data & Analytics initiatives, you are expected to possess knowledge of how data architecture and modeling support data pipelines, data management, and analytics. Additionally, you should have at least 3 years of hands-on experience in Cloud Architecture (Azure, GCP, AWS) and cloud-based databases (e.g., Synapse, Databricks, Snowflake, Redshift), along with 3+ years of data architecture & modeling experience in Consumer/Healthcare Goods companies. Your expertise should include proficiency in SQL, Erwin/ER Studio, and data modeling, as well as experience in designing and developing performance-tuned, reusable, and scalable data model standards and data dictionaries. Hands-on experience with data catalogs like Alation, Collibra, or similar tools is desirable. Furthermore, you should have 3 years of experience working with Agile methodology (Scrum/Kanban) in the DevSecOps model and possess strong interpersonal and communication skills. In this role, you will be responsible for providing guidance on data architecture and modeling to engineering and platform teams to create next-generation product capabilities that drive business growth. 
You will focus on the delivery of reliable, high-quality data products to maximize business value and collaborate with Business Analytics leaders to translate business needs into optimal architecture designs. Your responsibilities will include designing data architecture and scalable, reusable models for various functional areas of data products while adhering to "FAIR" principles. You will collaborate with data engineers, solution architects, and other stakeholders on data model maintenance and optimization and establish trusted partnerships with Data Engineering, Platforms, and Data Science teams to architect business-relevant data models. Additionally, you will create and maintain Metadata Rules, Data Dictionaries, and associated lineage details of the data models.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly. Within CDAO, The Firmwide Chief Data Office (CDO) is responsible for maximizing the value and impact of data globally, in a highly governed way. It consists of several teams focused on accelerating JPMorgan Chase's data, analytics, and AI journey, including data strategy, data impact optimization, privacy, data governance, transformation, and talent. As a Senior Associate at JPMorgan Chase within the Chief Data & Analytics team, you will be responsible for working with stakeholders to define governance and tooling requirements and building out the BCBS Data Governance framework. In addition, you will be responsible for delivering tasks in detailed project plans for the BCBS deliverables owned by the Firmwide CDO. Lastly, you will play a role in developing and syndicating the content used for the BCBS governance meetings. 
**Job Responsibilities:** - Deliver on the BCBS book of work owned by the Firmwide CDO - Support the definition, prioritization, and resolution of governance and requirements decisions needed by the BCBS program - Collect, synthesize, analyze, and present project data and findings - Conduct analyses to identify issues and formulate recommendations - Develop regular, compelling communications on project status - Research data governance requirements and potential solutions - Collaborate effectively across organizations, functions, and geographies **Required qualifications, capabilities, and skills:** - Formal training or certification on Data Governance concepts and 3+ years applied experience - Diverse problem-solving experience - Excellent communication skills (oral and written) and the ability to work effectively in cross-functional teams - Excellent project management and organizational skills, with the ability to manage multiple deliverables simultaneously - Strong interpersonal leadership and influencing skills - Proficiency in MS Excel and PowerPoint **Preferred qualifications, capabilities, and skills:** - Familiarity with data management and governance, big data platforms, or data architecture is preferred - BS/BA degree or equivalent experience in Business, Finance, Economics, or another related area
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
kolkata, west bengal
On-site
As the Solution Architect for Salesforce CPQ at RSM, you will have the opportunity to work with a dynamic team that is dedicated to delivering exceptional professional services to the middle market globally. RSM's purpose is to instill confidence in a world of change, empowering clients and individuals to reach their full potential. With a workforce of over 15,000 employees in the U.S. and Canada and a global presence in 120 countries, RSM focuses on providing audit, tax, and consulting services to drive economic growth and understanding. Your role will involve serving as a subject matter expert on Salesforce Configure, Price, Quote projects, overseeing client delivery, proposals, new business opportunities, and knowledge management. You will collaborate with key decision-makers and company owners to understand their challenges and provide well-architected solutions that leverage the Salesforce platform effectively. Key Responsibilities: - Deliver as an individual contributor and lead a team of Business Analysts, Consultants, Developers, or Solution Architects in various projects. - Collaborate with Business Analysts to capture and understand client business requirements and recommend best practices. - Translate business requirements into well-architected solutions on the Salesforce platform, taking ownership of the solution design and project delivery. - Lead technical design sessions, estimate user stories, and develop technical solution documentation aligned with business objectives. - Demonstrate Salesforce CPQ expertise and educate clients on best practices while ensuring adherence across the implementation team. - Assist in creating best practice assets and accelerators in Lead-to-Cash/Quote-to-Cash processes. - Provide Salesforce CPQ expertise during sales efforts and stay updated on new product releases and capabilities from Salesforce.
- Coach and mentor junior resources while maintaining responsibility for configuration and development on Salesforce projects. Qualifications: - 7-10 years of overall experience with at least 5 years of hands-on Salesforce CPQ experience and 2+ years leading Salesforce CPQ project implementations as a Solution Architect. - Strong communication and interpersonal skills, with the ability to manage tasks and drive issues to resolution. - Hold Salesforce CPQ Specialist and Salesforce Certified Administrator certifications; experience with Salesforce Revenue Cloud is a plus. - Proficiency in Agile methodologies and in delivering comprehensive training to end-users and CPQ admin users. - Preferred certifications include Revenue Cloud Accredited Professional, Community Cloud Consultant, Sales Cloud Consultant, and more. If you are looking for a challenging yet rewarding role where you can make a real impact, RSM offers a competitive benefits package and a supportive work environment that encourages personal and professional growth. Join us and be part of a team that values diversity, innovation, and excellence in client service. Apply now and discover the endless opportunities at RSM.
Posted 1 month ago
8.0 - 13.0 years
15 - 20 Lacs
Pune
Hybrid
EY is hiring for a leading client for the Data Governance Senior Analyst role at the Pune location. Role & responsibilities Coordinate with Data Stewards/Data Owners to enable identification of critical data elements for SAP master data (Supplier, Finance, and Bank master). Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage for the relevant SAP master data. Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer master data). Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices. Define and implement data models that align with business needs and gather requirements for master data structures. Design scalable and maintainable data models by ensuring data creation through a single source of truth. Conduct data quality assessments and implement corrective actions to address data quality issues. Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes. Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment. Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements. Collaborate with the Data Governance Manager to advance the data governance agenda. Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities.
Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG. Monitor data governance activities, measure progress, and report on key metrics to senior management. Conduct training sessions and create awareness programs to promote data governance within the organization. Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, GL Accounts, etc. Summary: SAP Master Data (Vendor, Customer, GL, Cost Center, etc.) Data Governance Implementation (Transactional & Master Data) Data Modeling & Architecture (S/4HANA, ECC) Data Cataloging, Lineage, and Quality Assessment Governance Forums & Change Advisory Boards Experience in S/4HANA Greenfield implementations Migration Experience (ECC to S/4 MDG) Preferred candidate profile 8-14 years in data governance and SAP master data Strong understanding of upstream/downstream data impacts Expert in data visualization
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Gurugram, Mumbai (All Areas)
Work from Office
About the role In this role, we are seeking an experienced, hands-on, and innovative Data Architect with expertise in the Azure Cloud platform to design, implement, and optimize scalable data solutions. The ideal candidate will have deep expertise in data architecture, cloud solutions, and experience with healthcare and benefits data systems. They will work closely with a cross-functional team of Data, DevOps, and Analytics engineers to architect a robust data platform for H&B, ensure efficient data management, and support enterprise-level decision-making processes. The Role - Partner with other architecture resources to lead the end-to-end architecture of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability. Develop broad understanding of the data lake architecture, including the impact of changes on the whole system, the onboarding of clients, and the security implications of the solution. Design new or improve existing architecture, including data ingestion, storage, transformation, and consumption layers. Define data models, schemas, and database structures optimized for H&B use cases, including claims, census, placement, broking, and finance sources. Design solutions for seamless integration of diverse health and benefits data sources. Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview. Evaluate data lake architecture to understand how technical decisions may impact business outcomes, and suggest new solutions/technologies that better align to the Health and Benefits Data strategy. Draw on internal and external practices to establish data lake architecture best practices and standards within the team and ensure that they are shared and understood. Continuously develop technical knowledge and be recognised as a key resource across the global team.
Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform is delivering to the highest possible standards and that solutions support stakeholder needs and business requirements. Initiate practices that will increase code quality, performance, and security. Develop recommendations for continuous improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the potential implications of changes. Build the team's technical expertise, capabilities, and skills through the delivery of regular feedback, knowledge sharing, and coaching. Analyze existing data design and suggest improvements that promote performance, stability, and interoperability. Work with product management and business subject matter experts to translate business requirements into good data lake design. Maintain the governance model on the data lake architecture through training, design reviews, code reviews, and progress reviews. Participate in the development of data lake architecture and roadmaps in support of business strategies.
Posted 1 month ago
6.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Experience with machine learning environments and LLMs. Certification in IBM watsonx.data or related IBM data and AI technologies. Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake). Exposure to implementing, or an understanding of, DB replication processes. Experience with NoSQL databases (e.g., MongoDB, Cassandra). Experience with data modeling tools (e.g., ER/Studio, ERwin). Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA). Additional skills: database administration; data security, backup, and disaster recovery strategies; implementing WLM on any of the databases (preferably DB2); implementing data governance frameworks, including metadata management and data cataloging tools.
Posted 1 month ago
2.0 - 7.0 years
4 - 9 Lacs
Pune, Bengaluru, Vadodara
Work from Office
About Rearc At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple: empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess, but also exceptional leadership skills. Our ideal candidates are hands-on-keyboard leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing. Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple: finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together! About the role As a Data Engineer at Rearc, you'll contribute to the technical excellence of our data engineering team. Your expertise in data architecture, ETL processes, and data modeling will help optimize data workflows for efficiency, scalability, and reliability. You'll work closely with cross-functional teams to design and implement robust data solutions that meet business objectives and adhere to best practices in data management.
Building strong partnerships with technical teams and stakeholders will be essential as you support data-driven initiatives and contribute to their successful implementation. What you'll do Collaborate with Colleagues: Work closely with colleagues to understand customers' data requirements and challenges, contributing to the development of robust data solutions tailored to client needs. Apply DataOps Principles: Embrace a DataOps mindset and utilize modern data engineering tools and frameworks, like Apache Airflow, Apache Spark, or similar, to create scalable and efficient data pipelines and architectures. Support Data Engineering Projects: Assist in managing and executing data engineering projects, providing technical support and contributing to project success. Promote Knowledge Sharing: Contribute to our knowledge base through technical blogs and articles, advocating for best practices in data engineering, and fostering a culture of continuous learning and innovation. We're looking for: 2+ years of experience in data engineering, data architecture, or related fields, bringing valuable expertise in managing and optimizing data pipelines and architectures. Solid track record of contributing to complex data engineering projects, including assisting in the design and implementation of scalable data solutions. Hands-on experience with ETL processes, data warehousing, and data modeling tools, enabling the support and delivery of efficient and robust data pipelines. Good understanding of data integration tools and best practices, facilitating seamless data flow across systems. Familiarity with cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery), ensuring effective utilization of cloud resources for data processing and analytics. Strong analytical skills to address data challenges and support data-driven decision-making. Proficiency in implementing and optimizing data pipelines using modern tools and frameworks.
Strong communication and interpersonal skills enabling effective collaboration with cross-functional teams and stakeholder engagement. Your first few weeks at Rearc will be spent in an immersive learning environment where our team will help you get up to speed. Within the first few months, you'll have the opportunity to experiment with a lot of different tools as you find your place on the team. Rearc is committed to a diverse and inclusive workplace. Rearc is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
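The Rearc posting above centers on ETL pipelines and DataOps. As a purely illustrative sketch (stdlib Python only; the function names and sample data are hypothetical, not any framework's API), the extract-transform-load pattern it refers to looks like:

```python
# Minimal, hypothetical extract -> transform -> load sketch of the pipeline
# pattern the posting describes; orchestrators like Apache Airflow schedule
# and monitor stages like these at scale.

def extract() -> list[dict]:
    # Stand-in for reading rows from a source system (API, file, database).
    return [
        {"user_id": 1, "amount": "120.50"},
        {"user_id": 2, "amount": "80.00"},
        {"user_id": 1, "amount": "19.50"},
    ]

def transform(rows: list[dict]) -> dict[int, float]:
    # Cast string amounts to floats and aggregate total spend per user.
    totals: dict[int, float] = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[int, float], warehouse: dict) -> None:
    # Stand-in for writing a result table to a warehouse (e.g., Redshift).
    warehouse["user_spend"] = totals

def run_pipeline() -> dict:
    warehouse: dict = {}
    load(transform(extract()), warehouse)
    return warehouse

if __name__ == "__main__":
    print(run_pipeline())  # {'user_spend': {1: 140.0, 2: 80.0}}
```

In a real deployment each stage would be an orchestrated task reading from and writing to external systems rather than in-memory structures, with retries and monitoring handled by the orchestrator.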
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Hubli, Mangaluru, Mysuru
Work from Office
Job Snapshot Location: Karnataka - Other, Karnataka Job ID: JN-032025-95763 Category: TEK-Technology Operations Management (TOM) Job Summary Job Opportunity with TEKsystems: Snaplogic Developer (Snaplogic, SQL, Kafka). Are you a passionate Snaplogic Developer with a knack for SQL and seamless system integration? We want to hear from you! We're looking for a talented developer to join our team and help us drive innovation through cutting-edge integrations and efficient data pipelines. Years of Experience: 7+ years Location: Bangalore (Hybrid) What You'll Do: * Design, develop, and implement Snaplogic integrations for a variety of business needs * Utilize SQL to extract, manipulate, and manage data * Build scalable, reusable, and high-performance integration solutions * Collaborate with cross-functional teams to ensure smooth data flow across systems * Troubleshoot and optimize Snaplogic pipelines for maximum performance What We're Looking For: * Proven experience with Snaplogic, building and maintaining integration pipelines * Strong proficiency in SQL for data querying, transformation, and analysis * Experience in integrating various systems and APIs via Snaplogic * Solid understanding of data architecture and ETL processes * Excellent problem-solving skills and attention to detail * Strong communication and teamwork skills.
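The Snaplogic role above emphasizes SQL for querying and transformation. A minimal, hypothetical illustration of the kind of aggregation such a pipeline might push down to a database, using stdlib sqlite3 so it runs anywhere (the table and column names are invented for the example):

```python
# Hypothetical GROUP BY aggregation of the sort an integration pipeline
# often delegates to the database; sqlite3 stands in for the real target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 100.0), ("south", 50.0), ("north", 25.0)],
)
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 125.0), ('south', 50.0)]
conn.close()
```

An integration platform would typically issue an equivalent statement against the target database through its database connectors rather than sqlite3.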
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, develop code and perform unit testing. He/she will work closely with the Technical Architect, Business Analyst, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices comply with KPMG's best-practice policies and procedures. This role requires quick ramp-up on new technologies whenever required.

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Role: Azure Data Engineer
Location: Bangalore
Experience: 4 to 6 years

* Data Management: Design, implement, and manage data solutions on the Microsoft Azure cloud platform.
* Data Integration: Develop and maintain data pipelines, ensuring efficient data extraction, transformation, and loading (ETL) processes using Azure Data Factory.
* Data Storage: Work with various Azure data storage solutions such as Azure SQL Database, Azure Data Lake Storage, and Azure Cosmos DB.
* Big Data Processing: Utilize big data technologies such as Azure Databricks and Apache Spark to handle and analyze large datasets.
* Data Architecture: Design and optimize data models and architectures to meet business requirements.
* Performance Monitoring: Monitor and optimize the performance of data systems and pipelines.
* Collaboration: Collaborate with data scientists, analysts, and other stakeholders to support data-driven decision-making.
* Security and Compliance: Ensure data solutions comply with security and regulatory requirements.
* Technical Skills: Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data tools.
* Analytical Skills: Strong analytical and problem-solving skills.
* Communication: Excellent communication and teamwork skills.
* Certifications: Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate are a plus.
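The extract, transform, load (ETL) flow that this role builds in Azure Data Factory can be sketched as three plain functions. This is a toy illustration only: the record shapes, function names, and in-memory "sink" are invented for the example, and in practice these stages would be pipeline activities running against Azure SQL Database, Data Lake Storage, or Databricks rather than Python lists.

```python
# Toy ETL pass: each function stands in for one pipeline stage.

def extract():
    # Stand-in for reading rows from a source such as Azure SQL Database.
    return [{"id": 1, "value": " 10 "}, {"id": 2, "value": "20"}]

def transform(records):
    # Clean and type-cast fields before loading (the "T" in ETL).
    return [{"id": r["id"], "value": int(r["value"].strip())} for r in records]

def load(records, sink):
    # Stand-in for writing to a sink such as Azure Data Lake Storage.
    sink.extend(records)
    return len(records)

sink = []
loaded = load(transform(extract()), sink)
print(loaded, sink)
```

The value of keeping the stages separate, as Data Factory does with its activities, is that each one can be monitored, retried, and optimized independently, which is exactly the performance-monitoring responsibility the role lists.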
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations.

Do:
* Provide adequate support in architecture planning, migration and installation for new projects in own tower (platform/database/middleware/backup)
* Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution
* Conduct technology capacity planning by reviewing current and future requirements
* Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable
* Strategize and implement disaster recovery plans; create and implement backup and recovery plans
* Manage day-to-day operations of the tower by troubleshooting issues, conducting root cause analysis (RCA) and developing fixes to avoid recurrence
* Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower
* Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges
* Develop the shift roster for the team to ensure no disruption in the tower
* Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance, etc.
* Provide weekly status reports to the client leadership team and internal stakeholders on database activities with respect to progress, updates, status, and next steps
* Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness

Team Management
Resourcing:
* Forecast talent requirements as per current and future business needs
* Hire adequate and right resources for the team
* Train direct reports to make right recruitment and selection decisions

Talent Management:
* Ensure 100% compliance with Wipro's standards of adequate onboarding and training for team members to enhance capability and effectiveness
* Build an internal talent pool of HiPos and ensure their career progression within the organization
* Promote diversity in leadership positions

Performance Management:
* Set goals for direct reports, conduct timely performance reviews and appraisals, and give constructive feedback
* Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities such programs present, for themselves and the levels below them

Employee Satisfaction and Engagement:
* Lead and drive engagement initiatives for the team
* Track team satisfaction scores and identify initiatives to build engagement within the team
* Proactively challenge the team with larger and enriching projects/initiatives for the organization or team
* Exercise employee recognition and appreciation

Mandatory Skills: Big Data Consulting.
Experience: 5-8 Years.
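A core check behind the backup and recovery plans this role owns is verifying that a restored copy actually matches the source. As a minimal sketch, assuming SQLite stands in for the tower's database layer (a real disaster recovery plan would target the production engine and its native backup tooling), Python's built-in `Connection.backup` can illustrate the backup-then-verify step:

```python
import sqlite3

# Hypothetical source database representing the tower's data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 500.0)")
src.commit()

# Take a backup into a separate connection (the recovery copy).
dst = sqlite3.connect(":memory:")
src.backup(dst)

# Verify the restored copy matches the source before signing off the plan.
restored = dst.execute("SELECT id, balance FROM accounts").fetchall()
print(restored)  # [(1, 500.0)]
```

The point of the verification step is that an untested backup is not a recovery plan; the same read-back comparison belongs in any scheduled backup job, whatever the engine.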
Posted 1 month ago
3.0 - 6.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
* Technically strong, hands-on
* Self-driven
* Good client communication skills
* Able to work independently; good team player
* Flexible to work in PST hours (overlap for some hours)
* Past development experience for the Cisco client is preferred
Posted 1 month ago