
5486 Databricks Jobs - Page 49

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

12.0 years

2 - 3 Lacs

Hyderābād

On-site

It's fun to work at a company where people truly believe in what they are doing!

Job Description

Summary
We are seeking a highly skilled Lead Software Engineer with sound knowledge of Java, SQL, and multiple cloud technologies, and a strong background in full-stack development, for our Engineering Operations. As a lead developer, you will work closely with business stakeholders, leadership, the engineering team, and the DevOps team.

Job Responsibilities:
Drive projects independently, from requirement gathering through design, development, and implementation.
Must have been involved in building an enterprise product end to end.
Excellent analytical and problem-solving skills, with the ability to understand complex business problems.
Work on cloud technologies such as building and maintaining SaaS-based applications, cloud migration, …
Identify and evaluate new technologies to improve the performance, maintainability, and reliability of existing machine learning systems.
Develop and maintain dashboards and other visualization tools to monitor key performance indicators.
Proficiency in working with frameworks like Spring (Spring Boot, Security, Data, Cloud), Quarkus, etc.
Clear understanding of DevSecOps and DevOps processes, and help implement them.
Communicate insights and recommendations to leadership and business stakeholders in a clear and concise manner.
Stay up to date with the latest advancements in Gen AI and recommend new tools and techniques to improve our operations. Be adaptive to the AI world.
Provide technical guidance, resolve challenges, and foster a collaborative environment.

Requirements / Skills:
Bachelor's or master's degree in information science or computer science required – B.Tech, MCA, MS Computers.
12+ years of experience.
Must have a strong architectural background and full-stack development experience.
Proficient in data structures and RDBMS.
High proficiency in Java programming and exposure to more than one cloud technology (Azure, AWS, Google).
Proficiency in working with frameworks like the Spring framework (Spring Boot, Security, Data, Cloud), Quarkus, etc.
Implementation experience with visualization tools and libraries.
UI development – React, Node.
Proficiency in integrating APIs.
Proficient in design patterns.
Familiarity with project management tools.
Domain experience in cyber, data security, data breach, …
Exposure to CI/CD, TDD, SAST, DAST, Kubernetes.
Strictly follow SDLC and Agile best practices.
Strong analytical and problem-solving skills.
Added advantage – Java certification, ETL, Databricks, Spark …
This role needs strong and effective communication skills to deal with clients, the business, and leadership.

Competencies
Integrity – Behaves in an honest, fair, and ethical manner; shows consistency in words and actions; does what she/he commits to doing; respects the confidentiality of information or concerns shared by others; is honest and forthright with people; carries his/her fair share of the workload; takes responsibility for own mistakes.
Client Focus – Takes action with clients, both internal and external, and sees their needs as a primary focus; builds a sustaining, collaborative and productive relationship with clients; seeks to understand client situations, issues, expectations, etc.; takes appropriate action to meet client needs and address concerns; implements or utilizes methods to monitor and evaluate client feedback.
Results-Driven – Sets stretch goals for personal and team accomplishment and works tenaciously to achieve those goals; acts with a sense of urgency; takes the initiative on actions; identifies what needs to be done and takes action before being asked; does more than what is normally required in a situation; establishes metrics to monitor progress and measure success; maintains focus by avoiding or overcoming roadblocks.
Entrepreneurial Orientation – Proposes innovative business opportunities/ideas to customers and business partners; encourages and supports entrepreneurial behavior in others; demonstrates willingness to take calculated risks to achieve business goals.
Decisiveness – Makes well-informed, effective, and timely decisions even when data is limited or solutions produce unpleasant consequences; perceives the impact and implications of decisions; can make tough decisions.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

It is Epiq's policy to comply with all applicable equal employment opportunity laws by making all employment decisions without unlawful regard or consideration of any individual's race, religion, ethnicity, color, sex, sexual orientation, gender identity or expressions, transgender status, sexual and other reproductive health decisions, marital status, age, national origin, genetic information, ancestry, citizenship, physical or mental disability, veteran or family status or any other basis protected by applicable national, federal, state, provincial or local law. Epiq's policy prohibits unlawful discrimination based on any of these impermissible bases, as well as any bases or grounds protected by applicable law in each jurisdiction. In addition, Epiq will take affirmative action for minorities, women, covered veterans and individuals with disabilities. If you need assistance or an accommodation during the application process because of a disability, it is available upon request. Epiq is pleased to provide such assistance, and no applicant will be penalized as a result of such a request. Pursuant to relevant law, where applicable, Epiq will consider for employment qualified applicants with arrest and conviction records.

Posted 2 weeks ago

Apply

12.0 years

1 - 3 Lacs

Hyderābād

On-site

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers to levels they cannot achieve anywhere else. This is a world of more possibilities, more innovation, more openness in a cloud-enabled world. The Business & Industry Copilots group is a rapidly growing organization that is responsible for the Microsoft Dynamics 365 suite of products, Power Apps, Power Automate, Dataverse, AI Builder, Microsoft Industry Solution and more. Microsoft is considered one of the leaders in Software as a Service in the world of business applications, and this organization is at the heart of how business applications are designed and delivered.

This is an exciting time to join our Customer Experience (CXP) group and work on something highly strategic to Microsoft. The goal of CXP Engineering is to build the next generation of our applications running on Dynamics 365, AI, Copilot, and several other Microsoft cloud services to drive AI transformation across the Marketing, Sales, Services and Support organizations within Microsoft. We innovate quickly and collaborate closely with our partners and customers in an agile, high-energy environment. Leveraging the scalability and value of Azure & Power Platform, we ensure our solutions are robust and efficient. Our organization's implementation acts as a reference architecture for large companies and helps drive product capabilities. If the opportunity to collaborate with a diverse engineering team on enabling end-to-end business scenarios using cutting-edge technologies, and to solve challenging problems for large-scale 24x7 business SaaS applications, excites you, please come and talk to us!

We are hiring a passionate Principal SW Engineering Manager to lead a team of highly motivated and talented software developers building highly scalable data platforms and delivering services and experiences that empower Microsoft's customer, seller and partner ecosystem to be successful.
This is a unique opportunity to use your leadership skills and experience in building core technologies that will directly affect the future of Microsoft on the cloud. In this position, you will be part of a fun-loving, diverse team that seeks challenges, loves learning and values teamwork. You will collaborate with team members and partners to build high-quality, innovative data platforms with full-stack data solutions using the latest technologies in a dynamic and agile environment, and you will have opportunities to anticipate the future technical needs of the team and provide technical leadership that keeps raising the bar over our competition. We use industry-standard technology: C#, JavaScript/TypeScript, HTML5, ETL/ELT, data warehousing, and/or Business Intelligence development.

Responsibilities
As a leader of the engineering team, you will be responsible for the following:
Build and lead a world-class data engineering team.
Be passionate about technology and obsessed with customer needs.
Champion data-driven decisions for feature identification, prioritization and delivery.
Manage multiple projects, including timelines, customer interaction, feature tradeoffs, etc.
Deliver on an ambitious product and services roadmap, including building new services on top of the vast amounts of data collected by our batch and near-real-time data engines.
Design and architect internet-scale, reliable services.
Leverage knowledge of machine learning (ML) models to select appropriate solutions for business objectives.
Communicate effectively and build relationships with our partner teams and stakeholders.
Help shape our long-term architecture and technology choices across the full client and services stack.
Understand the talent needs of the team and help recruit new talent.
Mentor and grow other engineers to bring in efficiency and better productivity.
Experiment with and recommend new technologies that simplify or improve the tech stack.
Work to help build an inclusive working environment.
Qualifications
Basic Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
12+ years of experience building high-scale enterprise Business Intelligence and data engineering solutions.
3+ years of management experience leading a high-performance engineering team.
Proficient in designing and developing distributed systems on cloud platforms.
Must be able to plan work, and work to a plan, adapting as necessary in a rapidly evolving environment.
Experience using a variety of data stores, including ETL/ELT, warehouses, RDBMS, in-memory caches, and document databases.
Experience using ML, anomaly detection, predictive analysis, and exploratory data analysis.
A strong understanding of the value of data, data exploration, and the benefits of a data-driven organizational culture.
Strong communication skills and proficiency with executive communications.
Demonstrated ability to effectively lead and operate in a cross-functional global organization.

Preferred Qualifications:
Prior experience as an engineering site leader is a strong plus.
Proven success in recruiting and scaling engineering organizations effectively.
Demonstrated ability to provide technical leadership to teams, with experience managing large-scale data engineering projects.
Hands-on experience working with large data sets using tools such as SQL, Databricks, PySparkSQL, Synapse, Azure Data Factory, or similar technologies.
Expertise in one or more of the following areas: AI and Machine Learning.
Experience with Business Intelligence or data visualization tools, particularly Power BI, is highly beneficial.

#BICJobs

Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

(ReactJS UI, Postgres data modeling, Spark analytics work.)
Team lead with experience leading agile dev teams, guiding them and being a role model to the developers on the team (lead by example).
Must be a highly hands-on lead: writing code, leading the team, reviewing other developers' code, and working closely with the architect to implement the proposed design.
5+ years of experience writing enterprise-grade Java or Python code (should be highly proficient).
Solid understanding of data structures and fundamental algorithms (sort, select, search, queue).
Solid understanding of distributed computing and/or massively parallel processing concepts and frameworks (at least one): Spark, Kafka, MapReduce, Impala.
2+ years of experience writing Spark and Spark SQL routines to process large volumes of data.
2+ years of experience building enterprise data platforms: data ingestion, data lake, ETL, data warehouse, data access patterns/APIs, reporting.
2+ years of experience building ETL or ELT routines in one or more of the technologies: Spark, Kafka.
Decent data warehousing and data modeling skills.
Experience working in Linux.
Spring Boot/API implementation experience is a nice-to-have.
Azure experience is a nice-to-have.
Databricks experience is a nice-to-have.
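The "fundamental algorithms (sort, select, search, queue)" line above refers to textbook material candidates are expected to know. As an illustrative sketch only (not part of the listing), "select" usually means a selection algorithm such as quickselect, which finds the k-th smallest element in expected linear time:

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed) of items in expected O(n) time."""
    pivot = random.choice(items)
    lows = [x for x in items if x < pivot]      # strictly smaller than pivot
    pivots = [x for x in items if x == pivot]   # equal to pivot
    highs = [x for x in items if x > pivot]     # strictly larger than pivot
    if k < len(lows):
        return quickselect(lows, k)
    if k < len(lows) + len(pivots):
        return pivot
    return quickselect(highs, k - len(lows) - len(pivots))
```

Unlike a full sort (O(n log n)), quickselect partitions around a random pivot and recurses into only one side, which is why it is a standard interview topic alongside sorting and searching.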

Posted 2 weeks ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description: The AI/ML engineer role requires a blend of expertise in machine learning operations (MLOps), ML engineering, data science, large language models (LLMs), and software engineering principles.

Skills you'll need to bring:
Experience building production-quality ML and AI systems.
Experience in MLOps and real-time ML and LLM model deployment and evaluation.
Experience with RAG frameworks and agentic workflows is valuable.
Proven experience deploying and monitoring large language models (e.g., Llama, Mistral, etc.).
Improve evaluation accuracy and relevancy using creative, cutting-edge techniques from both industry and new research.
Solid understanding of real-time data processing and monitoring tools for model drift and data validation.
Knowledge of observability best practices specific to LLM outputs, including semantic similarity, compliance, and output quality.
Strong programming skills in Python and familiarity with API-based model serving.
Experience with LLM management and optimization platforms (e.g., LangChain, Hugging Face).
Familiarity with data engineering pipelines for real-time input-output logging and analysis.

Qualifications:
Experience working with common AI-related models, frameworks and toolsets like LLMs, vector databases, NLP, prompt engineering and agent architectures.
Experience in building AI and ML solutions.
Strong software engineering skills for the rapid and accurate development of AI models and systems.
Proficient in programming languages like Python.
Hands-on experience with technologies like Databricks and Delta Tables.
Broad understanding of data engineering (SQL, NoSQL, Big Data), Agile, UX, Cloud, software architecture, and ModelOps/MLOps.
Experience in CI/CD and testing, with experience building container-based stand-alone applications using tools like GitHub, Jenkins, Docker and Kubernetes Responsibilities: Participate in research and innovation of data science projects that have impact to our products and customers globally. Apply ML expertise to train models, validates the accuracy of the models, and deploys the models at scale to production. Apply best practices in MLOps, LLMOps, Data Science, and software engineering to ensure the delivery of clean, efficient, and reliable code. Aggregate huge amounts of data from disparate sources to discover patterns and features necessary to automate the analytical models. About Company At Improva , we leverage the power of human creativity and artificial intelligence to create innovative software solutions that drive business growth and digital transformation. Our approach combines AI-driven automation with human insight to solve complex challenges, optimize operations, and unlock new opportunities for our clients. We help businesses stay ahead in an ever-changing digital landscape, ensuring they achieve long-term success. If you're looking to be part of a dynamic team that’s passionate about innovation and making a real impact, Improva is the place for you.
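The posting mentions semantic similarity as an observability practice for LLM outputs. A minimal sketch of the idea, using bag-of-words cosine similarity (a stand-in for the embedding vectors a real evaluation pipeline would use; this is illustrative only, not part of the listing):

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words vectors of two texts.
    Real LLM-output monitoring would compare embedding vectors from a
    model instead of raw word counts, but the scoring math is the same."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0
```

In an evaluation harness, a score near 1.0 between a model answer and a reference answer suggests semantic agreement, while a score near 0.0 flags a drifted or off-topic response for review.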

Posted 2 weeks ago

Apply

6.0 years

7 - 8 Lacs

Hyderābād

On-site

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
The Azure Big Data Engineer is an important role in which you are responsible for designing, implementing, and managing comprehensive big data solutions on the Azure platform. You will report to the Senior Manager; this is a hybrid role requiring two days per week in our Hyderabad office.

Responsibilities:
Design, implement, and maintain scalable and reliable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and Microsoft Fabric.
Develop, configure, and optimize data lakes and warehouses on Azure using services like Azure Data Lake Storage (ADLS) and Azure Lakehouse/Warehouse, and monitor data pipelines for performance, scalability, and reliability.
Collaborate with data scientists, architects, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Ensure that data is secure and meets all regulatory compliance standards, including role-based access control (RBAC) and data encryption in Azure environments.
Develop and configure monitoring and alerting mechanisms to proactively enhance performance and optimize data systems.
Troubleshoot and resolve data-related issues in a timely manner.
Produce clear, concise technical documentation for all developed solutions.

Requirements:
Experience with SSIS, SQL Jobs, BIDS & ADF.
Experience with Azure services (Microsoft Fabric, Azure Synapse, Azure SQL Database, Azure Key Vault, etc.).
Proficiency in Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake Storage.
Experience in data modeling, data architecture, and implementing ETL/ELT solutions.
Proficiency in SQL and familiarity with other programming languages such as Python or Scala.
Knowledge of data modeling, data warehousing, and big data technologies.
Experience with data governance and security best practices.

Qualifications
Bachelor's in computer science or a related field; Master's preferred.
6+ years of professional data ingestion experience with ETL/ELT tools like SSIS, ADF, Synapse.
2+ years of Azure cloud experience.
Preferred: Experience with Microsoft Fabric and Azure Synapse; understanding of ML/AI concepts; Azure certifications.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer.
Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Experian Careers - Creating a better tomorrow together

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

#LI-Hybrid

Find out what it's like to work for Experian by clicking here

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Hyderābād

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure.
Design and develop Azure Databricks processes using PySpark/Spark-SQL.
Design and develop orchestration jobs using ADF and Databricks Workflows.
Analyze data engineering processes under development, and act as an SME to troubleshoot performance issues and suggest solutions for improvement.
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build test frameworks for Databricks notebook jobs so code is tested automatically before deployment.
Design and build POCs to validate new ideas, tools, and architectures in Azure.
Continuously explore new Azure services and capabilities; assess their applicability to business needs.
Create detailed documentation for cloud processes, architecture, and implementation patterns.
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on Azure cloud.
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned.
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture.
Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring.
Ensure solutions adhere to security, compliance, and governance standards.
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency.
Identify solutions to non-standard requests and problems.
Support and maintain the self-service BI warehouse.
Mentor and support existing on-prem developers in the cloud environment.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience.
4+ years of overall experience in Data & Analytics engineering.
4+ years of experience working with Azure, Databricks, ADF, and Data Lake.
Solid experience working with data platforms and products using PySpark and Spark-SQL.
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions.
Highly proficient in Python and SQL.
Proven excellent communication skills.

Preferred Qualifications:
Snowflake, Airflow experience.
Power BI development experience.
Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission. #NIC
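The PySpark/Spark-SQL pipelines this posting describes follow the classic extract-transform-load pattern: stage raw rows, filter and aggregate them, and load a summary table. A toy sketch of that pattern, using the standard library's sqlite3 in place of the Databricks/ADF stack (all table and column names here are invented for illustration, not from the posting):

```python
import sqlite3

# Toy ETL: stage raw claim rows, transform (filter + aggregate), load a summary.
# sqlite3 stands in for the Azure Data Lake / Databricks stack named in the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_claims (member_id TEXT, amount REAL, status TEXT);
    INSERT INTO raw_claims VALUES
        ('m1', 120.0, 'approved'),
        ('m1',  80.0, 'denied'),
        ('m2', 200.0, 'approved');
""")
# Transform + load: per-member total of approved claims only.
conn.execute("""
    CREATE TABLE member_totals AS
    SELECT member_id, SUM(amount) AS total
    FROM raw_claims
    WHERE status = 'approved'
    GROUP BY member_id
""")
totals = dict(conn.execute("SELECT member_id, total FROM member_totals"))
```

In Databricks the same transform would typically be a `spark.sql` statement or a PySpark DataFrame chain over Delta tables, with ADF handling orchestration; the filter-aggregate-write shape is identical.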

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderābād

On-site

We are seeking a Data Science & Optimization Engineer to develop and implement advanced predictive models and optimization solutions. The ideal candidate will have expertise in predictive modeling, integer programming, Python development, and cloud-based data processing. This role will involve working with large datasets, solving complex optimization problems (e.g., bin packing, TSP), and managing cloud infrastructure for scalable solutions.

Key Responsibilities:
Develop and implement predictive models using statistical methods (e.g., Bayesian models).
Solve optimization problems such as bin packing, TSP, and clustering using integer programming.
Utilize Gurobi (nice-to-have) or similar solvers for advanced optimization techniques.
Develop Python-based solutions using Git/Poetry for code and library management.
Work with data processing libraries such as Pandas, Polars, and others.
Deploy and manage data pipelines on Databricks and Azure Blob Storage.
Monitor and troubleshoot pipelines, logs, and cloud resources.
Implement DevOps best practices (nice-to-have) for automation and CI/CD workflows.
Utilize Power Apps/Power Automate for workflow automation and business process improvement.
Ensure cloud cost optimization and performance tuning for scalable architectures.

Required Skills & Qualifications:
Strong experience in predictive modeling and statistical techniques (Bayesian modeling preferred).
Hands-on experience with integer programming and clustering methods.
Proficiency in Python, including experience with Git/Poetry for code and dependency management.
Expertise in data processing libraries such as Pandas, Polars, or equivalent.
Familiarity with Azure cloud services, Databricks, and Azure Blob Storage.
Ability to read and analyze logs for debugging and performance monitoring.
Experience with cloud management and optimizing resources.
Knowledge of monitoring pipelines and troubleshooting issues.
Strong problem-solving skills and ability to work with large-scale datasets.

Preferred Qualifications:
Experience with Gurobi or other mathematical optimization solvers.
Exposure to DevOps practices, including CI/CD pipelines and automation.
Familiarity with Power Apps/Power Automate for process automation.
Strong background in cloud cost management and performance tuning.
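The bin-packing problems mentioned above would, per the posting, be modeled as integer programs in a solver such as Gurobi. Since that requires a commercial solver, here is a self-contained sketch of the classic first-fit-decreasing heuristic for the same problem instead (illustrative only, and explicitly not the ILP approach the role calls for):

```python
def first_fit_decreasing(item_sizes, bin_capacity):
    """Greedy bin-packing heuristic: place each item (largest first)
    into the first bin with enough remaining capacity, opening a new
    bin only when no existing bin fits. Returns the list of bins."""
    bins = []  # each bin is a list of item sizes
    for size in sorted(item_sizes, reverse=True):
        for b in bins:
            if sum(b) + size <= bin_capacity:
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits: open a new one
    return bins
```

First-fit-decreasing is fast and never uses more than roughly 11/9 of the optimal number of bins, which is why it is a common baseline; an integer-programming formulation (binary assignment variables, one capacity constraint per bin, minimizing opened bins) finds the provably optimal packing at higher compute cost.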

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurgaon

On-site

Excited to grow your career? We value our talented workforce, and whenever possible we strive to help our employees grow professionally. If you think this position is right for you, we encourage you to speak to your leader and go ahead and apply! Our people make all the difference in our success.

About the role:
Permanent position that can be based in Delhi (India) or Ulaanbaatar (Mongolia).

We are looking for a Senior Analyst, People Analytics, to be responsible for the maintenance and sustainability of People Data and Insights products and data solutions that enable the People function to make well-informed, evidence-based decisions. This role is a great opportunity for an experienced individual to support the group's focus on delivering data and analysis to drive decision making, help foster stronger partnerships with internal customers, and achieve business excellence. We are looking for an enterprising person with the ability to derive actionable insights from multiple data sources, a love for data and statistics, and a willingness to learn and grow.
Reporting to the People Data Solutions Lead and working in a collaborative community within the People Data and Insights division, your responsibilities will include (but not be limited to):

As a system support team member (People Data Solutions team):
Support the maintenance and sustainability of the People Data & Insights products across the People Insights, Data Management, Data Science, Data Governance, and Surveys & Research teams.
Manage and prioritize support requests, ensuring timely resolution of incidents and requests.
Provide support to the human resources team with ongoing data and reporting activities and special projects as needed.
Act as a primary point of contact for end-users and stakeholders to provide guidance, advice, and support on People Data & Insights products in production.
Ensure all support processes and procedures are in place and followed to ensure consistent and efficient support.
Identify and implement improvements to support processes and tools to increase the efficiency and effectiveness of the People Data Solutions team.
Actively get involved in, and monitor, the transition of data solutions from development to operations.
Continuously monitor the performance and availability of operational data solutions and products, taking proactive steps to prevent incidents and resolve issues.
Get involved in maintenance of the HR Lakehouse (Databricks) and the data solutions built within it.

As a member of the People Data and Insights team:
Discover issues with data accuracy caused by system and human errors, and provide recommendations for improvement.
Identify data quality and integrity issues to discover fit-for-purpose data sets.
Ensure compliance with human resource reporting quality standards.
Maintain and implement the data governance and confidentiality framework to protect employee data.
Ensure proper source control, documented best practices, and quality assurance processes are implemented and followed to maintain resilience and process integrity.
Collaborate with internal development teams to resolve complex issues and provide feedback on application design and development.
Continuously learn and get involved in the development of the People Data and Insights team's data solutions whenever needed and possible.
Get involved in projects when and wherever necessary.

What you'll bring
A commitment to the safety of yourself and your team.
Overall, 2-4 years of experience in a global organization with a multi-cultural discipline.
Experience working in a high-performance data engineering and analytics team environment.
Knowledge (mid to expert level) of data extraction and transformation using common systems – for example Workday, SAP BW, Databricks, SQL databases, AWS, cloud services, and others.
Understanding of process documentation principles and skill in version control (e.g., GitHub).
Knowledge and experience (mid to high level) in data visualization tools such as Power BI, Excel (pivot tables, analytical functions, macros), Tableau.
Advanced working knowledge of SQL, Python, PySpark.
Communication and writing skills – the ability to tell the story behind the data.
Diligence and attention to detail.
Ability to manage and deliver routine work on strict timelines along with special business projects.

It will also be beneficial if you have:
Working experience in working with Human Resources data and data models, as well as an understanding of data security and data privacy Working experience (mid to expert level) in data extraction and transformation experience using common systems – for example Workday, SAP BW, Databricks, SQL databases, AWS, Cloud services, and others Proven working knowledge of Python, PySpark, SQL Hands-on experience with Databricks, AWS/Azure, Terraform, especially in defining and maintaining ETL pipelines and infrastructure as code Familiarity with basic/advanced machine learning algorithms and underlying statistical techniques Experience in stakeholder and customer management. Salary Band Band K Every Voice Matters At Rio Tinto, we particularly welcome and encourage applications from Indigenous Peoples, women, the LGBTQIA+ community, mature workers, people with disabilities and people from different cultural backgrounds. We are committed to an inclusive environment where people feel comfortable to be themselves. We want our people to feel that all voices are heard, all cultures respected and that a variety of perspectives are not only welcome – they are essential to our success. We treat each other fairly and with dignity regardless of race, gender, nationality, ethnic origin, religion, age, sexual orientation or anything else that makes us different.
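The data-quality responsibilities described above (spotting system- and human-error issues in HR records) can be sketched in plain Python. On the Databricks HR Lakehouse this kind of check would typically be written in PySpark; the record fields and rules below are illustrative assumptions, not the team's actual schema:

```python
# Hypothetical data-quality checks for a batch of HR records: flag
# missing employee IDs, duplicate employee IDs, and implausible hire
# dates. All field names are invented for illustration.
from datetime import date

def quality_issues(records):
    """Return a list of (row_index, issue) tuples for suspect records."""
    issues = []
    seen_ids = set()
    for i, rec in enumerate(records):
        emp_id = rec.get("employee_id")
        if emp_id is None:
            issues.append((i, "missing employee_id"))
        elif emp_id in seen_ids:
            issues.append((i, "duplicate employee_id"))
        else:
            seen_ids.add(emp_id)
        hire = rec.get("hire_date")
        # A hire date in the future is almost certainly a data-entry error.
        if hire is None or hire > date.today():
            issues.append((i, "invalid hire_date"))
    return issues

records = [
    {"employee_id": 101, "hire_date": date(2019, 5, 1)},
    {"employee_id": 101, "hire_date": date(2021, 2, 3)},   # duplicate id
    {"employee_id": None, "hire_date": date(2020, 7, 9)},  # missing id
]
print(quality_issues(records))
# [(1, 'duplicate employee_id'), (2, 'missing employee_id')]
```

A production version would emit these findings to a monitoring table rather than printing them, so the support team can track recurring sources of error.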

Posted 2 weeks ago

Apply

15.0 years

0 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Contribute to the design and architecture of platform products, taking an enterprise view to solve for enterprise needs
- Serve as the subject matter expert on cloud, the Azure platform, and related technologies such as Azure Databricks
- Serve as solution architect for broad areas in cloud and data, for new application development or the migration of existing infrastructure and applications to the cloud
- Generate architecture solutions leveraging public, private, or hybrid cloud deployment patterns, using Optum best practices
- Collaborate with IT teams and business partners to understand business requirements and translate them into cloud solutions
- Ensure the security, scalability, and reliability of cloud infrastructure
- Stay up to date with the latest cloud technologies and trends
- Develop disaster recovery and business continuity plans for cloud infrastructure
- Facilitate strategy and architecture discussions with clients and other stakeholders
- Design and prepare client-ready material and present it to clients and executives
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field
- 15+ years of experience in IT, developing and architecting software solutions
- 8+ years of experience in cloud architecture and cloud services (preferably Azure)
- 5+ years of solid application modernization and digital transformation experience
- Solid hands-on experience with Databricks and proficiency in data handling and streaming
- Experience in cloud automation, cloud monitoring, and cloud optimization
- Experience identifying and remediating security vulnerabilities in cloud infrastructure
- Deep experience in areas adjacent to cloud, such as data engineering, AI, and process improvement
- Deep experience leading complex and high-visibility projects
- Deep experience developing cloud strategy, cloud data modernization strategy, architecture, and infrastructure roadmaps for client organizations
- Demonstrated experience leading and managing large-scale, complex architectural projects from conception to implementation
- Solid knowledge of Azure cloud architecture, infrastructure, and patterns
- Knowledge of the healthcare domain or healthcare experience
- Extensive knowledge of software architecture principles, design patterns, and best practices
- Broad understanding of cloud-based security solutions and familiarity with healthcare-specific regulatory requirements such as HIPAA and the handling of PHI/PII
- Familiarity with Agile environments
- Exposure to AI/ML technologies
- Proven solid communication and collaboration skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone.
We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Delhi

On-site

Job requisition ID: 85395
Date: Jul 3, 2025
Location: Delhi
Designation: Consultant
Entity: Deloitte South Asia LLP

What impact will you make?
Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team
Deloitte’s Engineering practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Work you’ll do
Role: Databricks Data Engineering – Senior Consultant
We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python / SQL

Responsibilities
- Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
- Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
- Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop data models and schemas to support reporting and analytics needs.
- Ensure data quality, integrity, and security by implementing appropriate checks and controls.
- Monitor and optimize data processing performance, identifying and resolving bottlenecks.
- Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications
- Bachelor’s or master’s degree in any field
- 6-9 years of experience in designing, implementing, and maintaining data solutions on Databricks
- Experience with at least one of the popular cloud platforms – Azure, AWS or GCP
- Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
- Knowledge of data warehousing and data modelling concepts
- Experience with Python or SQL
- Experience with Delta Lake
- Understanding of DevOps principles and practices
- Excellent problem-solving and troubleshooting skills
- Strong communication and teamwork skills

Your role as a leader
At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and make an impact that matters. In addition to living our purpose, Senior Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you will grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips
We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.
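As a rough illustration of the ingest-transform-load responsibilities described in this posting, here is a toy pipeline in plain Python. On Databricks the same steps would normally be expressed with PySpark DataFrames and a Delta Lake write; all field names and rules below are invented for the example:

```python
# A toy extract-transform-load pass, standing in for the Databricks
# notebooks the role describes. On a real pipeline: extract ~= spark.read,
# transform ~= DataFrame operations, load ~= a Delta table write.
import csv
import io

RAW = """order_id,amount,region
1,120.50,APAC
2,,EMEA
3,75.00,APAC
"""

def extract(text):
    # Parse the raw feed into dict rows (stand-in for reading a source).
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Drop rows with a missing amount and cast columns to proper types,
    # a typical data-quality check before loading into the lake.
    out = []
    for r in rows:
        if not r["amount"]:
            continue
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "region": r["region"]})
    return out

def load(rows):
    # Stand-in for the "load" step: aggregate amount per region.
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

print(load(transform(extract(RAW))))  # {'APAC': 195.5}
```

The same separation of extract, transform, and load steps is what keeps real Databricks workflows testable and maintainable, which is the point of the coding standards the posting mentions.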

Posted 2 weeks ago

Apply

6.0 years

1 - 1 Lacs

Chhattisgarh

Remote

With a company culture rooted in collaboration, expertise and innovation, we aim to promote progress and inspire our clients, employees, investors and communities to achieve their greatest potential. Our work is the catalyst that helps others achieve their goals. In short, We Enable Possibility℠.

Responsibilities
Strategic Analytics is an implementation-focused team. You will establish strong relationships with business partners to understand the problems they are trying to solve and implement the best-suited analytics solutions.
- Build predictive models using advanced analytics techniques, including text analytics and machine learning.
- Develop powerful insights and tools using a variety of analytical techniques and technologies.
- Support the implementation of the analytics solutions that you develop.
- Establish robust monitoring processes for implemented solutions and provide regular feedback to business partners.
- Collaborate in cross-functional teams and share ideas to solve complex business problems.
- Discover, explore, and analyze internal and external datasets for the purpose of developing robust analytics solutions.
- Guide, support, mentor and develop the growing team of predictive modelers and data scientists.
- Advance the analytics team’s practices, processes, and documentation, and continue to enhance existing tools and infrastructure in partnership with the data engineering, implementation engineering, data visualization, and data science teams.

Required Skills/Experience
- 6+ years’ experience in an analytics role in P/C insurance, preferably commercial lines.
- Detailed understanding of statistical theory, predictive modeling, data mining and other advanced analytical techniques.
- Strong SQL and Python skills are a must.
- Experience with other technical tools such as R, and with visualization tools such as Power BI, is a plus.
- Proven hands-on experience developing machine learning models in a claim setting: decision trees, clustering, text mining, random forests, survival analysis, etc.
- Track record of delivering significant business value from small or non-standard data sets.
- Data analytics experience in mining, analyzing, and interpreting data through various statistical techniques.
- Evidence of analytical skills, problem solving, business acumen, and strong critical thinking.
- Excellent verbal and written communication skills; ability to convey complex concepts to people across the organization.
- Exceptional teamwork skills required to play a key role in cross-functional teams.
- Ability to collaborate and build trusting relationships with business partners.

Desired Skills/Experience
- P&C insurance claim experience, including knowledge and understanding of property and casualty products.
- Experience with claim models supporting a TPA-driven operating model.
- Experience developing models in cloud-based tools in Azure, such as Databricks, Snowflake, and GitHub.
- Experience with cloud-based model deployment via MLOps.
- Experience with prompt engineering and evaluating the results of LLMs.
- Proficiency in BI tools (e.g. Power BI).
- P/C commercial lines experience.

Education
Bachelor’s degree in data science, analytics, statistics, actuarial science, mathematics, engineering or similar quantitative fields; or significant experience in data analytics.

#LI-LH1 #LI-REMOTE

For individuals assigned or hired to work in the location(s) indicated below, the base salary range is provided. Range is as of the time of posting. Position is incentive eligible.

$135,000 - $175,000/year

Total individual compensation (base salary, short- and long-term incentives) offered will take into account a number of factors, including but not limited to geographic location, scope and responsibilities of the role, qualifications, talent availability and specialization, as well as business needs. The above pay range may be modified in the future. Click here to learn more on available benefits.

Do you like solving complex business problems, working with talented colleagues and have an innovative mindset? Arch may be a great fit for you. If this job isn’t the right fit but you’re interested in working for Arch, create a job alert! Simply create an account and opt in to receive emails when we have job openings that meet your criteria. Join our talent community to share your preferences directly with Arch’s Talent Acquisition team.

10200 Arch Capital Services LLC
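For readers unfamiliar with the modeling techniques this role lists, here is a minimal sketch of a claim-complexity classifier of the random-forest kind the posting describes. It assumes scikit-learn is available; all features, labels, and thresholds below are synthetic illustrations, not Arch data:

```python
# A toy random-forest classifier separating "routine" from "complex"
# claims. In practice the feature set would include text-mining
# features and many more signals; this is only a shape sketch.
from sklearn.ensemble import RandomForestClassifier

# Synthetic features: [claim_amount, claimant_age, days_to_report]
X = [[1000, 30, 2], [50000, 45, 40], [1200, 28, 3],
     [48000, 50, 35], [900, 33, 1], [52000, 41, 45]]
y = [0, 1, 0, 1, 0, 1]  # 0 = routine claim, 1 = complex claim

# random_state pins the bootstrap sampling so results are reproducible.
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[47000, 38, 30]]))  # prints [1] on this toy data
```

A deployed version of such a model would sit behind the monitoring processes the responsibilities mention, with its predictions tracked against claim outcomes over time.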

Posted 2 weeks ago

Apply

6.0 - 9.0 years

0 Lacs

Chennai

On-site

Job requisition ID: 83306
Date: Jul 3, 2025
Location: Chennai
Designation: Consultant
Entity: Deloitte South Asia LLP

What impact will you make?
Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team
Deloitte’s Engineering practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Work you’ll do
Role: Databricks Data Engineering – Senior Consultant
We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python / SQL

Responsibilities
- Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
- Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
- Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop data models and schemas to support reporting and analytics needs.
- Ensure data quality, integrity, and security by implementing appropriate checks and controls.
- Monitor and optimize data processing performance, identifying and resolving bottlenecks.
- Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications
- Bachelor’s or master’s degree in any field
- 6-9 years of experience in designing, implementing, and maintaining data solutions on Databricks
- Experience with at least one of the popular cloud platforms – Azure, AWS or GCP
- Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
- Knowledge of data warehousing and data modelling concepts
- Experience with Python or SQL
- Experience with Delta Lake
- Understanding of DevOps principles and practices
- Excellent problem-solving and troubleshooting skills
- Strong communication and teamwork skills

Your role as a leader
At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and make an impact that matters. In addition to living our purpose, Senior Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you will grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips
We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to. Check out recruiting tips from Deloitte professionals.

Posted 2 weeks ago

Apply

3.0 - 9.0 years

0 Lacs

Coimbatore

On-site

Job requisition ID: 83806
Date: Jul 3, 2025
Location: Coimbatore
Designation: Consultant
Entity: Deloitte South Asia LLP

Technology & Transformation EAD: Engineering – Databricks

Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Work you’ll do
We are seeking highly skilled Databricks Data Engineers to join our data modernization team. You will play a pivotal role in designing, developing, and maintaining robust data solutions on the Databricks platform. Your experience in data engineering, along with a deep understanding of Databricks, will be instrumental in building solutions that drive data-driven decision-making across a variety of customers.

Mandatory Skills: Databricks, Spark, Python / SQL

Responsibilities
- Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake.
- Build and maintain scalable and efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
- Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop data models and schemas to support reporting and analytics needs.
- Ensure data quality, integrity, and security by implementing appropriate checks and controls.
- Monitor and optimize data processing performance, identifying and resolving bottlenecks.
- Stay up to date with the latest advancements in data engineering and Databricks technologies.

Qualifications
- Bachelor’s or master’s degree in any field
- 3-9 years of experience in designing, implementing, and maintaining data solutions on Databricks
- Experience with at least one of the popular cloud platforms – Azure, AWS or GCP
- Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
- Knowledge of data warehousing and data modelling concepts
- Experience with Python or SQL
- Experience with Delta Lake
- Understanding of DevOps principles and practices
- Excellent problem-solving and troubleshooting skills
- Strong communication and teamwork skills

Your role as a leader
At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society, and make an impact that matters. In addition to living our purpose, Consultants across our organization:
- Develop high-performing people and teams through challenging and meaningful opportunities
- Deliver exceptional client service; maximize results and drive high performance from people while fostering collaboration across businesses and borders
- Influence clients, teams, and individuals positively, leading by example and establishing confident relationships with increasingly senior people
- Understand key objectives for clients and Deloitte; align people to objectives and set priorities and direction
- Act as a role model, embracing and living our purpose and values, and recognizing others for the impact they make

How you’ll grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Centre.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work—always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips
We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications that need to be deployed at the client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas throughout the software development life cycle
- Facilitate root cause analysis of system issues and problem statements
- Identify ideas to improve system performance and availability
- Analyze client requirements and convert them into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to existing systems
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders

3. Provide status reporting and maintain customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports formally, for a proper understanding of the software from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Databricks - Data Engineering
Experience: 5-8 years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago


3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together, when we combine your strengths with ours, is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Title: Data Scientist
Location: Bangalore
Reporting to: Manager - Analytics / Senior Manager - Analytics

Purpose of the role
Contributing to the Data Science efforts of AB InBev's global non-commercial analytics capability for Procurement Analytics. The candidate will be required to contribute, and may also need to guide, the DS team staffed on the area, and to assess the effort required to scale and standardize the use of Data Science across multiple ABI markets.

KEY TASKS AND ACCOUNTABILITIES
- Understand the business problem and translate it into an analytical problem; participate in the solution design process.
- Manage the full AI/ML lifecycle, including data preprocessing, feature engineering, model training, validation, deployment, and monitoring.
- Develop reusable and modular Python code adhering to OOP (Object-Oriented Programming) principles.
- Design, develop, and deploy machine learning models into production environments on Azure.
- Collaborate with data scientists, software engineers, and other stakeholders to meet business needs.
- Communicate findings clearly to both technical and business stakeholders.

3. Qualifications, Experience, Skills

Level of educational attainment required (1 or more of the following): B.Tech/BE/Masters in CS/IS/AI/ML

Previous work experience required: minimum 3 years of relevant experience

Technical Skills Required

Must Have
- Strong expertise in Python, including advanced knowledge of OOP concepts.
- Exposure to AI/ML methodologies, with previous hands-on experience in ML concepts like forecasting, clustering, regression, classification, optimization, deep learning, and NLP using Python.
- Solid understanding of GenAI concepts and experience in Prompt Engineering and RAG.
- Experience with version control tools such as Git.
- Consistent intent for problem-solving.
- Strong communication skills (vocal and written); ability to effectively communicate and present information at various levels of an organization.

Good To Have
- Preferred industry exposure in CPG and experience working in the domain of Procurement Analytics.
- Product-building experience would be a plus.
- Familiarity with the Azure tech stack, Databricks, and MLflow on any cloud platform.
- Experience with Airflow for orchestrating and automating workflows.
- Familiarity with MLOps and containerization tools like Docker would be a plus.

Other Skills Required
- Passion for solving problems using data.
- Detail-oriented, analytical, and inquisitive.
- Ability to learn on the go.
- Ability to work independently and with others.

We dream big to create a future with more cheers.
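The forecasting and regression skills this posting asks for can be illustrated with a deliberately tiny sketch: an ordinary-least-squares forecaster written as a reusable class in the OOP style the ad emphasizes. The class name and toy data are invented for the example; real work would use libraries such as scikit-learn.

```python
class LinearForecaster:
    """Minimal OLS fit of y = slope*x + intercept, showing a fit/predict structure."""

    def __init__(self):
        self.slope = 0.0
        self.intercept = 0.0

    def fit(self, xs, ys):
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        # Closed-form OLS: slope = cov(x, y) / var(x)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var = sum((x - mean_x) ** 2 for x in xs)
        self.slope = cov / var
        self.intercept = mean_y - self.slope * mean_x
        return self

    def predict(self, xs):
        return [self.slope * x + self.intercept for x in xs]


# Fit on a toy series that follows y = 2x + 1, then forecast the next two points
model = LinearForecaster().fit([1, 2, 3, 4], [3, 5, 7, 9])
forecast = model.predict([5, 6])  # [11.0, 13.0]
```

The fit/predict split mirrors the interface most ML libraries expose, which is why it is a useful shape even for toy code.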

Posted 2 weeks ago


2.0 - 3.0 years

5 - 7 Lacs

India

On-site

Experience: 2 to 3 years
Required skills: Azure Cloud Platform, Python with OOP concepts, ML algorithms (supervised/unsupervised, forecasting, etc.), Databricks including Unity Catalog, Airflow, MLOps, the ML lifecycle, CI/CD pipelines, Flask/FastAPI
Job Types: Full-time, Permanent
Pay: ₹45,000.00 - ₹60,000.00 per month
Schedule: Day shift
Work Location: In person

Posted 2 weeks ago


12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers to levels they cannot achieve anywhere else. This is a world of more possibilities, more innovation, more openness in a cloud-enabled world. The Business & Industry Copilots group is a rapidly growing organization that is responsible for the Microsoft Dynamics 365 suite of products, Power Apps, Power Automate, Dataverse, AI Builder, Microsoft Industry Solution and more. Microsoft is considered one of the leaders in Software as a Service in the world of business applications, and this organization is at the heart of how business applications are designed and delivered. This is an exciting time to join our Customer Experience (CXP) group and work on something highly strategic to Microsoft. The goal of CXP Engineering is to build the next generation of our applications running on Dynamics 365, AI, Copilot, and several other Microsoft cloud services to drive AI transformation across the Marketing, Sales, Services and Support organizations within Microsoft. We innovate quickly and collaborate closely with our partners and customers in an agile, high-energy environment. Leveraging the scalability and value of Azure & Power Platform, we ensure our solutions are robust and efficient. Our organization's implementation acts as reference architecture for large companies and helps drive product capabilities. If the opportunity to collaborate with a diverse engineering team, enabling end-to-end business scenarios using cutting-edge technologies and solving challenging problems for large-scale 24x7 business SaaS applications, excites you, please come and talk to us! We are hiring a passionate Principal SW Engineering Manager to lead a team of highly motivated and talented software developers building highly scalable data platforms and delivering services and experiences that empower Microsoft's customer, seller and partner ecosystem to be successful.
This is a unique opportunity to use your leadership skills and experience in building core technologies that will directly affect the future of Microsoft on the cloud. In this position, you will be part of a fun-loving, diverse team that seeks challenges, loves learning and values teamwork. You will collaborate with team members and partners to build high-quality and innovative data platforms with full-stack data solutions using the latest technologies in a dynamic and agile environment, and you will have opportunities to anticipate future technical needs of the team and provide technical leadership to keep raising the bar for our competition. We use industry-standard technology: C#, JavaScript/TypeScript, HTML5, ETL/ELT, data warehousing, and/or Business Intelligence development.

Responsibilities
As a leader of the engineering team, you will be responsible for the following:
- Build and lead a world-class data engineering team, passionate about technology and obsessed with customer needs.
- Champion data-driven decisions for feature identification, prioritization and delivery.
- Manage multiple projects, including timelines, customer interaction, feature tradeoffs, etc.
- Deliver on an ambitious product and services roadmap, including building new services on top of the vast amount of data collected by our batch and near-real-time data engines.
- Design and architect internet-scale, reliable services.
- Leverage machine learning (ML) model knowledge to select appropriate solutions for business objectives.
- Communicate effectively and build relationships with our partner teams and stakeholders.
- Help shape our long-term architecture and technology choices across the full client and services stack.
- Understand the talent needs of the team and help recruit new talent.
- Mentor and grow other engineers to bring in efficiency and better productivity.
- Experiment with and recommend new technologies that simplify or improve the tech stack.
- Work to help build an inclusive working environment.
Qualifications

Basic Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 12+ years of experience building high-scale enterprise Business Intelligence and data engineering solutions.
- 3+ years of management experience leading a high-performance engineering team.
- Proficient in designing and developing distributed systems on cloud platforms.
- Must be able to plan work, and work to a plan, adapting as necessary in a rapidly evolving environment.
- Experience using a variety of data stores, including ETL/ELT, data warehouses, RDBMS, in-memory caches, and document databases.
- Experience using ML, anomaly detection, predictive analysis, and exploratory data analysis.
- A strong understanding of the value of data, data exploration, and the benefits of a data-driven organizational culture.
- Strong communication skills and proficiency with executive communications.
- Demonstrated ability to effectively lead and operate in a cross-functional global organization.

Preferred Qualifications:
- Prior experience as an engineering site leader is a strong plus.
- Proven success in recruiting and scaling engineering organizations effectively.
- Demonstrated ability to provide technical leadership to teams, with experience managing large-scale data engineering projects.
- Hands-on experience working with large data sets using tools such as SQL, Databricks, PySpark SQL, Synapse, Azure Data Factory, or similar technologies.
- Expertise in one or more of the following areas: AI and Machine Learning.
- Experience with Business Intelligence or data visualization tools, particularly Power BI, is highly beneficial.

#BICJobs

Microsoft is an equal opportunity employer.
Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
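As a toy illustration of the anomaly-detection experience this posting calls for, here is a minimal z-score detector. The data, function name, and threshold are invented for the sketch; the threshold is set to 2.5 rather than the textbook 3.0 because with only ten points a lone spike's z-score is mathematically capped near 3, and production detectors are of course far more sophisticated.

```python
import statistics


def zscore_anomalies(values, threshold=2.5):
    """Return indices of points whose z-score exceeds the threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # a constant series has no outliers
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]


# A mostly flat telemetry series with one obvious spike at index 6
readings = [10, 11, 10, 12, 10, 11, 200, 10, 11, 10]
anomalies = zscore_anomalies(readings)  # [6]
```

On real telemetry this would run over rolling windows so the baseline adapts over time.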

Posted 2 weeks ago


8.0 years

0 Lacs

India

On-site

We are seeking an experienced Salesforce Architect to lead the design, development, and optimization of our Salesforce ecosystem. The ideal candidate has deep expertise in scalable architecture, strong experience integrating Salesforce with AI/ML and analytics platforms, and a passion for building robust, future-proof CRM solutions.

Key Responsibilities
- Architect and implement scalable, high-performance Salesforce solutions aligned with the client's product and go-to-market strategy.
- Lead design of Salesforce Sales Cloud, Service Cloud, and Experience Cloud components.
- Collaborate closely with SalesOps, RevOps, Customer Success, and Product teams to translate business goals into technical solutions.
- Define and maintain Salesforce architectural standards, governance, and best practices.
- Integrate Salesforce with third-party platforms, including marketing automation, AI/ML APIs, and data lakes.
- Guide the development team in implementation (Apex, Lightning Web Components, Flows, etc.).
- Evaluate and integrate AI capabilities into the Salesforce platform (e.g., predictive lead scoring, recommendation engines, workflow automation).
- Optimize the data model and workflows for analytics, reporting, and scale.
- Ensure data integrity, security, and compliance (GDPR, SOC 2, etc.).

Required Qualifications
- 8+ years of hands-on Salesforce experience, with at least 5 years in an architecture or technical lead role.
- Deep understanding of Salesforce architecture, governor limits, API usage, and the Lightning framework.
- Experience with Sales Cloud, Service Cloud, and Experience Cloud.
- Proven track record of integrating Salesforce with AI/ML or analytics tools (e.g., Snowflake, Databricks, Segment, or internal AI models).
- Proficiency in Apex, Lightning Web Components, SOQL, and Flow.
- Experience working in high-growth SaaS or enterprise tech environments.
- Strong understanding of enterprise integration patterns (MuleSoft, Kafka, Workato, etc.).
Salesforce Certified Application Architect or System Architect preferred.

Posted 2 weeks ago


3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role And Responsibilities
As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark Framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases leveraging the Spark Framework, using Python or Scala and Big Data technologies, for the various use cases built on the platform.
- Develop streaming pipelines.
- Work with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing services.

Preferred Education: Master's Degree

Required Technical And Professional Expertise
- Total 3-5+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills.
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS.
- Experience with AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.

Preferred Technical And Professional Experience
- Certification in AWS, and Databricks or Cloudera Spark certified developer credentials.

Posted 2 weeks ago


6.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary

Position Summary
AI & Data: In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

Job Title: Senior Data Scientist/Team Lead

Job Summary: We are seeking a Senior Data Scientist with hands-on experience in leveraging data, machine learning, statistics and AI technologies to generate insights and inform decision-making. You will work on large-scale data ecosystems and lead a team to implement data-driven solutions.

Key Responsibilities:
- Lead and deliver large-scale DS/ML end-to-end projects across multiple industries and domains.
- Liaise with on-site and client teams to understand the various business problem statements, use cases and project requirements.
- Lead a team of Data Engineers, ML/AI Engineers, DevOps, and other Data & AI professionals to deliver projects from inception to implementation.
- Utilize math/stats, AI, and cognitive techniques to analyze and process data, predict scenarios, and prescribe actions.
- Assist and participate in pre-sales, client pursuits and proposals.
- Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.

Qualifications:
- 6-10 years of relevant hands-on experience in Data Science, Machine Learning and Statistical Modeling.
- Bachelor's or Master's degree in a quantitative field.
- Led a 3-5 member team on multiple end-to-end DS/ML projects.
- Excellent communication and client/stakeholder management skills.
- Strong hands-on experience with programming languages like Python, PySpark and SQL, and frameworks such as NumPy, Pandas, Scikit-learn, etc.
- Expertise in classification, regression, time series, decision trees, optimization, etc.
- Hands-on knowledge of Docker containerization, Git, and Tableau or Power BI.
- Model deployment on cloud or on-prem will be an added advantage.
- Familiarity with Databricks, Snowflake, or hyperscalers (AWS/Azure/GCP/NVIDIA).
- Should follow research papers, and comprehend and innovate/present the best approaches/solutions related to DS/ML.
- AI/Cloud certification from a premier institute is preferred.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and to live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300022

Posted 2 weeks ago


7.0 - 12.0 years

1 - 2 Lacs

Hyderabad

Hybrid

Role & Responsibilities
- 5 to 8 years of experience in data engineering or a related field.
- At least 2 years of hands-on experience with Databricks, PySpark, Azure Synapse, and cloud platforms (preferably Azure).

Technical Skills:
- Strong expertise in Python/PySpark programming applied to data engineering.
- Solid experience with cloud platforms (Azure preferred; AWS/GCP) and their data services.
- Advanced SQL skills and familiarity with relational and NoSQL databases (e.g., MS SQL, MySQL, PostgreSQL).
- Strong understanding of CI/CD pipelines and automation practices in data engineering.
- Experience with large-scale data processing on cloud platforms.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication skills and the ability to work collaboratively across teams.
- High attention to detail with a proactive approach to improving systems and processes.

Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.

Key Responsibilities:

Data Infrastructure and Pipeline Development:
- Design, develop, and maintain complex ETL/ELT pipelines using Azure Synapse.
- Design, build, and maintain data pipelines and APIs in the cloud environment, with a focus on the Azure cloud platform.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Implement data governance, quality, and security best practices throughout data workflows.

Cloud Platform Management:
- Design and manage cloud-based data infrastructure on Azure and other cloud platforms.
- Utilize cloud-native tools and services to enhance data storage and processing capabilities.
- Build and manage CI/CD pipelines for data engineering projects to ensure smooth deployments and automation.

Programming and Automation:
- Write and maintain high-quality, reusable code in Azure Synapse environments for data processing and automation.
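The ETL/ELT pipelines this role centers on all follow the same extract-transform-load shape. A minimal standard-library sketch of that shape, with all names and data invented for illustration (a real pipeline would use PySpark on Databricks or Synapse rather than plain Python lists):

```python
import csv
import io

# Invented sample input; in practice this would come from a data lake or source system
RAW = """order_id,amount,currency
1,100.0,USD
2,,USD
3,250.5,EUR
"""


def extract(text):
    """Extract: parse raw CSV into dict records."""
    return list(csv.DictReader(io.StringIO(text)))


def transform(rows):
    """Transform: drop incomplete rows, cast types, derive a field."""
    out = []
    for r in rows:
        if not r["amount"]:  # basic data-quality rule: skip rows missing an amount
            continue
        amount = float(r["amount"])
        out.append({
            "order_id": int(r["order_id"]),
            "amount": amount,
            "currency": r["currency"],
            "is_large": amount > 200,
        })
    return out


def load(rows, sink):
    """Load: append records to the target sink (a list stands in for a table)."""
    sink.extend(rows)
    return len(rows)


sink = []
loaded = load(transform(extract(RAW)), sink)  # loads 2 of the 3 raw rows
```

Keeping the three stages as separate functions is what makes a pipeline testable and composable, which carries over directly to Spark DataFrames.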

Posted 2 weeks ago


7.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Role Overview
We are looking for a confident Security Engineer/Researcher with experience in IT security for our Core Research labs in India. McAfee believes that no one person, product, or organization can fight cybercrime alone. It's why we rebuilt McAfee around the idea of working together. Life at McAfee is full of possibility. You'll have the freedom to explore challenges, take smart risks, and reach your potential in one of the fastest-growing industries in the world. You'll be part of a team that supports and inspires you. This is a hybrid position based in Bangalore. You must be within a commutable distance from the location. You will be required to be onsite on an as-needed basis; when not working onsite, you will work remotely from your home location.

About The Role
- Understand threat telemetry trends and identify patterns to reduce time to detect.
- Develop automation to harvest malware threat intelligence from various sources such as product telemetry, OSINT, Dark Web monitoring, spam monitoring, etc.
- Develop early identification and alert systems for threats based on various online platforms and product telemetry.
- Utilize various data mining tools that analyze data inline based on intelligence inputs.
- Analyze malware communication and techniques to find Indicators of Compromise (IOCs) or Indicators of Attack (IOAs).
- Author descriptions for malware via the McAfee Virus Information Library, Threat Advisories, whitepapers, or blogs.

About You
- You should have 7+ years of experience as a security/threat/malware analyst.
- Programming skills: knowledge of programming languages like Python and its packages like NumPy, Matplotlib, and Seaborn is desirable.
- Experience with data-access tooling like Spark and SQL is desirable.
- Machine learning knowledge is an added advantage.
- Familiarity with UI & dashboard tools like Jupyter and Databricks is an added advantage.
- Excellent communication skills: it is incredibly important to describe findings to both technical and non-technical audiences.

Company Overview
McAfee is a leader in personal security for consumers. Focused on protecting people, not just devices, McAfee consumer solutions adapt to users' needs in an always-online world, empowering them to live securely through integrated, intuitive solutions that protect their families and communities with the right security at the right moment.

Company Benefits And Perks
We work hard to embrace diversity and inclusion and encourage everyone at McAfee to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
- Bonus Program
- Pension and Retirement Plans
- Medical, Dental and Vision Coverage
- Paid Time Off
- Paid Parental Leave
- Support for Community Involvement

We're serious about our commitment to diversity, which is why McAfee prohibits discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.
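As a small illustration of the IOC-hunting work described in this role, here is a regex-based extractor for candidate network indicators. The patterns, function name, and sample report are invented for the example; real extraction also handles defanged indicators (hxxp, [.]) and validates candidates against allowlists.

```python
import re

# Illustrative-only patterns: a loose IPv4 match and a loose dotted-domain match
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
DOMAIN_RE = re.compile(r"\b[a-z0-9-]+(?:\.[a-z0-9-]+)+\b", re.I)


def extract_iocs(text):
    """Pull candidate IP and domain indicators from free-form report text."""
    ips = set(IP_RE.findall(text))
    # The domain pattern also matches dotted IPs, so subtract those out
    domains = set(DOMAIN_RE.findall(text)) - ips
    return {"ips": sorted(ips), "domains": sorted(domains)}


report = "Beacon to evil-c2.example.com over 203.0.113.7, fallback 198.51.100.9"
iocs = extract_iocs(report)
```

The extracted sets would then be deduplicated against known-good infrastructure before being published as detections.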

Posted 2 weeks ago


2.0 - 3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: AI/GenAI Engineer
Job ID: POS-13731
Primary Skills: Databricks, ADF
Secondary Skills: Python, LLMs, LangChain, vectors, and AWS
Location: Hyderabad
Mode of Work: Work from Office
Experience: 2-3 years

About The Job
We are seeking a highly motivated and innovative Generative AI Engineer to join our team and drive the exploration of cutting-edge AI capabilities. You will be at the forefront of developing solutions using Generative AI technologies, primarily focusing on Large Language Models (LLMs) and foundation models, deployed on either the AWS or Azure cloud platform. This role involves rapid prototyping, experimentation, and collaboration with various stakeholders to assess the feasibility and potential impact of GenAI solutions on our business challenges. If you are passionate about the potential of GenAI and enjoy hands-on building in a fast-paced environment, this is the role for you.

Know Your Team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities
- Develop GenAI Solutions: Develop, and rapidly iterate on, GenAI solutions leveraging LLMs and other foundation models available on the AWS and/or Azure platforms.
- Cloud Platform Implementation: Utilize relevant cloud services (e.g., AWS SageMaker, Bedrock, Lambda, Step Functions; Azure Machine Learning, Azure OpenAI Service, Azure Functions) for model access, deployment, and data processing.
- Explore GenAI Techniques: Experiment with and implement techniques like Retrieval-Augmented Generation (RAG), evaluating the feasibility of model fine-tuning or other adaptation methods for specific PoC requirements.
- API Integration: Integrate GenAI models (via APIs from cloud providers, OpenAI, Hugging Face, etc.) into prototype applications and workflows.
- Data Handling for AI: Prepare, manage, and process data required for GenAI tasks, such as data for RAG indexes, datasets for evaluating fine-tuning feasibility, or example data for few-shot prompting.
- Documentation & Presentation: Clearly document PoC architectures, implementation details, findings, limitations, and results for both technical and non-technical audiences.

Requirements
- Overall, 2-3 years of experience.
- Expert in Python, with advanced programming concepts.
- Solid understanding of Generative AI concepts, including LLMs, foundation models, prompt engineering, embeddings, and common architectures (e.g., RAG).
- Demonstrable experience working with at least one major cloud platform (AWS or Azure).
- Hands-on experience using cloud-based AI/ML services relevant to GenAI (e.g., AWS SageMaker, Bedrock; Azure Machine Learning, Azure OpenAI Service).
- Experience interacting with APIs, particularly AI/ML model APIs.
- Bachelor's degree in Computer Science, AI, Data Science or equivalent practical experience.

About The Company
Headquartered in New Jersey, US, ValueMomentum is the largest standalone provider of IT Services and Solutions to Insurers. Our industry focus, expertise in technology backed by R&D, and our customer-first approach uniquely position us to deliver the value we promise and drive momentum to our customers' initiatives.
ValueMomentum is amongst the top 10 insurance-focused IT services firms in North America by number of customers. Leading insurance firms trust ValueMomentum with their Digital, Data, Core, and IT Transformation initiatives.

Benefits
We at ValueMomentum offer you a congenial environment to work and grow in the company of experienced professionals. Some of the benefits available to you are:
- Competitive compensation package.
- Career advancement: individual career development, coaching and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback and year-end appraisal.
- Reward & recognition for extraordinary performers.
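The Retrieval-Augmented Generation (RAG) technique this posting references boils down to retrieving relevant context and splicing it into the model prompt. Below is a toy, standard-library-only sketch of that flow, using word overlap as a stand-in for embedding similarity; all documents and names are invented, and production systems use vector indexes and real LLM APIs instead.

```python
def score(query, doc):
    """Toy relevance score: shared-word count (a stand-in for embedding similarity)."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d)


def retrieve(query, docs, k=2):
    """Retrieve: return the top-k documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]


def build_prompt(query, docs, k=2):
    """Augment: splice retrieved context into the prompt sent to the LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


# Invented policy snippets standing in for an indexed document store
corpus = [
    "Claims are settled within 30 days of filing",
    "Premium payments are due on the first of each month",
    "Flood damage requires a separate rider",
]
prompt = build_prompt("When are premium payments due", corpus)
```

The "generate" step, omitted here, would pass `prompt` to a hosted model such as those behind Bedrock or Azure OpenAI Service.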

Posted 2 weeks ago


8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer
Location: Pune, India (Hybrid)
Type: Contract (6 Months)
Experience: 5–8 Years
Domain: Financial Services
Work Timing: Regular Day Shift
Background Check: Mandatory before onboarding

Job Description: Seeking experienced Data Engineers with strong hands-on skills in SQL, Python, Azure Databricks, ADF, and PySpark. Candidates should have experience in data modeling, ETL design, big data technologies, and large-scale on-prem-to-cloud migrations using the Azure data stack.

Mandatory Skills: Azure Databricks, Azure Data Factory, Python, PySpark
Preferred Skills: Spark, Kafka; Azure Synapse, Azure SQL, Azure Data Lake, Azure Cosmos DB; batch and real-time ingestion

Posted 2 weeks ago


8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals / major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning of RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions (systems, shared infrastructure services, applications) to better match business outcome objectives
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership to the design, development, and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions and identify improvements, options, and trade-offs to define target-state solutions
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly
- Evaluate and recommend solutions to integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
- Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
- Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view
- Identify problem areas, perform root-cause analysis of architectural designs and solutions, and provide relevant solutions
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
- Identify implementation risks and potential impacts

2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process, and structural risks and prepare a risk-mitigation plan for all projects
- Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of the enterprise framework and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
- Ensure optimal client engagement: support the pre-sales team in presenting the entire solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; and demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

3. Competency building and branding
- Ensure completion of necessary trainings and certifications
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through the highest analyst rankings, client testimonials, and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Databricks - Data Engineering

Experience: 8-10 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we.

Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 2 weeks ago




Featured Companies