12.0 - 17.0 years
17 - 22 Lacs
Noida
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice under the RMI – Optum Advisory umbrella. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.
Primary Responsibilities:
Design and implement secure, scalable, and cost-effective cloud data architectures using cloud services such as Azure Data Factory (ADF), Azure Databricks, Azure Storage, Key Vault, Snowflake, Synapse Analytics, MS Fabric/Power BI, etc.
Define and lead data & cloud strategy, including migration plans, modernization of legacy systems, and adoption of new cloud capabilities
Collaborate with clients to understand business requirements and translate them into optimal cloud architecture solutions, balancing performance, security, and cost
Evaluate and compare cloud services (e.g., Databricks, Snowflake, Synapse Analytics) and recommend the best-fit solutions based on project needs and organizational goals
Lead the full lifecycle of data platform and product implementations, from planning and design to deployment and support
Drive cloud migration initiatives, ensuring smooth transition from on-premises systems while engaging and upskilling existing teams
Lead and mentor a team of cloud and data engineers, fostering a culture of continuous learning and technical excellence
Plan and guide the team in building Proof of Concepts (POCs), exploring new cloud capabilities, and validating emerging technologies
Establish and maintain comprehensive documentation for cloud setup processes, architecture decisions, and operational procedures
Work closely with internal and external stakeholders to gather requirements, present solutions, and ensure alignment with business objectives
Ensure all cloud solutions adhere to security best practices, compliance standards, and governance policies
Prepare case studies and share learnings from implementations to build organizational knowledge and improve future projects
Build and analyze data engineering processes, acting as an SME to troubleshoot performance issues and suggest improvements
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, Maven, etc.
Build a test framework for Databricks notebook jobs so they can be tested automatically before code deployment
Continuously explore new Azure services and capabilities; assess their applicability to business needs
Create detailed documentation for cloud processes, architecture, and implementation patterns
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
Identify solutions to non-standard requests and problems
Mentor and support existing on-prem developers in the cloud environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
Undergraduate degree or equivalent experience
12+ years of overall experience in Data & Analytics engineering
10+ years of solid experience working as an Architect designing data platforms using Azure, Databricks, Snowflake, ADF, Data Lake, Synapse Analytics, Power BI, etc.
10+ years of experience working with a data platform or product using PySpark and Spark-SQL
In-depth experience designing complex Azure architecture for various business needs and the ability to come up with efficient designs and solutions
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
Experience in team leadership and people management
Highly proficient, with hands-on experience in Azure services, Databricks/Snowflake development, etc.
Excellent communication and stakeholder management skills
Preferred Qualifications:
Snowflake, Airflow experience
Power BI development experience
Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
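As a rough illustration of the Databricks notebook test framework mentioned above (a generic sketch, not Optum's actual framework; the function, module, and column names are hypothetical), notebook logic can be factored into plain Python functions and exercised with pytest against a local SparkSession before deployment:

```python
# Hypothetical notebook logic under test, factored out of the notebook so it can
# be imported by both the Databricks job and the test suite.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_claim_year(df):
    """Derive claim_year from claim_date (illustrative transformation)."""
    return df.withColumn("claim_year", F.year(F.col("claim_date")))


@pytest.fixture(scope="session")
def spark():
    # A local SparkSession stands in for the Databricks cluster during CI runs.
    return (SparkSession.builder
            .master("local[2]")
            .appName("notebook-job-tests")
            .getOrCreate())


def test_add_claim_year(spark):
    df = (spark.createDataFrame([("C1", "2024-03-15")], ["claim_id", "claim_date"])
          .withColumn("claim_date", F.to_date("claim_date")))
    result = add_claim_year(df).collect()[0]
    assert result["claim_year"] == 2024
```

A CI job in Jenkins or GitHub Actions, as listed above, could run this suite before promoting the notebook code.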
Posted 3 weeks ago
7.0 - 12.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.
Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools & capabilities in Azure
Design and develop Azure Databricks processes using PySpark/Spark-SQL
Design and develop orchestration jobs using ADF and Databricks Workflows
Analyze data engineering processes as they are developed, acting as an SME to troubleshoot performance issues and suggest improvements
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build a test framework for Databricks notebook jobs so they can be tested automatically before code deployment
Design and build POCs to validate new ideas, tools, and architectures in Azure
Continuously explore new Azure services and capabilities; assess their applicability to business needs
Create detailed documentation for cloud processes, architecture, and implementation patterns
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
Ensure solutions adhere to security, compliance, and governance standards
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
Identify solutions to non-standard requests and problems
Support and maintain the self-service BI warehouse
Mentor and support existing on-prem developers in the cloud environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
Undergraduate degree or equivalent experience
7+ years of overall experience in Data & Analytics engineering
5+ years of experience working with Azure, Databricks, ADF, and Data Lake
5+ years of experience working with a data platform or product using PySpark and Spark-SQL
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
Highly proficient in Python and SQL
Proven excellent communication skills
Preferred Qualifications:
Snowflake, Airflow experience
Power BI development experience
Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
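To make the ingestion and PySpark/Spark-SQL responsibilities above concrete, here is a minimal, hedged sketch (storage paths, container names, and columns are assumptions) of a Databricks batch step that reads raw files, applies a Spark-SQL transformation, and writes a Delta output that an ADF or Databricks Workflow job could schedule:

```python
from pyspark.sql import SparkSession

# On Databricks the SparkSession is provided; getOrCreate() also works locally.
spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/members/"          # assumed landing zone
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/members/"  # assumed output

# Read the raw CSV files landed by an upstream ingestion process.
raw = spark.read.option("header", "true").csv(raw_path)
raw.createOrReplaceTempView("members_raw")

# Spark-SQL transformation: basic cleansing and typing.
curated = spark.sql("""
    SELECT member_id,
           upper(trim(state))        AS state,
           to_date(enrollment_date)  AS enrollment_date
    FROM members_raw
    WHERE member_id IS NOT NULL
""")

# Persist as Delta so downstream jobs and BI tools can consume it.
curated.write.format("delta").mode("overwrite").save(curated_path)
```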
Posted 3 weeks ago
3.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of the devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future.
What We Offer: Location: Bangalore, IND. At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied.
Key Responsibilities:
Supports the design and development of program methods, processes, and systems to consolidate and analyze structured and unstructured, diverse "big data" sources.
Interfaces with internal customers for requirements analysis and compiles data for scheduled or special reports and analysis.
Supports project teams to develop analytical models, algorithms and automated processes, applying SQL understanding and Python programming, to cleanse, integrate and evaluate large datasets.
Supports the timely development of products for manufacturing and process information by applying sophisticated data analytics.
Able to quickly understand requirements and turn them into executive-level presentation slides.
Participates in the design, development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. used to drive key business decisions.
Strong business & financial (P&L) acumen; able to understand key themes, financial terms and data points to create appropriate summaries.
Works with the business intelligence manager and other staff to assess various reporting needs.
Analyzes reporting needs and requirements, assesses current reporting in the context of strategic goals and devises plans for delivering the most appropriate reporting solutions to users.
Qualification:
Bachelor's/Master's degree or relevant experience
7-12 years of experience as a data analyst
Required technical skills in SQL, Azure, Python, Databricks, Tableau (good to have)
PowerPoint and Excel expertise
Experience in the Supply Chain domain
Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline and basic knowledge of related disciplines.
Business Expertise: Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate them in the market.
Leadership: Acts as a resource for colleagues with less experience; may lead small projects with manageable risks and resource requirements.
Problem Solving: Solves complex problems; takes a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information.
Impact: Impacts a range of customer, operational, project or service activities within own team and other related teams; works within broad guidelines and policies.
Interpersonal Skills: Explains difficult or sensitive information; works to build consensus.
Additional Information: Time Type: Full time. Employee Type: Assignee / Regular. Travel: Yes, 20% of the Time. Relocation Eligible: Yes.
Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 3 weeks ago
8.0 - 13.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of the devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future.
What We Offer: Location: Bangalore, IND. At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied.
Key Responsibilities:
Provide technical support for applications built using .Net as well as Angular, React and other open-source technologies.
Troubleshoot and resolve issues related to the front end, APIs and backend services.
Collaborate with development teams to understand and resolve technical issues.
Assist in the deployment and maintenance of software applications.
Ensure the performance, quality, and responsiveness of applications and apply permanent fixes to critical and recurring issues.
Help maintain code quality, organization, and automation.
Perform design reviews with the respective development teams for critical applications and provide input.
Document support processes and solutions for future reference.
Stay up to date with the latest industry trends and technologies.
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
8+ years of experience in software development and support.
Strong proficiency in .Net, Angular, and React; proficient in Python for backend support.
Familiarity with the Hadoop ecosystem as well as Databricks.
Experience with RESTful APIs and web services.
Solid understanding of front-end technologies, including HTML5, CSS3, and JavaScript, as well as Azure and AWS.
Strong background in SQL Server and other relational databases.
Familiarity with version control systems (e.g., Git) as well as Atlassian products for software development and code deployment mechanisms/DevOps.
Best practices for hosting applications on containerized platforms like OCP (on-prem and cloud), etc.
Experience with open-source projects and contributions.
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork skills.
Certifications in relevant areas, especially Microsoft, will be a plus.
Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline; knowledge of the semiconductor industry is nice to have.
Interpersonal Skills: Explains difficult or sensitive information; works to build consensus.
Additional Information: Time Type: Full time. Employee Type: Assignee / Regular. Travel: Yes, 10% of the Time. Relocation Eligible: Yes.
Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 3 weeks ago
6.0 - 11.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Bachelor's degree (Computer Science), Master's degree, Technical Diploma, or equivalent
At least 8 years of experience in a similar role
At least 5 years of experience on AWS and/or Azure
At least 5 years of experience on Databricks
At least 5 years of experience on multiple Azure and AWS PaaS solutions: Azure Data Factory, MSSQL, Azure Storage, AWS S3, Cognitive Search, Cosmos DB, Event Hub, AWS Glue
Strong knowledge of AWS and Azure architecture design best practices
Knowledge of ITIL & Agile methodologies (certifications are a plus)
Experience working with DevOps tools such as Git, CI/CD pipelines, Ansible, Azure DevOps
Knowledge of Airflow and Kubernetes is an added advantage
Solid understanding of networking/security and Linux
Business-fluent English is required
Curious to continuously learn and explore new approaches/technologies
Able to work under pressure in a multi-vendor and multi-cultural team
Flexible, agile and adaptive to change
Customer-focused approach
Good communication skills
Analytical mind-set
Innovation
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
As a Data Engineer, you will ensure the smooth functioning of our applications and data systems. Your expertise in Data Ingestion, Release Management, Monitoring, Incident Review, Databricks, Azure Cloud, and Data Analysis will be instrumental in maintaining the reliability, availability, and performance of our applications and data pipelines. You will collaborate closely with cross-functional teams to support application deployments, monitor system health, analyze data, and provide timely resolutions to incidents. The ideal candidate should have a strong background in Azure DevOps, Azure Cloud (especially ADF), Databricks, and AWS Cloud.
List of Key Responsibilities:
Implement and manage data ingestion processes to acquire data from various sources and ensure its accuracy and completeness in our systems.
Collaborate with development and operations teams to facilitate the release management process, ensuring successful and efficient deployment of application updates and enhancements.
Monitor the performance and health of applications and data pipelines, promptly identifying and addressing any anomalies or potential issues.
Respond to incidents and service requests in a timely manner, conducting thorough incident reviews to identify root causes and implementing effective solutions to prevent recurrence.
Utilize Databricks and monitoring tools to analyze application logs, system metrics, and data to diagnose and troubleshoot issues effectively.
Analyze data-related issues, troubleshoot data quality problems, and propose solutions to optimize data workflows.
Utilize Azure Cloud services to deploy and manage applications and data infrastructure efficiently.
Document incident reports, resolutions, and support procedures for knowledge sharing and future reference.
Continuously improve support processes and workflows to enhance efficiency, minimize downtime, and improve the overall reliability of applications and data systems.
Stay up to date with the latest technologies and industry best practices related to application support, data analysis, and cloud services.
Technical Knowledge:
Technology | Level of expertise | Priority | Must / Nice to have
Scala | - | - | X
Spark | - | - | X
Azure Cloud | Senior | yes | X
AWS Cloud | - | - | X
Python | - | - | X
Databricks | Senior | yes | X
ADF | - | yes | X
Rstudio/Rconnect | Junior | - | X
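For the monitoring and incident-review duties above, a simple Databricks-style log analysis might look like the sketch below (the log path and schema are assumptions, not the team's actual telemetry): it counts errors per pipeline per hour so an anomaly window can be spotted before root-cause analysis.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Assumed location of JSON-formatted pipeline logs exported to the data lake.
logs = spark.read.json("abfss://logs@examplelake.dfs.core.windows.net/pipelines/")

error_trend = (logs
               .filter(F.col("level") == "ERROR")
               .withColumn("hour", F.date_trunc("hour", F.to_timestamp("timestamp")))
               .groupBy("pipeline_name", "hour")
               .count()
               .orderBy(F.desc("count")))

error_trend.show(20, truncate=False)
```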
Posted 3 weeks ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Immediate Openings: DotNet Developer - Bangalore - Contract
Skill: DotNet Developer. Notice Period: Immediate. Employment Type: Contract.
Job Description:
Bachelor's degree in Computer Science, Information Systems, or another relevant subject area, or equivalent experience
8-10+ years of experience with .NET Framework, .NET Core, ASP.NET, VB.NET, HTML, web services, Web API, SharePoint, Power Automate, Microsoft apps, MySQL, and SQL Server
Client and server architecture; maintaining the code base via GitHub would be an added benefit
Robust SQL knowledge, such as complex nested queries, procedures and triggers
Good-to-have skills from a data tools perspective: PySpark, Athena, Databricks, and AWS Redshift technologies to analyse and bring data into the Data Lake; knowledge of building reports and Power BI
Good knowledge of business processes, preferably knowledge of related modules and strong cross-modular skills incl. interfaces
Expert application and customizing knowledge of the standard software used and other regional solutions in the assigned module
Ability to absorb sophisticated technical information and communicate effectively to both technical and business audiences
Knowledge of applicable data privacy practices and laws
Posted 3 weeks ago
7.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Ahmedabad, Delhi / NCR
Hybrid
Lead Data Engineer (Databricks)
Experience: 7-10 Years. Salary: Competitive. Preferred Notice Period: Within 30 Days. Opportunity Type: Hybrid (Ahmedabad). Placement Type: Permanent.
(*Note: This is a requirement for one of Uplers' Clients)
Must have skills: Databricks, SQL OR Python, ETL tools OR Data Modelling OR Data Warehousing
Inferenz (one of Uplers' clients) is looking for:
About Inferenz: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement and helps navigate industries with data, cloud and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a greater competitive advantage by leveraging various next-generation technologies. Our technology expertise has helped us deliver innovative solutions in key industries such as Healthcare & Life Sciences, Consumer & Retail, Financial Services and Emerging industries.
Our main capabilities and solutions: Data Strategy & Architecture, Data & Cloud Migration, Data Quality & Governance, Data Engineering, Predictive Analytics, Machine Learning/Artificial Intelligence, Generative AI.
Specialties: Data and Cloud Strategy, Data Modernization, On-Premise to Cloud Migration, SQL to Snowflake Migration, Hadoop to Snowflake Migration, Cloud Data Platform and Warehouses, Data Engineering and Pipeline, Data Virtualization, Business Intelligence, Data Democratization, Marketing Analytics, Attribution Modelling, Machine Learning, Computer Vision, Natural Language Processing and Augmented Reality.
Job Description
Key Responsibilities:
Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure.
Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions.
Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed.
Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation.
Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions.
Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth.
Stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.
Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
7 to 10 years of experience in data engineering, with a focus on Databricks.
Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory/AWS Glue.
Proficiency in SQL and programming languages such as Python or Scala.
Strong understanding of data modelling, ETL processes, and Data Warehousing/Data Lakehouse concepts.
Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker.
Excellent analytical, problem-solving, and communication skills.
Demonstrated leadership ability with experience mentoring and guiding junior team members.
Preferred Qualifications:
Experience with Generative AI technologies and their applications.
Familiarity with other cloud platforms, such as AWS or GCP.
Knowledge of data governance frameworks and tools.
How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in to our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
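One pattern a Databricks lead would typically standardize for the ETL pipelines described in this posting is an incremental upsert into Delta Lake. The sketch below is illustrative only (paths and the join key are assumptions) and requires the delta-spark package or a Databricks runtime:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Assumed incremental batch produced by an ADF copy activity or an upstream job.
updates = spark.read.parquet("/mnt/landing/customers_changed/")

# Assumed existing curated Delta table.
target = DeltaTable.forPath(spark, "/mnt/curated/customers")

# MERGE keeps the curated table in sync: update matching rows, insert new ones.
(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```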
Posted 3 weeks ago
4.0 - 8.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description
We are seeking a highly skilled and motivated Data Scientist with 5+ years of experience to join our team. The ideal candidate will bring strong data science, programming, and data engineering expertise, along with hands-on experience in generative AI, large language models, and modern LLM application frameworks. This role also demands excellent communication and stakeholder management skills to collaborate effectively across business units.
Skills
Must have:
Experience: 5+ years of industry experience as a Data Scientist, with a proven track record of delivering impactful, data-driven solutions.
Programming Skills: Advanced proficiency in Python, with extensive experience writing clean, efficient, and maintainable code. Proficiency with version control tools such as Git.
Data Engineering: Strong working proficiency with SQL and distributed computing with Apache Spark.
Cloud Platforms: Experience building and deploying apps on Azure Cloud.
Generative AI & LLMs: Practical experience with large language models (e.g., OpenAI, Anthropic, HuggingFace). Knowledge of Retrieval-Augmented Generation (RAG) techniques and prompt engineering is expected.
Machine Learning & Modeling: Strong grasp of statistical modeling, machine learning algorithms, and tools like scikit-learn, XGBoost, etc.
Stakeholder Engagement: Excellent communication skills with a demonstrated ability to interact with business stakeholders, understand their needs, present technical insights clearly, and drive alignment across teams.
Tools and libraries: Proficiency with libraries like Pandas, NumPy, and ML lifecycle tools such as MLflow.
Team Collaboration: Proven experience contributing to agile teams and working cross-functionally in fast-paced environments.
Nice to have:
Hands-on experience with Databricks and Snowflake.
Hands-on experience building LLM-based applications using agentic frameworks like LangChain, LangGraph, and AutoGen.
Familiarity with data visualization platforms such as Power BI, Tableau, or Plotly.
Front-end/full-stack development experience.
Exposure to MLOps practices and model deployment pipelines in production.
Other: Languages: English C2 Proficient. Seniority: Regular.
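Since the role calls for RAG and prompt-engineering experience, here is a deliberately simplified, self-contained sketch of just the retrieval-and-prompt-assembly step: it uses TF-IDF in place of a vector database and stops short of the actual LLM call. The documents and question are toy examples, not client data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy knowledge base standing in for indexed documents.
documents = [
    "Claims over $10,000 require a second-level review before payment.",
    "Members can change their primary care provider once per quarter.",
    "Prior authorization is needed for out-of-network imaging services.",
]
question = "When does a claim need a second review?"

# Retrieve the most relevant document by cosine similarity over TF-IDF vectors.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
scores = cosine_similarity(vectorizer.transform([question]), doc_vectors).ravel()
best_doc = documents[scores.argmax()]

# Assemble the grounded prompt that would then be sent to an LLM.
prompt = (
    "Answer the question using only the context below.\n"
    f"Context: {best_doc}\n"
    f"Question: {question}"
)
print(prompt)
```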
Posted 3 weeks ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description
Luxoft has been asked to contract a Developer in support of a number of customer initiatives. The primary objective is to develop based on client requirements in the Telecom/network work environment.
Responsibilities
A Data Engineer with experience in the following technologies: Databricks and Azure; Apache Spark-based processing; hands-on Python, SQL, and Apache Airflow. Databricks clusters for ETL processes. Integration with ADLS and Blob Storage. Efficiently ingest data from various sources, including on-premises databases, cloud storage, APIs, and streaming data. Use Azure Key Vault for managing secrets. Hands-on experience working with APIs. Hands-on experience with Kafka/Azure Event Hub streaming. Hands-on experience with Databricks Delta APIs and Unity Catalog. Hands-on experience working with version control tools (GitHub). Data Analytics: supports various ML frameworks; integration with Databricks for model training. On-prem: exposure to Linux-based systems and Unix scripting.
Skills
Must have: Python, Apache Airflow, Microsoft Azure and Databricks, SQL, Databricks clusters for ETL, ADLS, Blob Storage, ingestion from various sources including databases and cloud storage, APIs and streaming data, Kafka/Azure Event Hub, Databricks Delta APIs and Unity Catalog.
Education: Typically, a Bachelor's degree in Computer Science (preferably an M.Sc. in Computer Science), Software Engineering, or a related field is required.
Experience: 7+ years of experience in development or related fields.
Problem-Solving Skills: Ability to troubleshoot and resolve issues related to application development and deployment.
Communication Skills: Ability to effectively communicate technical concepts to team members and stakeholders. This includes written and verbal communication.
Teamwork: Ability to work effectively in teams with diverse individuals and skill sets.
Continuous Learning: Given the rapidly evolving nature of web technologies, a commitment to learning and adapting to new technologies and methodologies is crucial.
Nice to have: Snowflake, PostgreSQL, Redis exposure; GenAI exposure; good understanding of RBAC.
Other: Languages: English C2 Proficient. Seniority: Senior.
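The Kafka/Azure Event Hub and Delta requirements above usually come together as a Structured Streaming job. The following is a hedged sketch (broker address, topic, schema, and paths are assumptions; Event Hubs can be consumed through its Kafka-compatible endpoint):

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Assumed payload schema for the telemetry events.
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker.example.com:9093")  # assumed endpoint
          .option("subscribe", "network-telemetry")                      # assumed topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Append parsed events to a Delta table; the checkpoint makes the stream fault tolerant.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/network-telemetry")  # assumed
         .outputMode("append")
         .start("/mnt/delta/network_telemetry"))                              # assumed
```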
Posted 3 weeks ago
3.0 - 6.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
A DevOps Support Engineer will perform tasks related to data pipeline work and provide monitoring and support related to job execution, data movement, and on-call support. In addition, deployed pipeline implementations will be tested for production validation.
Responsibilities
Provide first-tier production support, including after-hours and on-call support. The candidate will eventually develop into more data engineering within the Network Operations team. The selected resource will learn the Telecommunications domain while also developing data engineering skills.
Skills
Must have: ETL pipelines, data engineering, data movement/monitoring; Azure Databricks; Watchtower; automation tools; testing.
Nice to have: Data Engineering.
Other: Languages: English C2 Proficient. Seniority: Regular.
Posted 3 weeks ago
5.0 - 9.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Project description
Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment we are searching for a Senior Data Scientist with a Databricks and Predictive Analytics focus.
Responsibilities
Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML.
Build end-to-end ML pipelines (data ingestion, feature engineering, model training, deployment) on the Databricks Lakehouse.
Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking.
Collaborate with engineering teams to operationalize models (batch/real-time) using Databricks Jobs or REST APIs.
Implement Delta Lake for scalable, ACID-compliant data workflows.
Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions.
Troubleshoot issues in Spark jobs and the Databricks environment.
The client is in the USA. The candidate should be able to work until 11:00 am EST to overlap a few hours with the client and be able to attend meetings.
Skills
Must have:
5+ years in predictive analytics, with expertise in regression, classification, and time-series modeling.
Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark.
Familiarity with MLflow, Feature Store, and Unity Catalog for governance.
Industry experience in Life Insurance or P&C.
Skills: Python, PySpark, MLflow, Databricks AutoML; Predictive Modelling (classification, clustering, regression, time series and NLP); Cloud platform (Azure/AWS), Delta Lake, Unity Catalog.
Nice to have
Certifications: Databricks Certified ML Practitioner.
Other: Languages: English C1 Advanced. Seniority: Senior.
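As an illustration of the model-development loop described above (synthetic data and an assumed experiment path, not the client's workload), a churn-style classifier can be trained and tracked with MLflow so the run can later be compared, registered, or deployed from Databricks:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced dataset standing in for policyholder churn data.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.85], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

mlflow.set_experiment("/Shared/churn-poc")  # assumed experiment path

with mlflow.start_run():
    params = {"n_estimators": 200, "learning_rate": 0.05, "max_depth": 3}
    model = GradientBoostingClassifier(**params).fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    # Log parameters, the evaluation metric, and the fitted model for later use.
    mlflow.log_params(params)
    mlflow.log_metric("test_auc", auc)
    mlflow.sklearn.log_model(model, "model")
```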
Posted 3 weeks ago
6.0 - 11.0 years
8 - 14 Lacs
Pune
Work from Office
Responsibilities:
Design, develop, and maintain scalable data pipelines using Databricks, PySpark, Spark SQL, and Delta Live Tables.
Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
Implement best practices for data engineering, including data quality and data security.
Optimize and troubleshoot complex data workflows to ensure high performance and reliability.
Develop and maintain documentation for data engineering processes and solutions.
Requirements:
Bachelor's or Master's degree.
Proven experience as a Data Engineer, with a focus on Databricks, PySpark, Spark SQL, and Delta Live Tables.
Strong understanding of data warehousing concepts, ETL processes, and data modelling.
Proficiency in programming languages such as Python and SQL.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
Excellent problem-solving skills and the ability to work in a fast-paced environment.
Strong leadership and communication skills, with the ability to mentor and guide team members.
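Because the role centres on Delta Live Tables, here is a minimal DLT sketch of the bronze/silver pattern (table names, paths, and the quality rule are assumptions). The dlt module is only available when the file is attached to a Databricks DLT pipeline, so treat this as pipeline source code rather than a standalone script:

```python
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw orders landed from cloud storage (bronze).")
def orders_bronze():
    # `spark` is provided by the pipeline runtime; the path is an assumed landing zone.
    return spark.read.option("header", "true").csv("/mnt/raw/orders/")


@dlt.table(comment="Cleaned, de-duplicated orders (silver).")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (dlt.read("orders_bronze")
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropDuplicates(["order_id"]))
```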
Posted 3 weeks ago
6.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a skilled and experienced Cognos and Informatica Administrator to join our team. You will be responsible for the installation, configuration, maintenance, and support of Cognos and Informatica software in our organization. Your role will involve collaborating with cross-functional teams, resolving system issues, and ensuring the smooth functioning of the Cognos and Informatica environments.
Role Scope and Deliverables:
Responsibilities:
Install, configure, and upgrade Cognos and Informatica application components, including servers, clients, and related tools.
Monitor and maintain the performance, availability, and security of Cognos and Informatica environments.
Collaborate with developers, business analysts, and other stakeholders to understand requirements and provide technical guidance.
Troubleshoot and resolve issues related to Cognos and Informatica applications, databases, servers, and integrations.
Perform system backups, disaster recovery planning, and implementation.
Implement and enforce best practices for Cognos and Informatica administration, security, and performance tuning.
Manage user access, roles, and permissions within Cognos and Informatica environments.
Coordinate with vendors for product support, patches, upgrades, and license management.
Stay up to date with the latest trends and advancements in Cognos and Informatica technologies.
Document technical processes, procedures, and configurations.
Nice-to-Have Skills:
Development Skills: Familiarity with Cognos Report Studio, Framework Manager, Informatica PowerCenter, and other development tools to assist in troubleshooting and providing guidance to developers and users.
Power BI Experience: Practiced in designing and building dashboards in Power BI, or Power BI administration experience.
Microsoft SQL Server Analysis Services (SSAS) Experience: Install, configure, and maintain Microsoft SQL Server Analysis Services (SSAS) environments. Proven knowledge as a Microsoft SQL Server Analysis Services (SSAS) administrator.
Databricks Experience: Knowledge of Databricks and a strong understanding of its architecture, capabilities, and best practices.
Key Skills and Requirements:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience as a Cognos and Informatica Administrator or in a similar role.
Solid understanding of Cognos and Informatica installation, configuration, and administration.
Familiarity with relational databases, SQL, and data warehousing concepts.
Excellent troubleshooting and problem-solving skills.
Ability to work independently and collaboratively in a team environment.
Strong communication and interpersonal skills.
Attention to detail and ability to prioritize tasks effectively.
Posted 3 weeks ago
6.0 - 10.0 years
1 - 2 Lacs
Hyderabad
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations.
Job Title: Senior Data Engineer
Experience: 6 to 10 years
Key Responsibilities:
Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
Collaborate with Product Management and business partners to understand use case requirements and reporting.
Adhere to internal development best practices/lifecycle (e.g., testing, code reviews, CI/CD, documentation).
Document and showcase feature designs/workflows.
Participate in team meetings and discussions around product development.
Stay up to date on the latest industry trends and design patterns.
Technical Skills:
Bachelor's in Computer Science, Computer Engineering or a related field
4+ years of development experience with Spark (PySpark), Python and SQL
Extensive knowledge of building data pipelines
Hands-on experience with Databricks development
Strong experience developing on Linux OS
Experience with scheduling and orchestration (e.g., Databricks Workflows, Airflow, Prefect, Control-M)
Solid understanding of distributed system data structures and design principles
Comfortable communicating with teams via showcases/demos
Agile development methodologies (e.g., SAFe, Kanban, Scrum)
Notice period: Immediate
Location: Hyderabad
Mode of Work: WFO (Work From Office)
Thanks & Regards, Narmadha S, Black and White Business Solutions Pvt. Ltd., Bangalore, Karnataka, INDIA. Contact Number: 8067432451. Narmadha.s@blackwhite.in | www.blackwhite.in
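For the scheduling and orchestration requirement above, a minimal Airflow TaskFlow sketch is shown below (Airflow 2.4+ for the schedule argument; the DAG, task, and batch names are assumptions, and in a Databricks-centric setup the same dependency could instead be expressed as a Databricks Workflow):

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="0 2 * * *", start_date=datetime(2024, 1, 1), catchup=False, tags=["lake-modernization"])
def nightly_lake_refresh():
    @task
    def extract() -> str:
        # Placeholder for pulling a batch from the on-premises data lake (assumed source).
        return "raw_batch_2024_01_01"

    @task
    def load_to_databricks(batch_id: str) -> None:
        # Placeholder for triggering the Databricks job that processes the batch.
        print(f"Submitting Databricks run for {batch_id}")

    load_to_databricks(extract())


nightly_lake_refresh()
```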
Posted 3 weeks ago
5.0 - 10.0 years
15 - 19 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
S&C Global Network - AI - Prompt Engineering - DS Insurance - Consultant
Entity: Accenture Strategy & Consulting. Team: Global Network Data & AI. Practice: Insurance Analytics. Title: Ind & Func AI Decision Science Consultant. Job location: Bangalore/Gurgaon/Mumbai/Hyderabad/Pune/Chennai.
About S&C - Global Network: The Accenture Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
The S&C - GN - Insurance Data & AI Practice helps our clients grow their business in entirely new ways. From strategy to execution, Accenture works with Property & Casualty insurers, Life & Retirement insurers, reinsurers and brokers across the value chain, from Underwriting to Claims to Servicing and Enterprise Functions, to develop analytic capabilities, from accessing and reporting on data to predictive modelling to Generative AI, that outperform the competition. We offer deep technical expertise in AI/ML tools, techniques & methods, along with strong strategy & consulting acumen and insurance domain knowledge. Our unique assets & accelerators coupled with diverse insurance insights and capabilities help us bring exceptional value to our clients.
WHAT'S IN IT FOR YOU?
Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth, and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth.
What you would do in this role:
Design, create, validate and refine prompts for Large Language Models (LLMs) for different client problems.
Employ techniques to guide and enhance model responses.
Develop effective AI interactions through proficient programming and utilization of playgrounds.
Utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality.
Interface with clients/account teams to understand engineering/business problems and translate them into analytics problems that shall deliver insights for action and operational improvements.
Consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences.
Mentor junior prompt engineers in both technical and softer aspects of the role.
Qualification - Who we are looking for:
5+ years of experience in data-driven techniques, including exploratory data analysis, data pre-processing, and machine learning to solve business problems.
Bachelor's/Master's degree in Mathematics, Statistics, Economics, Computer Science, or a related field.
Solid foundation in Statistical Modeling, Machine Learning algorithms, GenAI, LLMs, RAG architecture and LangChain frameworks.
Proficiency in programming languages such as Python, PySpark, SQL or Scala.
Strong communication and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders.
In-depth knowledge and hands-on experience with Azure, AWS or Databricks tools. Relevant certifications in Azure are highly desirable.
Prior Insurance industry experience is preferred.
Posted 3 weeks ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad
Work from Office
6+ years of experience in data engineering projects using Cosmos DB and Azure Databricks (minimum 3-5 projects)
Strong expertise in building data engineering solutions using Azure Databricks and Cosmos DB
Strong T-SQL programming skills, or skills with any other flavor of SQL
Experience working with high-volume data, large objects, and complex data transformations
Experience working in DevOps environments integrated with Git for version control and CI/CD pipelines
Good understanding of data modelling for data warehouses and data marts
Strong verbal and written communication skills
Ability to learn, contribute and grow in a fast-paced environment
Nice to have:
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, ADLS Gen2, Azure Event Hubs
Experience using Jira and ServiceNow in project environments
Experience in implementing data warehouse and ETL solutions
Posted 3 weeks ago
8.0 - 12.0 years
55 - 75 Lacs
Pune
Work from Office
About Position: We are looking for a Python Developer to support development. The candidate selected for this position will work in the IT Derivatives/Fixed Income team, focusing on the design and implementation of Python solutions for Front Office applications. These applications will utilize vendor software, including PyXLL, QuestDB, ArconTech, and Databricks.
Role: Python Developer. Location: All PSL Locations. Experience: 8+ years. Job Type: Full-Time Employment.
What You'll Do:
Design, build, and configure Python applications according to business process and application requirements. This involves working closely with stakeholders to gather and analyze requirements. The candidate will be expected to ensure the code is efficient, maintainable, and adheres to best practices.
Collaborate with multiple business teams, such as trading and quantitative analysts, to understand their workflows, challenges, and requirements. Regular meetings, feedback sessions, and iterative development cycles will be part of this collaborative effort.
Provide work estimates as required, including time, resources, and potential risks. The candidate will be responsible for communicating these estimates to project managers and adjusting them as necessary throughout the development lifecycle.
Develop coding architecture for new applications and features. This task includes selecting appropriate design patterns, writing modular and reusable code, and documenting the architecture for future reference.
Maintain and enhance code quality using static and dynamic code analyses, security vulnerability scans, code coverage, and CI/CD pipeline gating. Regular code reviews, automated testing, and continuous integration practices will be integral to maintaining high code quality standards.
Participate in all phases of the software delivery lifecycle, from analysis through support. This means engaging in initial requirement gathering and analysis, contributing to design and development, performing testing and quality assurance, and providing post-deployment support.
Expertise You'll Bring:
Bachelor's degree in Computer Science, Information Systems, or a related field.
8+ years of experience in Python development.
Experience with Python environment installation on Windows and Linux.
Benefits:
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Our company fosters a values-driven and people-centric work environment that enables our employees to:
Accelerate growth, both professionally and personally
Impact the world in powerful, positive ways, using the latest technologies
Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
Unlock global opportunities to work and learn with the industry's best
Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
Posted 3 weeks ago
8.0 - 12.0 years
20 - 30 Lacs
Pune
Remote
8+ years of hands-on experience in software development, with 3+ years on data engineering practices and tooling
• 3+ years working with AWS managed services and cloud-native development
• At least 2 years working with Spark, Python, and SQL using technologies in modern data management and orchestration tooling (e.g., Airflow, Databricks, DBT)
• Professional experience with data structures, relational databases, non-relational/NoSQL databases, ETL processes, and complex relational queries
• Experience with SaaS (Software as a Service) / product development
• Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes)
• Exposure to API development and productization / data delivery at scale through APIs
• Exceptional problem-solving and analytical skills
• Excellent communication and teamwork abilities
• Bachelor's degree in Computer Science, Software Engineering, or a related field (or equivalent practical experience)
Posted 3 weeks ago
4.0 - 7.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Job Summary: We are seeking a skilled and detail-oriented Azure Data Engineer to join our data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and solutions on the Microsoft Azure cloud platform. You will collaborate with data analysts, the reporting team, and business stakeholders to ensure efficient data availability, quality, and governance.
Must-have skills: Strong hands-on experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL.
Good-to-have skills: Working knowledge of Databricks, Azure Synapse Analytics, Azure Functions, Logic App workflows, Log Analytics and Azure DevOps.
Roles and Responsibilities:
Design and implement scalable data pipelines using Azure Data Factory, Azure SQL, Databricks, and other Azure services.
Develop and maintain data lakes and data warehouses on Azure.
Integrate data from various on-premises and cloud-based sources.
Create and manage ETL/ELT processes, ensuring data accuracy and performance.
Optimize and troubleshoot data pipelines and workflows.
Ensure data security, compliance, and governance.
Collaborate with business stakeholders to define data requirements and deliver actionable insights.
Monitor and maintain Azure data services performance and cost-efficiency.
Design, develop, and maintain SQL Server databases and ETL processes.
Write complex SQL queries, stored procedures, functions, and triggers to support application development and data analysis.
Optimize database performance through indexing, partitioning, and other performance tuning techniques.
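As a small, hedged illustration of the SQL-side work above (connection string, table, and stored-procedure names are assumptions, not an existing schema), a Python step can bulk-insert a cleaned batch into Azure SQL and then invoke a merge procedure, the kind of ELT call an ADF pipeline might orchestrate:

```python
import pyodbc

# Assumed Azure SQL connection details.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=analytics;UID=etl_user;PWD=<secret>"
)

rows = [("M001", "2024-03-01", 125.50), ("M002", "2024-03-01", 89.99)]

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.fast_executemany = True
    # Stage the batch with a parameterized insert.
    cursor.executemany(
        "INSERT INTO dbo.stg_member_charges (member_id, charge_date, amount) VALUES (?, ?, ?)",
        rows,
    )
    # A stored procedure (assumed to exist) merges staging into the target table.
    cursor.execute("EXEC dbo.usp_merge_member_charges")
    conn.commit()
```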
Posted 3 weeks ago
7.0 - 12.0 years
16 - 25 Lacs
Hyderabad
Work from Office
Role & Responsibilities
Job Title: Data Engineer
Years of experience: 7 to 12 years (minimum 5 years of relevant experience)
Work Mode: Work From Office - Hyderabad
Notice Period: Immediate to 30 days only
Key Skills: Python, SQL, AWS, Spark, Databricks (mandatory); Airflow (good to have)
Posted 3 weeks ago
7.0 - 10.0 years
27 - 42 Lacs
Pune
Work from Office
Job Summary: We are seeking a highly skilled Sr. Developer with 7 to 10 years of experience to join our dynamic team. The ideal candidate will have expertise in Python, Databricks SQL, Databricks Workflows, and PySpark. This role operates in a hybrid work model with day shifts, offering the opportunity to work on innovative projects that drive our company's success.
Responsibilities:
Develop and maintain scalable data processing systems using Python and PySpark to enhance data analytics capabilities.
Collaborate with cross-functional teams to design and implement Databricks Workflows that streamline data operations.
Optimize Databricks SQL queries to improve data retrieval performance and ensure efficient data management.
Provide technical expertise in Python programming to support the development of robust data solutions.
Oversee the integration of data sources into Databricks environments to facilitate seamless data processing.
Ensure data quality and integrity by implementing best practices in data validation and error handling.
Troubleshoot and resolve complex technical issues related to Databricks and PySpark environments.
Contribute to the continuous improvement of data processing frameworks and methodologies.
Mentor junior developers and provide guidance on best practices in data engineering.
Collaborate with stakeholders to gather requirements and translate them into technical specifications.
Conduct code reviews to ensure adherence to coding standards and best practices.
Stay updated with the latest industry trends and technologies to drive innovation in data engineering.
Document technical processes and workflows to support knowledge sharing and team collaboration.
Qualifications:
Possess strong proficiency in Python programming and its application in data engineering.
Demonstrate expertise in Databricks SQL and its use in optimizing data queries.
Have hands-on experience with Databricks Workflows for efficient data processing.
Show proficiency in PySpark for developing scalable data solutions.
Exhibit excellent problem-solving skills and the ability to troubleshoot complex technical issues.
Have a solid understanding of data integration techniques and best practices.
Display strong communication skills to collaborate effectively with cross-functional teams.
Certifications Required: Databricks Certified Data Engineer Associate; Python Institute PCEP Certification.
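To illustrate the data-validation responsibility above (dataset path, column names, and thresholds are assumptions), a PySpark job can compute simple quality checks and fail fast before bad data reaches downstream consumers:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Assumed curated Delta dataset to validate.
claims = spark.read.format("delta").load("/mnt/curated/claims")

total = claims.count()
null_ids = claims.filter(F.col("claim_id").isNull()).count()
dup_ids = total - claims.dropDuplicates(["claim_id"]).count()

# Each check maps to (observed value, allowed limit).
checks = {
    "null_claim_ids": (null_ids, 0),
    "duplicate_claim_ids": (dup_ids, 0),
}

failures = {name: pair for name, pair in checks.items() if pair[0] > pair[1]}
if failures:
    raise ValueError(f"Data quality checks failed: {failures}")
print("All data quality checks passed.")
```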
Posted 3 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.
Do
Oversee and support the process by reviewing daily transactions on performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequent trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve the issues, escalate them to TA & SES in a timely manner
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous, and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes, and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Deliver (Performance Parameter: Measure)
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability Development: Triages completed, Technical Test performance
Mandatory Skills: DataBricks - Data Engineering
Experience: 5-8 Years
Posted 3 weeks ago
8.0 - 10.0 years
8 - 12 Lacs
Chennai
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership and enable delivery teams to provide exceptional client engagement and satisfaction.
Do
1. Develop architectural solutions for new deals/major change requests in existing deals
Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
Provide solutioning of RFPs received from clients and ensure overall design assurance
Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives
Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
Provide technical leadership to the design, development, and implementation of custom solutions through thoughtful use of modern technology
Define and understand current-state solutions and identify improvements, options, and tradeoffs to define target-state solutions
Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps
Evaluate and recommend solutions to integrate with the overall technology ecosystem
Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view
Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions to the problem
Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture
Track industry and application trends and relate these to planning current and future IT needs
Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
Collaborate with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
Identify implementation risks and potential impacts
2. Enable delivery teams by providing optimal delivery solutions/frameworks
Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
Recommend tools for reuse and automation for improved productivity and reduced cycle times
Lead the development and maintenance of enterprise frameworks and related artefacts
Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
Ensure architecture principles and standards are consistently applied to all projects
Ensure optimal client engagement
Support the pre-sales team while presenting the entire solution design and its principles to the client
Negotiate, manage, and coordinate with client teams to ensure all requirements are met and create an impact with the proposed solution
Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
3. Competency Building and Branding
Ensure completion of necessary trainings and certifications
Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
Attain market referenceability and recognition through highest analyst rankings, client testimonials, and partner credits
Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
Mentor developers, designers, and junior architects in the project for their further career development and enhancement
Contribute to the architecture practice by conducting selection interviews, etc.
4. Team Management
Resourcing: Anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
Talent Management: Ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
Performance Management: Set goals for the team, conduct timely performance reviews, and provide constructive feedback to own direct reports; ensure that Performance Nxt is followed for the entire team
Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team
Mandatory Skills: DataBricks - Data Engineering
Experience: 8-10 Years
Posted 3 weeks ago
8.0 - 10.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to create exceptional architectural solution design and thought leadership and enable delivery teams to provide exceptional client engagement and satisfaction.
Do
1. Develop architectural solutions for new deals/major change requests in existing deals
Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
Provide solutioning of RFPs received from clients and ensure overall design assurance
Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives
Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
Provide technical leadership to the design, development, and implementation of custom solutions through thoughtful use of modern technology
Define and understand current-state solutions and identify improvements, options, and tradeoffs to define target-state solutions
Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps
Evaluate and recommend solutions to integrate with the overall technology ecosystem
Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view
Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions to the problem
Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture
Track industry and application trends and relate these to planning current and future IT needs
Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
Collaborate with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
Identify implementation risks and potential impacts
2. Enable delivery teams by providing optimal delivery solutions/frameworks
Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
Recommend tools for reuse and automation for improved productivity and reduced cycle times
Lead the development and maintenance of enterprise frameworks and related artefacts
Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
Ensure architecture principles and standards are consistently applied to all projects
Ensure optimal client engagement
Support the pre-sales team while presenting the entire solution design and its principles to the client
Negotiate, manage, and coordinate with client teams to ensure all requirements are met and create an impact with the proposed solution
Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
3. Competency Building and Branding
Ensure completion of necessary trainings and certifications
Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
Attain market referenceability and recognition through highest analyst rankings, client testimonials, and partner credits
Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
Mentor developers, designers, and junior architects in the project for their further career development and enhancement
Contribute to the architecture practice by conducting selection interviews, etc.
4. Team Management
Resourcing: Anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
Talent Management: Ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
Performance Management: Set goals for the team, conduct timely performance reviews, and provide constructive feedback to own direct reports; ensure that Performance Nxt is followed for the entire team
Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team
Mandatory Skills: DataBricks - Data Engineering
Experience: 8-10 Years
Posted 3 weeks ago