5.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Role & responsibilities: Data Engineer/Designer with 4-6 years of experience
• Databricks Certified Data Engineer Associate
• Expert SQL development experience (at least 5 years)
• Able to perform data analysis
• Able to analyze SQL scripts to understand the business logic
• Able to analyze Unix jobs
• Strong hands-on experience writing PL/SQL
• Strong hands-on experience with UNIX shell scripting
• Exposure to Control-M scheduling
• Exposure to testing, and able to perform testing efficiently
• Experience with Git and Jenkins
• Experience using GitHub, Atlassian Jira/Confluence, Jenkins (or similar CI tools)
• Banking domain experience (3-5 years)
• Experience deploying/managing data and data warehouse solutions
• Good to have: data engineering technologies (e.g. Spark, Hadoop, Kafka); data warehousing (e.g. SQL, OLTP/OLAP/DSS); understanding of solution development life cycles
Collaborative and persuasive, self-motivated, research-oriented, hands-on, committed, always-on learner. High-performing and diverse team with an unrivalled culture of innovation.
Posted 10 hours ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Employment Type: Contract. Skills: Azure Data Factory, SQL, Azure Blob, Azure Logic Apps.
Posted 10 hours ago
6.0 - 8.0 years
10 - 14 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
Job type: Contract to hire. Co-ordinate with CoreTech for Linux server procurement. Co-ordinate with CoreTech to sunset Linux servers. Co-ordinate with CoreTech to set up DNS & load balancers. Application server setup & maintenance. File transfer protocol setup & support. Disk space management. SSL certificate setup/renewals & maintenance. Log rotation. Monthly patching support. Vulnerability management. Troubleshooting of DeloitteOmnia Agent (NO_CONNECTIVITY), GE Infa agents, PostgreSQL DB and ODBC connection issues, and isql DB connectivity issues for the Databricks unixODBC driver on the DeloitteOmnia server (a connectivity-check sketch follows below). Databricks unixODBC driver installation & configuration for the DeloitteOmnia server. Maintenance and support of DeloitteOmnia infrastructure. DB migration support for applications; we will help the app team in migrating their application jobs. Application job scheduling and JAR deployment support. Automation support for a couple of use cases. App/web service restarts. Production release deployment support. Access (Jenkins, GitHub repo). Monitoring: New Relic synthetic, infrastructure & page-view setup and maintenance; NR password updates for FSSO for multiple monitors; daily monitoring of NR alerts; working on NR monitor issues; maintaining NR Insights & Ops dashboards.
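For the ODBC troubleshooting duties above, a quick connectivity probe often separates driver problems from network or credential problems. A minimal sketch, assuming a unixODBC DSN named "Databricks" is already defined in odbc.ini and a personal access token is used for authentication (both placeholders, not from the posting):

```python
# Minimal connectivity check for a Databricks unixODBC DSN (names are placeholders).
import pyodbc

# Assumes /etc/odbc.ini defines a "Databricks" DSN pointing at the Databricks/Simba driver.
conn = pyodbc.connect("DSN=Databricks;UID=token;PWD=<personal-access-token>", autocommit=True)
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())  # (1,) confirms driver, DSN, and network path are healthy
conn.close()
```

The same DSN can also be exercised with unixODBC's isql tool; failures here but not in isql usually point at the Python environment rather than the driver configuration.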
Posted 10 hours ago
6.0 - 11.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Experience in a cloud platform, e.g., AWS, GCP, Azure, etc. Experience in distributed technology tools, viz. SQL, Spark, Python, PySpark, Scala. Performance tuning: optimize SQL and PySpark for performance. Airflow workflow scheduling tool for creating data pipelines (a DAG sketch follows below). GitHub source control tool, and experience with creating/configuring Jenkins pipelines. Experience in EMR/EC2, Databricks, etc. DWH tools incl. SQL databases, Presto, and Snowflake. Streaming, serverless architecture.
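Since the posting asks for Airflow pipelines, here is a minimal DAG sketch (Airflow 2.4+ style); the DAG ID and job script path are assumptions, not part of the listing:

```python
# A minimal Airflow DAG sketch for a daily PySpark pipeline (illustrative names).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # spark-submit a PySpark job; in practice this could be an EMR or Databricks operator
    run_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command="spark-submit /opt/jobs/sales_etl.py --date {{ ds }}",
    )
```

In a setup matching this posting, the BashOperator would typically be swapped for an EMR or Databricks operator and the DAG deployed via the Jenkins pipeline mentioned above.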
Posted 10 hours ago
10.0 - 15.0 years
17 - 22 Lacs
Hyderabad, Pune
Work from Office
Oracle HCM Security Lead (Strategy). What You'll Do: Lead the transition to RBAC across Oracle HCM (Core HR, Payroll, Absence, Time, Talent) and downstream systems with complex integrations. Architect an end-to-end access governance framework covering application, integration, and data warehouse layers, including Databricks, OAC/OTBI, and third-party data hubs. Define and standardize personas, access tiers, and Areas of Responsibility (AOR) with business process owners. Partner with data platform and analytics teams to align access policies across structured/unstructured data sources used for reporting, workforce intelligence, and ML. Integrate security policies with Okta and identity management tools, ensuring consistent enforcement across apps and data endpoints. Enable secure self-service analytics by implementing column- and row-level security within platforms like OTBI and Databricks, ensuring compliance with SOX, GDPR, and HIPAA (a Unity Catalog sketch follows below). Manage the security lifecycle for Oracle HCM and connected platforms: provisioning, auditing, change control, and SoD enforcement. Serve as the employee & candidate data access security authority, participating in solution design, release planning, and cross-functional governance reviews, consulting with legal, HRBPs, comms, and engineering security where applicable. Basic Qualifications: 10+ years of experience in enterprise security, application governance, or architecture roles with deep expertise in Oracle Fusion HCM and SaaS integration landscapes. Proven experience designing and implementing enterprise RBAC frameworks, with hands-on involvement across apps and data layers. Deep understanding of big data platforms (Databricks, Snowflake, etc.) and how access, classification, and lineage apply in modern data environments. Experience with analytics platform security, including OTBI, OAC, and integration with business intelligence tools. Familiarity with identity federation and access policy integration via Okta, Azure AD, or similar tools. Strong understanding of compliance frameworks (SOX, GDPR, HIPAA) and the ability to translate policies into technical access controls. Skilled communicator, capable of aligning technical security strategy with business priorities and presenting to senior leadership. Preferred Qualifications: Experience with multi-phase Oracle HCM deployments or Workday-to-Oracle transitions. Exposure to data mesh or federated data ownership models. Background in data pipeline security and governance, especially in Databricks, Spark, or similar platforms. Strong knowledge of RACI, persona-based design, and data domain ownership strategies in global organizations. Demonstrated ability to build security into the SDLC, with tools and controls supporting agile SaaS environments.
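The row-level security mentioned above can be expressed in Databricks Unity Catalog roughly as follows; the catalog, schema, table, column, and group names are illustrative only, not taken from the posting:

```python
# Hedged sketch of Unity Catalog row-level security (illustrative names).
# Runs inside a Databricks notebook where `spark` is provided.
spark.sql("""
    CREATE OR REPLACE FUNCTION hr.security.us_rows_only(country STRING)
    RETURN IS_ACCOUNT_GROUP_MEMBER('hr_admins') OR country = 'US'
""")

# Attach the filter so non-admins only see US employee rows.
spark.sql("""
    ALTER TABLE hr.core.employees
    SET ROW FILTER hr.security.us_rows_only ON (country)
""")
```

Column masking follows the same pattern with SET MASK on a column, which is how the OTBI/Databricks self-service analytics requirement above would typically be enforced at the data layer.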
Posted 10 hours ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 11. S&P Global Market Intelligence. The Role: Lead Data Engineer, Application Development. The Team: a collaborative team of database professionals responsible for building and maintaining data products that power our clients. The Impact: designing, implementing, and maintaining database systems for Databricks and SQL Server. What's in it for you: you'll have the opportunity to work with the latest technologies, learn from experienced professionals, and contribute to the success of high-impact projects. Responsibilities: Designing, developing, and implementing database systems, including database schemas, stored procedures, and other database objects. Monitoring database performance and optimizing queries to enhance efficiency. Implementing performance tuning strategies and techniques. Documenting database schemas, configurations, and procedures. Providing support to users and stakeholders on database-related issues. What We're Looking For: Bachelor's/Master's degree in Computer Science, Information Systems or equivalent. Minimum 8 years of strong database development experience. Advanced SQL programming skills; relational and dimensional data modeling. Understanding of database performance tuning on large datasets. Excellent logical, analytical and communication skills are essential, with strong verbal and written proficiency. Experience in conducting application design and code reviews. Proficiency with one or more of the following technologies: object-oriented programming; programming languages (Java, Scala, Python, C#); scripting (Bash, PowerShell). Extensive knowledge of database systems (Databricks, SQL Server, Oracle, Snowflake). Experience working in cloud computing environments such as AWS, GCP and Azure. Exposure to orchestration technologies like Airflow and ETL. Experience with large-scale messaging systems such as Kafka. Knowledge of fundamentals or the financial industry highly preferred. About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What's In It For You: Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people.
That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: health care coverage designed for the mind and body. Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf. 20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 11 hours ago
7.0 - 12.0 years
6 - 10 Lacs
Noida, Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 10. Responsibilities: Work closely with various stakeholders to collect, clean, model and visualise datasets. Create data-driven insights by researching, designing and implementing ML models to deliver insights and implement action-oriented solutions to complex business problems. Drive ground-breaking ML technology within the Modelling and Data Science team. Extract hidden-value insights and enrich the accuracy of the datasets. Leverage technology and automate workflows, creating modernized operational processes aligned with the team strategy. Understand, implement, manage, and maintain analytical solutions and techniques independently. Collaborate and coordinate with data, content and modelling teams and provide analytical assistance for various commodity datasets. Drive and maintain high-quality processes, delivering projects in collaborative Agile team environments. Requirements: 7+ years of programming experience, particularly in Python. 4+ years of experience working with SQL or NoSQL databases. 1+ years of experience working with PySpark. University degree in Computer Science, Engineering, Mathematics, or related disciplines. Strong understanding of big data technologies such as Hadoop, Spark, or Kafka. Demonstrated ability to design and implement end-to-end scalable and performant data pipelines. Experience with workflow management platforms like Airflow. Strong analytical and problem-solving skills. Ability to collaborate and communicate effectively with both technical and non-technical stakeholders. Experience building solutions and working in an Agile environment. Experience working with git or other source control tools. Strong understanding of Object-Oriented Programming (OOP) principles and design patterns. Knowledge of clean-code practices and the ability to write well-documented, modular, and reusable code. Strong focus on performance optimization and writing efficient, scalable code. Nice to have: Experience working with oil, gas and energy markets. Experience working with BI visualization applications (e.g. Tableau, Power BI). Understanding of cloud-based services, preferably AWS. Experience working with unified analytics platforms like Databricks. Experience with deep learning and related toolkits: TensorFlow, PyTorch, Keras, etc. About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the energy transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.
IFTECH202.1 - Middle Professional Tier I (EEO Job Group). Location: Bengaluru, Noida, Uttar Pradesh, Hyderabad
Posted 11 hours ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
4+ years of hands-on experience using Azure Cloud, ADLS, ADF and Databricks. Finance domain data stewardship. Finance data reconciliation with SAP downstream systems. Run and monitor pipelines and validate Databricks notebooks (a sketch follows below). Able to interface with onsite business stakeholders. Hands-on Python and SQL. Knowledge of Snowflake/DW is desirable.
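A minimal sketch of the "run/monitor pipelines" duty using the Azure SDK for Python; the subscription ID, resource group, factory, and pipeline names are placeholders, not from the posting:

```python
# Hedged sketch: trigger and poll an ADF pipeline run with the Azure SDK.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-finance",
    factory_name="adf-finance",
    pipeline_name="pl_reconcile_sap",
)

# Poll until the run leaves the in-progress states.
while True:
    status = client.pipeline_runs.get("rg-finance", "adf-finance", run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print(status)  # Succeeded / Failed / Cancelled
```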
Posted 11 hours ago
6.0 - 8.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Immediate job openings for Big Data Engineer - Pan India - Contract. Experience: 6+ years. Skill: Big Data Engineer. Location: Pan India. Notice Period: Immediate. Employment Type: Contract. Required: PySpark, Azure Databricks, experience with workflows, Unity Catalog, and managed/external data with Delta tables (a sketch of the managed/external distinction follows below).
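The managed vs. external Delta table distinction under Unity Catalog, mentioned above, in a minimal sketch; the catalog, schema, and storage path are illustrative:

```python
# Hedged sketch: managed vs. external Delta tables under Unity Catalog.
# Runs in a Databricks notebook where `spark` is provided.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.orders_managed (
        order_id BIGINT, amount DOUBLE
    )
""")  # managed: Unity Catalog owns the storage location and lifecycle

spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.orders_external (
        order_id BIGINT, amount DOUBLE
    )
    LOCATION 'abfss://data@lakeacct.dfs.core.windows.net/sales/orders'
""")  # external: data stays at the given path; DROP TABLE leaves the files intact
```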
Posted 12 hours ago
12.0 - 17.0 years
17 - 22 Lacs
Noida
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice under the RMI – Optum Advisory umbrella. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions. Primary Responsibilities: Design and implement secure, scalable, and cost-effective cloud data architectures using cloud services such as Azure Data Factory (ADF), Azure Databricks, Azure Storage, Key Vault, Snowflake, Synapse Analytics, MS Fabric/Power BI, etc. Define and lead data & cloud strategy, including migration plans, modernization of legacy systems, and adoption of new cloud capabilities. Collaborate with clients to understand business requirements and translate them into optimal cloud architecture solutions, balancing performance, security, and cost. Evaluate and compare cloud services (e.g., Databricks, Snowflake, Synapse Analytics) and recommend the best-fit solutions based on project needs and organizational goals. Lead the full lifecycle of data platform and product implementations, from planning and design to deployment and support. Drive cloud migration initiatives, ensuring a smooth transition from on-premise systems while engaging and upskilling existing teams. Lead and mentor a team of cloud and data engineers, fostering a culture of continuous learning and technical excellence. Plan and guide the team in building proofs of concept (POCs), exploring new cloud capabilities, and validating emerging technologies. Establish and maintain comprehensive documentation for cloud setup processes, architecture decisions, and operational procedures. Work closely with internal and external stakeholders to gather requirements, present solutions, and ensure alignment with business objectives. Ensure all cloud solutions adhere to security best practices, compliance standards, and governance policies. Prepare case studies and share learnings from implementations to build organizational knowledge and improve future projects. Build and analyze data engineering processes and act as an SME to troubleshoot performance issues and suggest solutions to improve them. Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, Maven, etc. Build a test framework for Databricks notebook jobs for automated testing before code deployment (a sketch follows at the end of this posting). Continuously explore new Azure services and capabilities and assess their applicability to business needs. Create detailed documentation for cloud processes, architecture, and implementation patterns.
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring. Identify solutions to non-standard requests and problems. Mentor and support existing on-prem developers in the cloud environment. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Undergraduate degree or equivalent experience. 12+ years of overall experience in data & analytics engineering. 10+ years of solid experience working as an architect designing data platforms using Azure, Databricks, Snowflake, ADF, Data Lake, Synapse Analytics, Power BI, etc. 10+ years of experience working with a data platform or product using PySpark and Spark SQL. In-depth experience designing complex Azure architecture for various business needs and the ability to come up with efficient designs and solutions. Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc. Experience in team leadership and people management. Highly proficient, with hands-on experience in Azure services, Databricks/Snowflake development, etc. Excellent communication and stakeholder management skills. Preferred Qualifications: Snowflake and Airflow experience. Power BI development experience. Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, claims, members, providers, payers, underwriting. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
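A minimal sketch of the automated notebook-job testing described in this posting, written as a pytest using the databricks-sdk; the job ID is a placeholder and credentials are assumed to come from the environment:

```python
# Hedged sketch of a pre-deployment test for a Databricks notebook job.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

def test_notebook_job_succeeds():
    w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN from the env
    run = w.jobs.run_now(job_id=123).result()  # block until the run finishes
    state = run.state.result_state
    assert state == RunResultState.SUCCESS, f"notebook job failed: {state}"
```

In a CI/CD setup like the one described (Jenkins/GitHub Actions), a test of this shape would gate merges so that broken notebooks never reach the production workspace.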
Posted 12 hours ago
7.0 - 12.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions. Primary Responsibilities: Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure (an ingestion sketch follows at the end of this posting). Design and develop Azure Databricks processes using PySpark/Spark SQL. Design and develop orchestration jobs using ADF and Databricks Workflows. Analyze the data engineering processes being developed and act as an SME to troubleshoot performance issues and suggest solutions to improve them. Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc. Build a test framework for Databricks notebook jobs for automated testing before code deployment. Design and build POCs to validate new ideas, tools, and architectures in Azure. Continuously explore new Azure services and capabilities and assess their applicability to business needs. Create detailed documentation for cloud processes, architecture, and implementation patterns. Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud. Prepare case studies and technical write-ups to showcase successful implementations and lessons learned. Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture. Contribute to full lifecycle project implementations, from design and development to deployment and monitoring. Ensure solutions adhere to security, compliance, and governance standards. Monitor and optimize data pipelines and cloud resources for cost and performance efficiency. Identify solutions to non-standard requests and problems. Support and maintain the self-service BI warehouse. Mentor and support existing on-prem developers in the cloud environment. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Undergraduate degree or equivalent experience. 7+ years of overall experience in data & analytics engineering. 5+ years of experience working with Azure, Databricks, ADF, and Data Lake. 5+ years of experience working with a data platform or product using PySpark and Spark SQL. Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc. In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions. Highly proficient in Python and SQL. Proven, excellent communication skills. Preferred Qualifications: Snowflake and Airflow experience. Power BI development experience. Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, claims, members, providers, payers, underwriting. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
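A minimal sketch of the Azure ingestion step referenced above: landing a CSV drop from ADLS into a bronze Delta table on Databricks. The storage account, container, and table names are assumptions:

```python
# Hedged sketch: ingest a raw CSV drop from ADLS into a Delta table.
# Runs in a Databricks notebook where `spark` is provided; names are illustrative.
raw = (
    spark.read.option("header", "true")
    .csv("abfss://landing@lakeacct.dfs.core.windows.net/claims/2024-01-01/")
)

(
    raw.write.format("delta")
    .mode("append")
    .saveAsTable("bronze.claims_raw")  # downstream jobs read from this bronze table
)
```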
Posted 12 hours ago
3.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future. What We Offer: Location: Bangalore, IND. At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive, leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities: Supports the design and development of program methods, processes, and systems to consolidate and analyze structured and unstructured, diverse "big data" sources. Interfaces with internal customers for requirements analysis and compiles data for scheduled or special reports and analysis. Supports project teams in developing analytical models, algorithms and automated processes, applying SQL understanding and Python programming to cleanse, integrate and evaluate large datasets. Supports the timely development of products for manufacturing and process information by applying sophisticated data analytics. Able to quickly understand requirements and turn them into executive-level presentation slides. Participates in the design, development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. used to drive key business decisions. Strong business and financial (P&L) acumen; able to understand key themes, financial terms and data points to create appropriate summaries. Works with the business intelligence manager and other staff to assess various reporting needs. Analyzes reporting needs and requirements, assesses current reporting in the context of strategic goals and devises plans for delivering the most appropriate reporting solutions to users. Qualifications: Bachelor's/Master's degree or equivalent; 7-12 years of experience as a data analyst. Required technical skills in SQL, Azure, Python, Databricks, Tableau (good to have); PowerPoint and Excel expertise. Experience in the supply chain domain. Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline and basic knowledge of related disciplines. Business Expertise: Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate them in the market.
Leadership: Acts as a resource for colleagues with less experience; may lead small projects with manageable risks and resource requirements. Problem Solving: Solves complex problems; takes a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information. Impact: Impacts a range of customer, operational, project or service activities within own team and other related teams; works within broad guidelines and policies. Interpersonal Skills: Explains difficult or sensitive information; works to build consensus. Additional Information: Time Type: Full time. Employee Type: Assignee/Regular. Travel: Yes, 20% of the time. Relocation Eligible: Yes. Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 12 hours ago
8.0 - 13.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future. What We Offer: Location: Bangalore, IND. At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive, leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities: Provide technical support for applications built using .Net as well as Angular, React and other open-source technologies. Troubleshoot and resolve issues related to the front end, APIs and backend services. Collaborate with development teams to understand and resolve technical issues. Assist in the deployment and maintenance of software applications. Ensure the performance, quality, and responsiveness of applications, and apply permanent fixes to critical and recurring issues. Help maintain code quality, organization, and automation. Perform design reviews with the respective development teams for critical applications and provide inputs. Document support processes and solutions for future reference. Stay up to date with the latest industry trends and technologies. Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 8+ years of experience in software development and support. Strong proficiency in .Net, Angular, and React; proficient in Python for backend support. Familiarity with the Hadoop ecosystem as well as Databricks. Experience with RESTful APIs and web services. Solid understanding of front-end technologies, including HTML5, CSS3, and JavaScript, as well as Azure and AWS. Strong background in SQL Server and other relational databases. Familiarity with version control systems (e.g., Git) as well as Atlassian products for software development and code deployment mechanisms/DevOps. Best practices in hosting applications on containerized platforms like OCP (on-prem and cloud). Experience with open-source projects and contributions. Strong problem-solving skills and attention to detail. Excellent communication and teamwork skills.
Certifications in relevant areas, especially Microsoft, will be a plus. Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline; knowledge of the semiconductor industry is nice to have. Interpersonal Skills: Explains difficult or sensitive information; works to build consensus. Additional Information: Time Type: Full time. Employee Type: Assignee/Regular. Travel: Yes, 10% of the time. Relocation Eligible: Yes. Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 12 hours ago
6.0 - 11.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, master's degree, technical diploma or equivalent. At least 8 years of experience in a similar role. At least 5 years of experience on AWS and/or Azure. At least 5 years of experience on Databricks. At least 5 years of experience on multiple Azure and AWS PaaS solutions: Azure Data Factory, MSSQL, Azure Storage, AWS S3, Cognitive Search, Cosmos DB, Event Hub, AWS Glue. Strong knowledge of AWS and Azure architecture design best practices. Knowledge of ITIL and Agile methodologies (certifications are a plus). Experience working with DevOps tools such as Git, CI/CD pipelines, Ansible, Azure DevOps. Knowledge of Airflow and Kubernetes is an added advantage. Solid understanding of networking/security and Linux. Business-fluent English is required. Curious, continuously learning and exploring new approaches/technologies. Able to work under pressure in a multi-vendor, multicultural team. Flexible, agile and adaptive to change. Customer-focused approach. Good communication skills. Analytical mindset. Innovation.
Posted 12 hours ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
A Data Engineer ensuring the smooth functioning of our applications and data systems. Your expertise in data ingestion, release management, monitoring, incident review, Databricks, Azure Cloud, and data analysis will be instrumental in maintaining the reliability, availability, and performance of our applications and data pipelines. You will collaborate closely with cross-functional teams to support application deployments, monitor system health, analyze data, and provide timely resolutions to incidents. The ideal candidate should have a strong background in Azure DevOps, Azure Cloud (especially ADF), Databricks, and AWS Cloud. Key Responsibilities: Implement and manage data ingestion processes to acquire data from various sources and ensure its accuracy and completeness in our systems. Collaborate with development and operations teams to facilitate the release management process, ensuring successful and efficient deployment of application updates and enhancements. Monitor the performance and health of applications and data pipelines, promptly identifying and addressing any anomalies or potential issues. Respond to incidents and service requests in a timely manner, conducting thorough incident reviews to identify root causes and implementing effective solutions to prevent recurrence. Utilize Databricks and monitoring tools to analyze application logs, system metrics, and data to diagnose and troubleshoot issues effectively. Analyze data-related issues, troubleshoot data quality problems, and propose solutions to optimize data workflows. Utilize Azure Cloud services to deploy and manage applications and data infrastructure efficiently. Document incident reports, resolutions, and support procedures for knowledge sharing and future reference. Continuously improve support processes and workflows to enhance efficiency, minimize downtime, and improve the overall reliability of applications and data systems. Stay up to date with the latest technologies and industry best practices related to application support, data analysis, and cloud services. Technical Knowledge (technology, level of expertise, and priority where given):
Must have: Scala; Spark; Azure Cloud (senior level, priority); AWS Cloud; Python; Databricks (senior level, priority); ADF (priority).
Nice to have: RStudio/RConnect (junior level).
Posted 12 hours ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Immediate openings for DotNet Developer - Bangalore - Contract. Skill: DotNet Developer. Notice Period: Immediate. Employment Type: Contract. Job Description: Bachelor's degree in Computer Science, Information Systems, or another relevant subject area, or equivalent experience. 8-10+ years of experience in .NET Framework, .NET Core, ASP.NET, VB.NET, HTML, web services, Web API, SharePoint, Power Automate, Microsoft apps, MySQL, SQL Server. Client and server architecture; maintaining a code base via GitHub would be an added benefit. Robust SQL knowledge, such as complex nested queries, procedures and triggers. Good-to-have skills from a data tools perspective: PySpark, Athena, Databricks, and AWS Redshift technologies to analyse and bring data into a data lake; knowledge of building reports and Power BI. Good knowledge of business processes, preferably knowledge of related modules and strong cross-modular skills incl. interfaces. Expert application and customizing knowledge of the standard software used and other regional solutions in the assigned module. Ability to absorb sophisticated technical information and communicate effectively to both technical and business audiences. Knowledge of applicable data privacy practices and laws.
Posted 12 hours ago
7.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Ahmedabad, Delhi / NCR
Hybrid
Lead Data Engineer (Databricks). Experience: 7-10 years. Salary: Competitive. Preferred Notice Period: Within 30 days. Opportunity Type: Hybrid (Ahmedabad). Placement Type: Permanent. (Note: This is a requirement for one of Uplers' clients.) Must-have skills: Databricks; SQL or Python; ETL tools, data modelling, or data warehousing. Inferenz (one of Uplers' clients) is looking for: About Inferenz: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a greater competitive advantage by leveraging various next-generation technologies. Our technology expertise has helped us deliver innovative solutions in key industries such as Healthcare & Life Sciences, Consumer & Retail, Financial Services and emerging industries. Our main capabilities and solutions: data strategy & architecture, data & cloud migration, data quality & governance, data engineering, predictive analytics, machine learning/artificial intelligence, and generative AI. Specialties: Data and Cloud Strategy, Data Modernization, On-Premise to Cloud Migration, SQL to Snowflake Migration, Hadoop to Snowflake Migration, Cloud Data Platform and Warehouses, Data Engineering and Pipeline, Data Virtualization, Business Intelligence, Data Democratization, Marketing Analytics, Attribution Modelling, Machine Learning, Computer Vision, Natural Language Processing and Augmented Reality. Job Description: Key Responsibilities: Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure. Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions. Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed (a short pipeline sketch follows at the end of this posting). Implement machine learning models and advanced analytics solutions, incorporating generative AI to drive innovation. Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions. Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth. Stay updated on the latest trends and advancements in data engineering, Databricks, generative AI, and Azure Data Factory to continually enhance team capabilities. Required Skills & Qualifications: Bachelor's or master's degree in computer science, information technology, or a related field. 7 to 10 years of experience in data engineering, with a focus on Databricks. Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory/AWS Glue. Proficiency in SQL and programming languages such as Python or Scala. Strong understanding of data modelling, ETL processes, and data warehousing/data lakehouse concepts. Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker. Excellent analytical, problem-solving, and communication skills. Demonstrated leadership ability with experience mentoring and guiding junior team members. Preferred Qualifications: Experience with generative AI technologies and their applications. Familiarity with other cloud platforms, such as AWS or GCP.
Knowledge of data governance frameworks and tools. How to apply for this opportunity (easy 3-step process): 1. Click on Apply and register or log in to our portal. 2. Upload your updated resume and complete the screening form. 3. Increase your chances of getting shortlisted and meet the client for the interview! About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal apart from this one.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
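As a rough illustration of the Databricks-plus-ADF ETL responsibility above, here is a minimal bronze-to-silver transformation an ADF pipeline might invoke as a Databricks job; the table names and quality rule are assumptions, not taken from the posting:

```python
# Hedged sketch of a Databricks ETL step (bronze -> silver) callable from ADF.
# Runs in a Databricks notebook where `spark` is provided; names are illustrative.
from pyspark.sql.functions import col, to_date

bronze = spark.table("bronze.orders_raw")

silver = (
    bronze.dropDuplicates(["order_id"])
    .withColumn("order_date", to_date(col("order_ts")))
    .where(col("amount") > 0)  # basic data-quality gate
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```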
Posted 13 hours ago
4.0 - 8.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description: We are seeking a highly skilled and motivated Data Scientist with 5+ years of experience to join our team. The ideal candidate will bring strong data science, programming, and data engineering expertise, along with hands-on experience in generative AI, large language models, and modern LLM application frameworks. This role also demands excellent communication and stakeholder management skills to collaborate effectively across business units. Responsibilities: Deliver impactful, data-driven solutions spanning data science, data engineering, and generative AI, collaborating with stakeholders across business units, as described above. Skills, must have: Experience: 5+ years of industry experience as a Data Scientist, with a proven track record of delivering impactful, data-driven solutions. Programming Skills: Advanced proficiency in Python, with extensive experience writing clean, efficient, and maintainable code; proficiency with version control tools such as Git. Data Engineering: Strong working proficiency with SQL and distributed computing with Apache Spark. Cloud Platforms: Experience building and deploying apps on Azure Cloud. Generative AI & LLMs: Practical experience with large language models (e.g., OpenAI, Anthropic, HuggingFace); knowledge of Retrieval-Augmented Generation (RAG) techniques and prompt engineering is expected. Machine Learning & Modeling: Strong grasp of statistical modeling, machine learning algorithms, and tools like scikit-learn, XGBoost, etc. Stakeholder Engagement: Excellent communication skills with a demonstrated ability to interact with business stakeholders, understand their needs, present technical insights clearly, and drive alignment across teams. Tools and Libraries: Proficiency with libraries like Pandas and NumPy, and ML lifecycle tools such as MLflow (a tracking sketch follows below). Team Collaboration: Proven experience contributing to agile teams and working cross-functionally in fast-paced environments. Nice to have: Hands-on experience with Databricks and Snowflake. Hands-on experience building LLM-based applications using agentic frameworks like LangChain, LangGraph, and AutoGen. Familiarity with data visualization platforms such as Power BI, Tableau, or Plotly. Front-end/full-stack development experience. Exposure to MLOps practices and model deployment pipelines in production. Other: Languages: English, C2 Proficient. Seniority: Regular.
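A minimal sketch of MLflow tracking around a scikit-learn model, illustrating the ML lifecycle tooling listed above; the experiment name, dataset, and model choice are assumptions:

```python
# Hedged sketch of MLflow experiment tracking (illustrative names).
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-poc")
with mlflow.start_run():
    model = GradientBoostingClassifier(n_estimators=200, max_depth=3)
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("auc", auc)  # comparable across runs in the MLflow UI
    mlflow.sklearn.log_model(model, "model")
```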
Posted 13 hours ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description: Luxoft has been asked to contract a developer in support of a number of customer initiatives. The primary objective is to develop based on client requirements in the telecom/network environment. Responsibilities: A Data Engineer with experience in the following technologies: Databricks and Azure; Apache Spark-based work with hands-on Python, SQL, and Apache Airflow; Databricks clusters for ETL processes; integration with ADLS and Blob Storage; efficient data ingestion from various sources, including on-premises databases, cloud storage, APIs, and streaming data; use of Azure Key Vault for managing secrets; hands-on experience working with APIs; hands-on Kafka/Azure Event Hub streaming experience (a streaming sketch follows below); hands-on experience with Databricks Delta APIs and the Unity Catalog (UC); hands-on experience working with version control tools (GitHub). Data Analytics: supports various ML frameworks; integration with Databricks for model training. On-prem: exposure to Linux-based systems and Unix scripting. Skills, must have: Python, Apache Airflow, Microsoft Azure and Databricks, SQL, Databricks clusters for ETL, ADLS, Blob Storage, ingestion from various sources including databases, cloud storage, APIs and streaming data, Kafka/Azure Event Hub, Databricks Delta APIs and UC catalog. Education: Typically, a Bachelor's degree in Computer Science (preferably an M.Sc. in Computer Science), Software Engineering, or a related field is required. Experience: 7+ years of experience in development or related fields. Problem-Solving Skills: Ability to troubleshoot and resolve issues related to application development and deployment. Communication Skills: Ability to effectively communicate technical concepts to team members and stakeholders, both written and verbal. Teamwork: Ability to work effectively in teams with diverse individuals and skill sets. Continuous Learning: Given the rapidly evolving nature of web technologies, a commitment to learning and adapting to new technologies and methodologies is crucial. Nice to have: Snowflake, PostgreSQL, and Redis exposure. GenAI exposure. Good understanding of RBAC. Other: Languages: English, C2 Proficient. Seniority: Senior.
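A minimal Structured Streaming sketch for the Kafka/Event Hub-to-Delta flow described above; the brokers, topic, checkpoint path, and target table are placeholders:

```python
# Hedged sketch: stream events from Kafka (or Event Hubs' Kafka endpoint) into Delta.
# Runs in a Databricks notebook where `spark` is provided; names are illustrative.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9093")
    .option("subscribe", "network-events")
    .option("startingOffsets", "latest")
    .load()
)

parsed = events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

(
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/chk/network-events")  # exactly-once bookkeeping
    .trigger(processingTime="1 minute")
    .toTable("bronze.network_events")
)
```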
Posted 13 hours ago
3.0 - 6.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description: A DevOps Support Engineer will perform tasks related to data pipeline work, plus monitoring and support related to job execution, data movement, and on-call support. In addition, deployed pipeline implementations will be tested for production validation. Responsibilities: Provide production support, including first-tier, after-hours, and on-call support. The candidate will eventually develop into more data engineering work within the Network Operations team, learning the telecommunications domain while also developing data engineering skills. Skills, must have: ETL pipelines, data engineering, data movement/monitoring; Azure Databricks; Watchtower; automation tools; testing. Nice to have: data engineering. Other: Languages: English, C2 Proficient. Seniority: Regular.
Posted 13 hours ago
5.0 - 9.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Project description: Luxoft, a DXC Technology Company, is an established company focusing on consulting and implementation of complex projects in the financial industry. At the interface between technology and business, we convince with our know-how, well-founded methodology and pleasure in success. As a reliable partner to our renowned customers, we support them in planning, designing and implementing the desired innovations. Together with the customer, we deliver top performance! For one of our clients in the insurance segment we are searching for a Senior Data Scientist with a Databricks and predictive analytics focus. Responsibilities: Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML (a model-tuning sketch follows below). Build end-to-end ML pipelines (data ingestion → feature engineering → model training → deployment) on the Databricks Lakehouse. Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking. Collaborate with engineering teams to operationalize models (batch/real-time) using Databricks Jobs or REST APIs. Implement Delta Lake for scalable, ACID-compliant data workflows. Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions. Troubleshoot issues in Spark jobs and the Databricks environment. The client is in the USA; the candidate should be able to work until 11:00 am EST to overlap a few hours with the client and attend meetings. Skills, must have: 5+ years in predictive analytics, with expertise in regression, classification, and time-series modeling. Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark. Familiarity with MLflow, Feature Store, and Unity Catalog for governance. Industry experience in life insurance or P&C. Skills: Python, PySpark, MLflow, Databricks AutoML; predictive modelling (classification, clustering, regression, time series and NLP); cloud platform (Azure/AWS), Delta Lake, Unity Catalog. Nice to have: Certifications: Databricks Certified ML Practitioner. Other: Languages: English, C1 Advanced. Seniority: Senior.
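A minimal sketch of a churn-style classifier with hyperparameter tuning in Spark MLlib, in the spirit of the predictive-modeling responsibilities above; the feature table and column names are invented for illustration:

```python
# Hedged sketch: Spark MLlib pipeline with cross-validated hyperparameter tuning.
# Runs in a Databricks notebook where `spark` is provided; names are illustrative.
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.tuning import CrossValidator, ParamGridBuilder

df = spark.table("silver.policy_features")  # label column: churned (0/1)

assembler = VectorAssembler(
    inputCols=["tenure_months", "premium", "claims_count"], outputCol="features"
)
lr = LogisticRegression(labelCol="churned", featuresCol="features")

grid = ParamGridBuilder().addGrid(lr.regParam, [0.01, 0.1]).build()
cv = CrossValidator(
    estimator=Pipeline(stages=[assembler, lr]),
    estimatorParamMaps=grid,
    evaluator=BinaryClassificationEvaluator(labelCol="churned"),
    numFolds=3,
)
model = cv.fit(df)  # best model selected by area under ROC
```

With MLflow autologging enabled on Databricks, each cross-validation run of a pipeline like this is tracked automatically, which is how the tuning and tracking duties above typically fit together.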
Posted 13 hours ago
6.0 - 11.0 years
8 - 14 Lacs
Pune
Work from Office
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Databricks, PySpark, Spark SQL, and Delta Live Tables.
- Collaborate with cross-functional teams to understand data requirements and translate them into efficient data models and pipelines.
- Implement best practices for data engineering, including data quality and data security.
- Optimize and troubleshoot complex data workflows to ensure high performance and reliability.
- Develop and maintain documentation for data engineering processes and solutions.
Requirements:
- Bachelor's or Master's degree.
- Proven experience as a Data Engineer, with a focus on Databricks, PySpark, Spark SQL, and Delta Live Tables.
- Strong understanding of data warehousing concepts, ETL processes, and data modelling.
- Proficiency in programming languages such as Python and SQL.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong leadership and communication skills, with the ability to mentor and guide team members.
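For context, a minimal Delta Live Tables sketch of the bronze/silver pattern this role works with; the paths and table names are placeholders, and the dlt module is only available inside a Databricks DLT pipeline:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from cloud storage via Auto Loader")
def bronze_orders():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("abfss://landing@mystorageacct.dfs.core.windows.net/orders/")  # placeholder path
    )

@dlt.table(comment="Cleaned orders with a basic data quality expectation")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # drop rows failing the check
def silver_orders():
    return dlt.read_stream("bronze_orders").withColumn(
        "ingested_at", F.current_timestamp()
    )
```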
Posted 14 hours ago
6.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
We are seeking a skilled and experienced Cognos and Informatica Administrator to join our team. You will be responsible for the installation, configuration, maintenance, and support of Cognos and Informatica software in our organization. Your role will involve collaborating with cross-functional teams, resolving system issues, and ensuring the smooth functioning of the Cognos and Informatica environments.

Role Scope Deliverables:
Responsibilities:
- Install, configure, and upgrade Cognos and Informatica application components, including servers, clients, and related tools.
- Monitor and maintain the performance, availability, and security of Cognos and Informatica environments.
- Collaborate with developers, business analysts, and other stakeholders to understand requirements and provide technical guidance.
- Troubleshoot and resolve issues related to Cognos and Informatica applications, databases, servers, and integrations.
- Perform system backups, disaster recovery planning, and implementation.
- Implement and enforce best practices for Cognos and Informatica administration, security, and performance tuning.
- Manage user access, roles, and permissions within Cognos and Informatica environments.
- Coordinate with vendors for product support, patches, upgrades, and license management.
- Stay up to date with the latest trends and advancements in Cognos and Informatica technologies.
- Document technical processes, procedures, and configurations.
Nice-to-Have Skills:
- Development Skills: Familiarity with Cognos Report Studio, Framework Manager, Informatica PowerCenter, and other development tools to assist in troubleshooting and providing guidance to developers and users.
- Power BI Experience: Practiced in designing and building dashboards in Power BI, or Power BI administration experience.
- Microsoft SQL Server Analysis Services (SSAS) Experience: Install, configure, and maintain SSAS environments; proven knowledge as an SSAS administrator.
- Databricks Experience: Knowledge of Databricks and a strong understanding of its architecture, capabilities, and best practices.
Key Skills:
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Cognos and Informatica Administrator or similar role.
- Solid understanding of Cognos and Informatica installation, configuration, and administration.
- Familiarity with relational databases, SQL, and data warehousing concepts.
- Excellent troubleshooting and problem-solving skills.
- Ability to work independently and collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Attention to detail and ability to prioritize tasks effectively.
Posted 14 hours ago
5.0 - 10.0 years
15 - 19 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
S&C Global Network - AI - Prompt Engineering - DS Insurance - Consultant
Entity: Accenture Strategy & Consulting
Team: Global Network Data & AI
Practice: Insurance Analytics
Title: Ind & Func AI Decision Science Consultant
Job location: Bangalore/Gurgaon/Mumbai/Hyderabad/Pune/Chennai

About S&C - Global Network:
The Accenture Global Network - Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.
The S&C - GN - Insurance Data & AI Practice works with Property & Casualty insurers, Life & Retirement insurers, reinsurers, and brokers across the value chain, from underwriting to claims to servicing and enterprise functions, to develop analytic capabilities ranging from accessing and reporting on data to predictive modelling to Generative AI. We offer deep technical expertise in AI/ML tools, techniques, and methods, along with strong strategy and consulting acumen and insurance domain knowledge. Our unique assets and accelerators, coupled with diverse insurance insights and capabilities, help us bring exceptional value to our clients.

What's in it for you?
Accenture Global Network is a unified powerhouse that combines the capabilities of Strategy & Consulting with the force multipliers of Data and Artificial Intelligence. It is central to Accenture's future growth, and Accenture is deeply invested in providing individuals with continuous opportunities for learning and growth.

What you would do in this role
- Design, create, validate, and refine prompts for Large Language Models (LLMs) for different client problems
- Employ techniques to guide and enhance model responses
- Develop effective AI interactions through proficient programming and utilization of playgrounds
- Utilize AI tools and cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality
- Interface with clients/account teams to understand engineering/business problems and translate them into analytics problems that deliver insights for action and operational improvements
- Consume data from multiple sources and present relevant information in a crisp and digestible manner that delivers valuable insights to both technical and non-technical audiences
- Mentor junior prompt engineers in both the technical and softer aspects of the role

Qualification
Who we are looking for
- 5+ years of experience in data-driven techniques, including exploratory data analysis, data pre-processing, and machine learning to solve business problems
- Bachelor's/Master's degree in Mathematics, Statistics, Economics, Computer Science, or a related field
- Solid foundation in statistical modeling, machine learning algorithms, GenAI, LLMs, RAG architecture, and LangChain frameworks
- Proficiency in programming languages such as Python, PySpark, SQL, or Scala
- Strong communication and presentation skills to effectively convey complex data insights and recommendations to clients and stakeholders
- In-depth knowledge and hands-on experience with Azure, AWS, or Databricks tools; relevant certifications in Azure are highly desirable
- Prior insurance industry experience is preferred
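By way of example, a hedged sketch of a RAG-style grounded prompt, the kind of artifact this role designs and refines; the retrieved passages, client setup, and model name are placeholder assumptions:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def build_prompt(question: str, passages: list[str]) -> str:
    # Number the retrieved passages so the model can cite them
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the numbered context below. "
        "Cite passage numbers, and reply 'not found' if the context is insufficient.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

passages = ["A policy lapses after 30 days of non-payment of premium."]  # stand-in for retrieval
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": build_prompt("When does a policy lapse?", passages)}],
)
print(response.choices[0].message.content)
```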
Posted 15 hours ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Requirements:
- 6+ years of experience in data engineering projects using Cosmos DB and Azure Databricks (minimum 3-5 projects)
- Strong expertise in building data engineering solutions using Azure Databricks and Cosmos DB
- Strong T-SQL programming skills, or skills in any other flavor of SQL
- Experience working with high-volume data, large objects, and complex data transformations
- Experience working in DevOps environments integrated with Git for version control and CI/CD pipelines
- Good understanding of data modelling for data warehouses and data marts
- Strong verbal and written communication skills
- Ability to learn, contribute, and grow in a fast-paced environment
Nice to have:
- Expertise in Microsoft Azure, including components like Azure Data Factory, ADLS Gen2, and Azure Event Hubs
- Experience using Jira and ServiceNow in project environments
- Experience implementing data warehouse and ETL solutions
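For illustration, a minimal sketch of reading from Cosmos DB into a Delta table on Azure Databricks, assuming the Cosmos DB Spark 3 OLTP connector is installed on the cluster; the account endpoint, secret scope, database, container, and table names are placeholders:

```python
# Sketch only: pull the account key from a secret scope rather than hard-coding it
cosmos_cfg = {
    "spark.cosmos.accountEndpoint": "https://myaccount.documents.azure.com:443/",
    "spark.cosmos.accountKey": dbutils.secrets.get("kv-scope", "cosmos-key"),
    "spark.cosmos.database": "sales",
    "spark.cosmos.container": "orders",
}

orders_df = spark.read.format("cosmos.oltp").options(**cosmos_cfg).load()
orders_df.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")
```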
Posted 15 hours ago
The Databricks job market in India is flourishing, with high demand for professionals skilled in Databricks technology. Companies across various industries are leveraging Databricks to manage and analyze their data effectively. Job seekers with expertise in Databricks can explore a multitude of exciting career opportunities in India.
Here are the top 5 major cities actively hiring for Databricks roles in India:
- Bangalore
- Pune
- Hyderabad
- Chennai
- Mumbai
The average salary range for Databricks professionals in India varies by experience level. Entry-level positions can expect a salary ranging from INR 4-6 lakhs per annum, while experienced professionals can earn up to INR 15-20 lakhs per annum.
A typical career progression in Databricks may include roles such as Junior Developer, Senior Developer, and Tech Lead, eventually leading to roles like Data Engineer, Data Architect, or Data Scientist.
In addition to expertise in Databricks, professionals in this field are often expected to have skills in:
- Apache Spark
- Python
- SQL
- Data warehousing
- Data visualization tools
As you embark on your journey to explore Databricks jobs in India, remember to equip yourself with the necessary skills and knowledge to stand out in the competitive job market. Prepare diligently, showcase your expertise confidently, and seize the exciting opportunities that await you in the realm of Databricks. Good luck!