6.0 - 11.0 years
4 - 7 Lacs
Gurugram
Work from Office
At least 6-8 years of experience in ETL testing with automation testing; expert in database testing using SQL. Must have worked on Databricks and be aware of Databricks-related concepts. Check data source locations and formats, perform data counts, and verify that columns and data types meet requirements. Test the accuracy and completeness of the data. Identify key ETL mapping scenarios and create SQL queries that simulate them. Should be able to develop and execute test plans, test cases, and test scripts. Experience writing complex SQL queries and validating Enterprise Data Warehouse applications. Understanding of data models, ETL architecture, and Data Warehouse concepts. Must have worked in an Agile methodology. Good to have: exposure to PySpark.
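The validation steps this posting describes (source-vs-target row counts, column and data-type checks, completeness) can be sketched with plain SQL. Below is a minimal, hedged illustration using Python's built-in sqlite3 as a stand-in for a real warehouse; the table names `src_orders` and `tgt_orders` are hypothetical, not from the posting:

```python
import sqlite3

# In-memory stand-in for a source system and a warehouse target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, region TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL, region TEXT);
    INSERT INTO src_orders VALUES (1, 10.5, 'N'), (2, 20.0, 'S'), (3, 7.25, 'E');
    INSERT INTO tgt_orders SELECT * FROM src_orders;  -- the "ETL" under test
""")

# 1. Row-count reconciliation: source and target must match.
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# 2. Column and data-type check against the expected schema.
expected = [("order_id", "INTEGER"), ("amount", "REAL"), ("region", "TEXT")]
actual = [(row[1], row[2]) for row in conn.execute("PRAGMA table_info(tgt_orders)")]
assert actual == expected, f"schema drift: {actual}"

# 3. Completeness: no source rows missing from the target.
missing = conn.execute("""
    SELECT COUNT(*) FROM src_orders s
    LEFT JOIN tgt_orders t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL
""").fetchone()[0]
assert missing == 0, f"{missing} rows were dropped by the load"

print("all ETL checks passed")
```

On Databricks the same queries would typically run through `spark.sql` or a JDBC connection; only the connection layer changes, not the reconciliation logic.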
Posted 1 day ago
6.0 - 8.0 years
8 - 10 Lacs
Gurugram
Work from Office
At least 6-8 years of experience in ETL testing with automation testing; expert in database testing using SQL. Must have worked on Databricks and be aware of Databricks-related concepts. Check data source locations and formats, perform data counts, and verify that columns and data types meet requirements. Test the accuracy and completeness of the data. Identify key ETL mapping scenarios and create SQL queries that simulate them. Should be able to develop and execute test plans, test cases, and test scripts. Experience writing complex SQL queries and validating Enterprise Data Warehouse applications. Understanding of data models, ETL architecture, and Data Warehouse concepts. Must have worked in an Agile methodology. Good to have: exposure to PySpark.
Posted 1 day ago
4.0 - 5.0 years
6 - 7 Lacs
Bengaluru
Work from Office
Develop and optimize data pipelines using Databricks and PySpark. Process large-scale data for analytics and reporting. Implement best practices for ETL and data warehousing.
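As a rough illustration of the pipeline shape this role involves, extract, transform, and load can be built as composable steps. The sketch below uses plain Python lists in place of Spark DataFrames (the sample data and field names are invented for illustration); in Databricks the same flow would be expressed with PySpark transformations such as `filter` and `groupBy().agg()`:

```python
# Minimal ETL pipeline sketch: each stage is a pure function, so stages
# can be tested in isolation, much like PySpark transformations.
def extract():
    # Stand-in for reading from cloud storage or a Delta table.
    return [
        {"user": "a", "amount": 120, "valid": True},
        {"user": "b", "amount": -5, "valid": False},
        {"user": "a", "amount": 80, "valid": True},
    ]

def transform(rows):
    # Filter bad records, then aggregate per user.
    clean = [r for r in rows if r["valid"] and r["amount"] > 0]
    totals = {}
    for r in clean:
        totals[r["user"]] = totals.get(r["user"], 0) + r["amount"]
    return totals

def load(totals):
    # Stand-in for writing to a warehouse table.
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('a', 200)]
```

Keeping each stage free of I/O side effects is what makes pipelines like this easy to optimize and re-run, which is the core of the work described above.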
Posted 1 day ago
10.0 - 17.0 years
12 - 22 Lacs
Gurugram
Work from Office
We know the importance that food plays in people's lives: the power it has to bring people, families, and communities together. Our purpose is to bring enjoyment to people's lives through great tasting food, in a way which reflects our values. McCain has recently committed to implementing regenerative agriculture practices across 100 percent of our potato acreage by 2030. Ask us more about our commitment to sustainability.

OVERVIEW: McCain is embarking on a digital transformation. As part of this transformation, we are making significant investments in our data platforms, common data models, data structures, and data policies to increase the quality of our data and the confidence of our business teams to use this data to make better decisions and drive value. We have a new and ambitious global Digital & Data group, which serves as a resource to the business teams in our regions and global functions. We are currently recruiting an experienced Data Architect to build the enterprise data model for McCain.

JOB PURPOSE: Reporting to the Data Architect Lead, the Global Data Architect will take a lead role in creating the enterprise data model for McCain Foods, bringing together data assets across agriculture, manufacturing, supply chain, and commercial. This data model will be the foundation for our analytics program, which seeks to bring together McCain's industry-leading operational data sets with 3rd-party data sets to drive world-class analytics. Working with a diverse team of data governance experts, data integration architects, data engineers, and our analytics team including data scientists, you will play a key role in creating the conceptual, logical, and physical data models that underpin the Global Digital & Data team's activities.
JOB RESPONSIBILITIES:
- Develop an understanding of McCain's key data assets and work with the data governance team to document key data sets in our enterprise data catalog.
- Work with business stakeholders to build a conceptual business model by understanding the end-to-end business process, challenges, and future business plans.
- Collaborate with application architects to bring an analytics point of view to the design of end-user applications.
- Develop a logical data model based on the business model and align it with business teams.
- Work with technical teams to build the physical data model and data lineage, and keep all relevant documentation current.
- Develop a process to manage all models, with appropriate controls.
- With a use-case-driven approach, enhance and expand the enterprise data model based on legacy on-premises analytics products and new cloud data products, including advanced analytics models.
- Design key enterprise conformed dimensions and ensure understanding across data engineering teams (including third parties); keep the data catalog and wiki tools current.
- Act as the primary point of contact for new Digital and IT programs, ensuring alignment with the enterprise data model.
- Be a key player in shaping McCain's cloud migration strategy, enabling advanced analytics and world-leading Business Intelligence.
- Work in close collaboration with data engineers, ensuring data modeling best practices are followed.

MEASURES OF SUCCESS:
- Demonstrated history of driving change in a large, global organization.
- A true passion for well-structured and well-governed data; you know, and can explain to others, the real business risk of too many mapping tables.
- You live for a well-designed, well-structured conformed dimension table.
- Focus on use-case-driven prioritization; you are comfortable pushing business teams for requirements that connect to business value, and able to challenge requirements that will not achieve the business's goals.
- Developing data models that are not just elegant, but truly optimized for analytics: both advanced analytics use cases and dashboarding/BI tools.
- A coaching mindset wherever you go, including with the business, data engineers, and other architects.
- An infectious enthusiasm for learning: about our business, deepening your technical knowledge, and meeting our teams.
- A "get things done" attitude: roll up your sleeves when necessary; work with and through others as needed.

KEY QUALIFICATIONS & EXPERIENCES:
Data Design and Governance
- At least 5 years of experience with data modeling to support business processes.
- Ability to design complex data models that connect internal and external data.
- Nice to have: ability to profile data for data quality requirements.
- At least 8 years of experience with requirements analysis; experience working with business stakeholders on data design.
- Experience working with real-time data.
- Nice to have: experience with data catalog tools.
- Ability to draft accurate documentation that supports project management and coding.
Technical Skills
- At least 5 years of experience designing and building data models in Data Warehouse solutions; S/4HANA knowledge preferred.
- At least 2 years of experience with visualization tools, preferably Power BI or similar.
- At least 2 years designing and working in cloud Data Warehouse solutions; preference for Azure Databricks, Azure Synapse, or earlier Microsoft solutions.
- Experience with Visio, PowerDesigner, or similar data modeling tools.
- Nice to have: experience with data profiling tools such as Informatica, Collibra, or similar data quality tools.
- Nice to have: working experience with MDX.
- Experience working in an Azure cloud environment or a similar cloud environment.
- Must have: ability to develop SQL queries for assessing, manipulating, and accessing data stored in relational databases; hands-on experience in PySpark and Python.
- Nice to have: ability to understand and work with unstructured data.
- Nice to have: at least one successful enterprise-wide cloud migration as the data architect or data modeler, mainly focused on building data models.
- Nice to have: experience with Manufacturing / Digital Manufacturing.
- Nice to have: experience designing enterprise data models for analytics, specifically in a Power BI environment.
- Nice to have: experience with machine learning model design (Python preferred).
Behaviors and Attitudes
- Comfortable working with ambiguity and defining a way forward.
- Experience challenging current ways of working.
- A documented history of successfully driving projects to completion.
- Excellent interpersonal and communication skills.
- Attention to detail.
- Comfortable leading others through change.
Posted 1 day ago
5.0 - 8.0 years
16 - 30 Lacs
Kolkata
Hybrid
Data Modeler - Hybrid Data Environments

Job Summary: We are in search of an experienced Data Modeler who possesses a deep understanding of traditional data stores such as SQL Server and Oracle DB, as well as proficiency in Azure/Databricks cloud environments. The ideal candidate will be adept at comprehending business processes and deriving methods to define analytical data models that support enterprise-level analytics, insights generation, and operational reporting.

Key Responsibilities:
- Collaborate with business analysts and stakeholders to understand business processes and requirements, translating them into data modeling solutions.
- Design and develop logical and physical data models that effectively capture the granularity of data necessary for analytical and reporting purposes.
- Migrate and optimize existing data models from traditional on-premises data stores to Azure/Databricks cloud environments, ensuring scalability and performance.
- Establish data modeling standards and best practices to maintain the integrity and consistency of the data architecture.
- Work closely with data engineers and BI developers to ensure that the data models support the needs of analytical and operational reporting.
- Conduct data profiling and analysis to understand data sources, relationships, and quality, informing the data modeling process.
- Continuously evaluate and refine data models to accommodate evolving business needs and to leverage new data modeling techniques and cloud capabilities.
- Document data models, including entity-relationship diagrams, data dictionaries, and metadata, to provide clear guidance for development and maintenance.
- Provide expertise in data modeling and data architecture to support the development of data governance policies and procedures.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- Minimum of 5 years of experience in data modeling, with a strong background in both traditional RDBMS and modern cloud-based data platforms.
- Proficiency in SQL and experience with data modeling tools (e.g., ER/Studio, ERwin, PowerDesigner).
- Familiarity with Azure cloud services, Databricks, and other big data technologies.
- Understanding of data warehousing concepts, including dimensional modeling, star schemas, and snowflake schemas.
- Ability to translate complex business requirements into effective data models that support analytical and reporting functions.
- Strong analytical skills and attention to detail.
- Excellent communication and collaboration abilities, with the capacity to engage with both technical and non-technical stakeholders.
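For readers unfamiliar with the dimensional-modeling terms in this posting: a star schema puts measures in a central fact table keyed to descriptive dimension tables. Here is a minimal sketch using Python's built-in sqlite3; the table and column names are illustrative inventions, not from the posting:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: descriptive attributes, one row per member.
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);

    -- Fact table: measures plus foreign keys to each dimension (the "star").
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue REAL
    );

    INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
    INSERT INTO dim_product VALUES (1, 'widget', 'hardware'), (2, 'license', 'software');
    INSERT INTO fact_sales VALUES
        (20240101, 1, 100.0), (20240101, 2, 250.0), (20240201, 1, 50.0);
""")

# Typical analytical query: slice a measure by dimension attributes.
rows = conn.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('hardware', 150.0), ('software', 250.0)]
```

A snowflake schema is the same idea with the dimensions themselves normalized into sub-tables; the migration work described above is largely about carrying such models onto Azure/Databricks without losing this analytical shape.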
Posted 1 day ago
6.0 - 10.0 years
14 - 19 Lacs
Coimbatore
Work from Office
We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP, who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native AI/ML-enabled data platforms. You'll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development.

Preferred Qualifications:
- GCP Professional Certifications: Data Engineer or Machine Learning Engineer.
- Experience contributing to a GCP Partner specialization journey.
- Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools.
- Knowledge of data privacy, model explainability, and AI governance is a plus.

Work Location: Remote

Key Responsibilities:
Data & AI/ML Architecture
- Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage.
- Lead the development of ML pipelines, from feature engineering to model training and deployment, using Vertex AI, AI Platform, and Kubeflow Pipelines.
- Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry.
- Define and implement data governance, lineage, monitoring, and quality frameworks.
Google Cloud Partner Enablement
- Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions.
- Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP.
- Contribute to building repeatable solution accelerators in Data & AI/ML.
- Work with the leadership team to align with Google Cloud Partner Program metrics.
Team Development
- Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning.
- Organize and lead internal GCP AI/ML enablement sessions.
- Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements.

What We Offer:
- Best-in-class packages.
- Paid holidays and flexible time-off policies.
- Casual dress code and a flexible working environment.
- Opportunities for professional development in an engaging, fast-paced environment.
- Medical insurance covering self and family up to 4 lakhs per person.
- Diverse and multicultural work environment.
Posted 1 day ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Greetings from tsworks Technologies India Pvt. Ltd. We are hiring for Sr. Data Engineer / Lead Data Engineer; if you are interested, please share your CV to mohan.kumar@tsworks.io

About This Role
tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services Team. You will get hands-on experience with projects employing industry-leading technologies. This would initially be focused on the operational readiness and maintenance of existing applications and would transition into a build-and-maintenance role in the long run.

Position: Senior Data Engineer / Lead Data Engineer
Experience: 5 to 11 Years
Location: Bangalore, India / Remote

Mandatory Required Qualifications
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Expertise in DevOps and CI/CD implementation
- Excellent communication skills

Skills & Knowledge
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 5 to 10 years of experience in Information Technology, designing, developing, and executing solutions.
- 3+ years of hands-on experience designing and executing data solutions on Azure cloud platforms as a Data Engineer.
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Familiarity with the Snowflake data platform is good to have.
- Hands-on experience in data modelling and in batch and real-time pipelines using Python, Java, or JavaScript; experience working with RESTful APIs is required.
- Expertise in DevOps and CI/CD implementation.
- Hands-on experience with SQL and NoSQL databases.
- Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems.
- Experience with data modelling concepts and practices.
- Familiarity with data quality, governance, and security best practices.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Familiarity with machine learning concepts and integration of ML pipelines into data workflows.
- Hands-on experience working in an Agile setting.
- Self-driven, naturally curious, and able to adapt to a fast-paced work environment.
- Can articulate, create, and maintain technical and non-technical documentation.
- Public cloud certifications are desired.
Posted 1 day ago
8.0 - 10.0 years
8 - 12 Lacs
Chennai, Gurugram
Work from Office
Immediate Openings: DevOps Azure (Gurgaon)
Skill: DevOps Azure
Notice Period: Immediate
Employment Type: Contract
Job Posting Title: Software Development Specialist (DevOps)
Top 3 skills: Microsoft Azure / DevOps / .NET
Work location: Chennai / Bangalore
Shift timings: Involves multiple shifts
Experience: 7+ years

Qualifications:
- DevOps skill set with roughly 8-10 years of experience; able to manage and provide technical guidance to the current team
- Skilled on the Microsoft Azure cloud; ideally Azure Fundamentals certified, or a Computer Science / Information Systems Management degree
- Familiar with PaaS and IaaS: VMs, Storage, EventHub, Service Fabric Cluster (SFC), Azure Kubernetes Service (AKS), CosmosDB, SQL Server, IoT Hub, Databricks, KeyVault, Data Lake
- Understands Internet of Things (IoT) concepts: telemetry, ingestion, processing, data storage, reporting
- Familiarity with tools: Octopus, Bamboo, Terraform, Azure DevOps, Jenkins, GitHub, Ansible (Chef and Puppet would be acceptable as well)
- Familiarity with a container orchestration platform (e.g., Kubernetes)
- Some experience with PowerShell, Bash, Python
- Understands the difference between NoSQL and SQL databases, and how to maintain them
- Understanding of monitoring and logging systems (ELK, Prometheus, Nagios, Zabbix, etc.)
- Independent thinker: why does it break, and what can I proactively do to fix it?
- Strong English communication (written and oral) skills required

Release Management:
- Release management of new software via tools
- Understand the release management SOP: QA - Load Test - Stage Environment - PROD
- Create/manage monitoring and alerting systems as needed to meet SLAs
- Comfortable with both Linux and Windows administration
- Work in agile teams; build, test, and maintain aspects of the CI/CD pipeline
- Evangelize Ops best practices with Engineering, Security, and cross-functional teams
- Firmware release: OTA (over the air)
- Launch new mobile apps / release new versions of existing mobile apps (App Store / Play Store)
- Participate in RCCAs when needed
- Maintain documentation and best practices (wiki and runbooks)
- Work with teams to set up standard alerts that can be placed in ARMs and CI/CD
- Support product NPI onboarding
- Metric gathering on usage and distribution of that data
- Migrate services from one platform or service provider to another
- Participate in early phases of NPI sprints when architecture tech runways are defined
- Take part in tech bridges to support troubleshooting efforts when necessary
- Perform periodic audits to ensure no security issues and that only required team members retain access (user access cleanup)
- Support continuous delivery of programs in which patches, new versions, and bug fixes are frequently deployed to end users without sacrificing stability or reliability
- Support on-call during off-hours crises

Incident Management / Alerting / Monitoring:
- Responsible for Tier 2 support, including end-to-end ownership of incidents for connected devices from the time they enter the service line through closure
- Responsible for 24x7 Major Incident Management support
- Respond to, resolve, and escalate tickets in a timely manner
- Implement corrective actions needed to mitigate security risks
- Ensure all tickets requiring follow-up work and/or calls are resolved (end-to-end incident resolution support)
- Ensure all components are within monitoring purview
Posted 1 day ago
5.0 - 10.0 years
8 - 15 Lacs
Hyderabad, Bengaluru
Hybrid
Role & responsibilities: Data Engineer / Designer with 4-6 years of experience
• Databricks Certified Data Engineer Associate
• Must have expert SQL development experience (at least 5 years)
• Must be able to perform data analysis
• Must be able to perform SQL script analysis to understand the business logic
• Must be able to perform Unix jobs analysis
• Must have good hands-on experience writing PL/SQL
• Must have good hands-on experience with UNIX shell scripting
• Exposure to Control-M scheduling
• Exposure to testing, and should be able to perform testing efficiently
• Should have experience with Git and Jenkins
• Experience using GitHub, Atlassian Jira/Confluence, Jenkins (or similar CI tools)
• Banking domain experience (3-5 years)
• Experience deploying/managing data and Data Warehouse solutions
• Good to have: data engineering technologies (e.g., Spark, Hadoop, Kafka)
• Data warehousing (e.g., SQL, OLTP/OLAP/DSS)
• Understanding of solution development life cycles
Collaborative and persuasive, self-motivated, research-oriented, hands-on, committed, an always-on learner. A high-performing and diverse team with an unrivalled culture of innovation.
Posted 1 day ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Employment Type: Contract
Skills: Azure Data Factory, SQL, Azure Blob, Azure Logic Apps
Posted 1 day ago
6.0 - 8.0 years
10 - 14 Lacs
Mumbai, Hyderabad, Bengaluru
Work from Office
Job type: Contract to Hire
- Coordinate with CoreTech for Linux server procurement
- Coordinate with CoreTech to sunset Linux servers
- Coordinate with CoreTech to set up DNS and load balancers
- Application server setup and maintenance
- File transfer protocol setup and support
- Disk space management
- SSL certificate setup/renewals and maintenance
- Log rotation
- Monthly patching support
- Vulnerability management
- Troubleshooting of DeloitteOmnia Agent (NO_CONNECTIVITY), GE Infa agents, PostgreSQL DB and ODBC connection issues, and ISQL DB connectivity issues for the Databricks unixODBC driver on the DeloitteOmnia server
- Databricks unixODBC driver installation and configuration for the DeloitteOmnia server
- Maintenance and support of DeloitteOmnia infrastructure
- DB migration support for applications
- Application job scheduling and JAR deployment support; we will help the app team migrate their application jobs
- Automation support for a couple of use cases
- App/web service restarts
- Production release deployment support
- Access management (Jenkins, GitHub repo)
- Monitoring: New Relic synthetic, infrastructure, and page-view setup and maintenance
- NR password updates for FSSO for multiple monitors
- Daily monitoring of NR alerts; working on NR monitor issues
- Maintain NR Insights and Ops dashboards
Posted 1 day ago
6.0 - 11.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Experience in cloud platforms, e.g., AWS, GCP, Azure, etc. Experience in distributed technology tools, viz. SQL, Spark, Python, PySpark, Scala. Performance tuning: optimize SQL and PySpark for performance. Airflow workflow scheduling tool for creating data pipelines. GitHub source control tool, and experience creating/configuring Jenkins pipelines. Experience in EMR/EC2, Databricks, etc. DWH tools, incl. SQL database, Presto, and Snowflake. Streaming, serverless architecture.
Posted 1 day ago
10.0 - 15.0 years
17 - 22 Lacs
Hyderabad, Pune
Work from Office
Oracle HCM Security Lead (Strategy)

What You'll Do
- Lead the transition to RBAC across Oracle HCM (Core HR, Payroll, Absence, Time, Talent) and downstream systems with complex integrations.
- Architect an end-to-end access governance framework covering application, integration, and data warehouse layers, including Databricks, OAC/OTBI, and third-party data hubs.
- Define and standardize personas, access tiers, and Areas of Responsibility (AOR) with business process owners.
- Partner with data platform and analytics teams to align access policies across structured/unstructured data sources used for reporting, workforce intelligence, and ML.
- Integrate security policies with Okta and identity management tools, ensuring consistent enforcement across apps and data endpoints.
- Enable secure self-service analytics by implementing column- and row-level security within platforms like OTBI and Databricks, ensuring compliance with SOX, GDPR, and HIPAA.
- Manage the security lifecycle for Oracle HCM and connected platforms: provisioning, auditing, change control, and SoD enforcement.
- Serve as the employee and candidate data access security authority, participating in solution design, release planning, and cross-functional governance reviews, consulting with legal, HRBPs, comms, and engineering security where applicable.

Basic Qualifications
- 10+ years of experience in enterprise security, application governance, or architecture roles, with deep expertise in Oracle Fusion HCM and SaaS integration landscapes.
- Proven experience designing and implementing enterprise RBAC frameworks, with hands-on involvement across apps and data layers.
- Deep understanding of big data platforms (Databricks, Snowflake, etc.) and how access, classification, and lineage apply in modern data environments.
- Experience with analytics platform security, including OTBI, OAC, and integration with business intelligence tools.
- Familiarity with identity federation and access policy integration via Okta, Azure AD, or similar tools.
- Strong understanding of compliance frameworks (SOX, GDPR, HIPAA) and the ability to translate policies into technical access controls.
- Skilled communicator, capable of aligning technical security strategy with business priorities and presenting to senior leadership.

Preferred Qualifications
- Experience with multi-phase Oracle HCM deployments or Workday-to-Oracle transitions.
- Exposure to data mesh or federated data ownership models.
- Background in data pipeline security and governance, especially in Databricks, Spark, or similar platforms.
- Strong knowledge of RACI, persona-based design, and data domain ownership strategies in global organizations.
- Demonstrated ability to build security into the SDLC, with tools and controls supporting agile SaaS environments.
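Row-level security of the kind this posting describes is commonly implemented by exposing a filtered view rather than the base table, so each persona sees only its own slice. Below is a hedged sketch using Python's built-in sqlite3; the table names and the session-context mechanism are invented for illustration (in Databricks this would typically be done with dynamic views or Unity Catalog row filters instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE hcm_salaries (emp_id INTEGER, region TEXT, salary REAL);
    INSERT INTO hcm_salaries VALUES (1, 'EMEA', 90000), (2, 'APAC', 70000), (3, 'EMEA', 85000);

    -- Session-scoped table standing in for the caller's identity context
    -- (a real system would derive this from the authenticated principal).
    CREATE TABLE session_ctx (region TEXT);
    INSERT INTO session_ctx VALUES ('EMEA');

    -- Row-level security: analysts query the view, never the base table,
    -- and only see rows matching their own region.
    CREATE VIEW v_salaries AS
        SELECT s.emp_id, s.region, s.salary
        FROM hcm_salaries s
        JOIN session_ctx c ON s.region = c.region;
""")

visible = conn.execute("SELECT emp_id FROM v_salaries ORDER BY emp_id").fetchall()
print(visible)  # [(1,), (3,)]
```

The governance work in the role is then largely about deciding, per persona and per data domain, what predicate each such view or policy should enforce, and auditing that base-table access is actually revoked.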
Posted 1 day ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
About the Role:
Grade Level (for internal use): 11

S&P Global Market Intelligence
The Role: Lead Data Engineer, Application Development
The Team: A collaborative team of database professionals responsible for building and maintaining the data products that power our clients.
The Impact: Designing, implementing, and maintaining database systems for Databricks and SQL Server.
What's in it for you: You'll have the opportunity to work with the latest technologies, learn from experienced professionals, and contribute to the success of high-impact projects.

Responsibilities:
- Designing, developing, and implementing database systems, including database schemas, stored procedures, and other database objects.
- Monitoring database performance and optimizing queries to enhance efficiency.
- Implementing performance tuning strategies and techniques.
- Documenting database schemas, configurations, and procedures.
- Providing support to users and stakeholders on database-related issues.

What We're Looking For:
- Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent
- Minimum 8+ years of strong database development experience
- Advanced SQL programming skills; relational and dimensional data modeling
- Understanding of database performance tuning on large datasets
- Excellent logical, analytical, and communication skills, with strong verbal and writing proficiencies
- Experience conducting application design and code reviews
- Proficiency with one or more of the following technologies: object-oriented programming; programming languages (Java, Scala, Python, C#); scripting (Bash, PowerShell)
- Extensive knowledge of database systems (Databricks, SQL Server, Oracle, Snowflake)
- Experience working in cloud computing environments such as AWS, GCP, and Azure
- Exposure to orchestration technologies like Airflow and ETL
- Experience with large-scale messaging systems such as Kafka
- Knowledge of finance fundamentals or the financial industry highly preferred

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People / Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: health care coverage designed for the mind and body.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories - United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings (Strategic Workforce Planning)
Posted 1 day ago
7.0 - 12.0 years
6 - 10 Lacs
Noida, Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 10
Responsibilities:
To work closely with various stakeholders to collect, clean, model and visualise datasets.
To create data-driven insights by researching, designing and implementing ML models, delivering insights and implementing action-oriented solutions to complex business problems.
To drive ground-breaking ML technology within the Modelling and Data Science team.
To extract hidden, valuable insights and enrich the accuracy of the datasets.
To leverage technology and automate workflows, creating modernized operational processes aligned with the team strategy.
To understand, implement, manage, and maintain analytical solutions & techniques independently.
To collaborate and coordinate with the Data, Content and Modelling teams and provide analytical assistance for various commodity datasets.
To drive and maintain high-quality processes, delivering projects in collaborative Agile team environments.
Requirements:
7+ years of programming experience, particularly in Python.
4+ years of experience working with SQL or NoSQL databases.
1+ years of experience working with PySpark.
University degree in Computer Science, Engineering, Mathematics, or related disciplines.
Strong understanding of big data technologies such as Hadoop, Spark, or Kafka.
Demonstrated ability to design and implement end-to-end scalable and performant data pipelines.
Experience with workflow management platforms like Airflow.
Strong analytical and problem-solving skills.
Ability to collaborate and communicate effectively with both technical and non-technical stakeholders.
Experience building solutions and working in an Agile environment.
Experience working with git or other source control tools.
Strong understanding of Object-Oriented Programming (OOP) principles and design patterns.
Knowledge of clean code practices and the ability to write well-documented, modular, and reusable code.
Strong focus on performance optimization and writing efficient, scalable code.
Nice to have: Experience working with oil, gas and energy markets; experience working with BI visualization applications (e.g. Tableau, Power BI); understanding of cloud-based services, preferably AWS; experience working with unified analytics platforms like Databricks; experience with deep learning and related toolkits: TensorFlow, PyTorch, Keras, etc.
About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.
What's In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Location - Bengaluru, Noida, Uttar Pradesh, Hyderabad
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
4+ years of hands-on experience using Azure Cloud, ADLS, ADF & Databricks Finance domain data stewardship Finance data reconciliation with SAP downstream systems Run/monitor pipelines and validate Databricks notebooks Able to interface with onsite/business stakeholders. Hands-on Python and SQL. Knowledge of Snowflake/DW is desirable.
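The reconciliation duty described above comes down to comparing control totals between a source extract and its downstream copy. A minimal sketch in plain Python (all names and figures are illustrative; a real check would query the SAP source and the Databricks target directly):

```python
# Illustrative reconciliation check: compare row counts and amount totals
# between a source extract and a downstream copy (hypothetical data).

def reconcile(source_rows, target_rows, amount_key="amount", tolerance=0.01):
    """Return a dict of checks: row-count match, amount-total match, and diffs."""
    source_total = sum(r[amount_key] for r in source_rows)
    target_total = sum(r[amount_key] for r in target_rows)
    return {
        "count_match": len(source_rows) == len(target_rows),
        "amount_match": abs(source_total - target_total) <= tolerance,
        "count_diff": len(source_rows) - len(target_rows),
        "amount_diff": round(source_total - target_total, 2),
    }

# Example: one record was dropped downstream, so both checks should fail.
source = [{"doc": "1000", "amount": 120.50},
          {"doc": "1001", "amount": 75.25},
          {"doc": "1002", "amount": 10.00}]
target = [{"doc": "1000", "amount": 120.50},
          {"doc": "1001", "amount": 75.25}]

result = reconcile(source, target)
```

In practice the same count/total comparison would be expressed as two SQL queries (one per system) whose results feed a check like this.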
Posted 1 day ago
6.0 - 8.0 years
8 - 11 Lacs
Hyderabad
Work from Office
Immediate job openings: Big Data Engineer (Pan India, Contract)
Experience: 6+ years
Skill: Big Data Engineer
Location: Pan India
Notice Period: Immediate
Employment Type: Contract
PySpark, Azure Databricks, experience with Workflows, Unity Catalog, managed/external data with Delta tables.
Posted 1 day ago
7.0 - 12.0 years
8 - 12 Lacs
Hyderabad
Work from Office
As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data and reporting solutions on the cloud, primarily using the Microsoft Azure platform and Power BI. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions. This position will design, develop, implement, test, deploy, monitor, and maintain the delivery of data enrichments and reporting models using MS Fabric/PBI infrastructure. Primary Responsibilities: Work with the BI team to build and deploy healthcare data enrichments Design and develop high-performance reporting models and dashboards using Power BI/MS Fabric Deploy and manage Power BI dashboards using the Power BI service Ensure connectivity with various sources: flat files, on-prem databases, Snowflake, Databricks, using Live, DirectQuery and Import connections Design and develop Azure Databricks jobs using Python & Spark Develop and maintain CI/CD processes using Jenkins, GitHub, Maven Maintain high-quality documentation of data definitions, transformations, and processes to ensure data governance and security Continuously explore new Azure and Power BI features and capabilities; assess their applicability to business needs Create detailed documentation for PBI/MS Fabric processes, architecture, and implementation patterns Prepare case studies and technical write-ups to showcase successful implementations and lessons learned Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate
architecture Contribute to full lifecycle project implementations, from design and development to deployment and monitoring Ensure solutions adhere to security, compliance, and governance standards Identify solutions to non-standard requests and problems Support and maintain the self-service BI warehouse Work with business owners to add new enrichments and to design and implement new reporting models Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Undergraduate degree or equivalent experience 7+ years of overall experience in Data & Analytics engineering 5+ years of experience designing, developing and deploying MS Fabric and Power BI dashboards 5+ years of experience working with Azure, Databricks, and PySpark/Spark-SQL Solid experience with CI/CD tools such as Jenkins, GitHub, etc. In-depth understanding of MS Fabric and Power BI, including deploying and managing dashboards, Apps and Workspaces, along with access and security management Proven excellent communication skills Preferred Qualifications: Snowflake/Airflow experience Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone.
We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
Posted 1 day ago
12.0 - 17.0 years
17 - 22 Lacs
Noida
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice under RMI – Optum Advisory umbrella. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud primarily using Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions. Primary Responsibilities: Design and implement secure, scalable, and cost-effective cloud data architectures using cloud services such as Azure Data Factory (ADF), Azure Databricks, Azure Storage, Key Vault, Snowflake, Synapse Analytics, MS Fabric/Power BI etc. 
Define and lead data & cloud strategy, including migration plans, modernization of legacy systems, and adoption of new cloud capabilities Collaborate with clients to understand business requirements and translate them into optimal cloud architecture solutions, balancing performance, security, and cost Evaluate and compare cloud services (e.g., Databricks, Snowflake, Synapse Analytics) and recommend the best-fit solutions based on project needs and organizational goals Lead the full lifecycle of data platform and product implementations, from planning and design to deployment and support Drive cloud migration initiatives, ensuring smooth transition from on-premise systems while engaging and upskilling existing teams Lead and mentor a team of cloud and data engineers, fostering a culture of continuous learning and technical excellence Plan and guide the team in building Proof of Concepts (POCs), exploring new cloud capabilities, and validating emerging technologies Establish and maintain comprehensive documentation for cloud setup processes, architecture decisions, and operational procedures Work closely with internal and external stakeholders to gather requirements, present solutions, and ensure alignment with business objectives Ensure all cloud solutions adhere to security best practices, compliance standards, and governance policies Prepare case studies and share learnings from implementations to build organizational knowledge and improve future projects Build and analyze data engineering processes, act as an SME to troubleshoot performance issues, and suggest solutions to improve them Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, Maven, etc. Build a test framework for Databricks notebook jobs so they can be tested automatically before code deployment Continuously explore new Azure services and capabilities; assess their applicability to business needs Create detailed documentation for cloud processes, architecture, and implementation
patterns Contribute to full lifecycle project implementations, from design and development to deployment and monitoring Identify solutions to non-standard requests and problems Mentor and support existing on-prem developers for the cloud environment Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Undergraduate degree or equivalent experience 12+ years of overall experience in Data & Analytics engineering 10+ years of solid experience working as an Architect designing data platforms using Azure, Databricks, Snowflake, ADF, Data Lake, Synapse Analytics, Power BI, etc. 10+ years of experience working with a data platform or product using PySpark and Spark-SQL In-depth experience designing complex Azure architecture for various business needs & ability to come up with efficient design & solutions Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc. Experience in team leadership and people management Highly proficient and hands-on experience with Azure services, Databricks/Snowflake development, etc.
Excellent communication and stakeholder management skills Preferred Qualifications: Snowflake, Airflow experience Power BI development experience Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission. #NIC External Candidate Application Internal Employee Application
Posted 1 day ago
7.0 - 12.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud primarily using Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions. 
Primary Responsibilities: Ingest data from multiple on-prem and cloud data sources using various tools & capabilities in Azure Design and develop Azure Databricks processes using PySpark/Spark-SQL Design and develop orchestration jobs using ADF, Databricks Workflow Analyze data engineering processes being developed, act as an SME to troubleshoot performance issues and suggest solutions to improve them Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc. Build a test framework for Databricks notebook jobs so they can be tested automatically before code deployment Design and build POCs to validate new ideas, tools, and architectures in Azure Continuously explore new Azure services and capabilities; assess their applicability to business needs Create detailed documentation for cloud processes, architecture, and implementation patterns Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud Prepare case studies and technical write-ups to showcase successful implementations and lessons learned Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture Contribute to full lifecycle project implementations, from design and development to deployment and monitoring Ensure solutions adhere to security, compliance, and governance standards Monitor and optimize data pipelines and cloud resources for cost and performance efficiency Identify solutions to non-standard requests and problems Support and maintain the self-service BI warehouse Mentor and support existing on-prem developers for the cloud environment Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to
flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: Undergraduate degree or equivalent experience 7+ years of overall experience in Data & Analytics engineering 5+ years of experience working with Azure, Databricks, ADF and Data Lake 5+ years of experience working with a data platform or product using PySpark and Spark-SQL Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc. In-depth understanding of Azure architecture & ability to come up with efficient design & solutions Highly proficient in Python and SQL Proven excellent communication skills Preferred Qualifications: Snowflake, Airflow experience Power BI development experience Experience or knowledge of health care concepts – E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission. #NIC External Candidate Application Internal Employee Application
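The "test framework for Databricks notebook jobs" responsibility above usually means factoring notebook logic into plain functions that can be asserted on before deployment. A minimal sketch in plain Python (function and field names are hypothetical; a real framework would exercise the same logic through PySpark on small fixture DataFrames):

```python
# Hypothetical notebook transformation, factored out of the notebook so it
# can be unit-tested before the job is deployed.

def standardize_member_record(record):
    """Trim and upper-case the member ID; flag records missing a plan code."""
    member_id = record.get("member_id", "").strip().upper()
    plan_code = record.get("plan_code") or None  # treat "" as missing
    return {
        "member_id": member_id,
        "plan_code": plan_code,
        "is_valid": bool(member_id) and plan_code is not None,
    }

# Tiny fixture rows standing in for a Spark DataFrame sample.
rows = [
    {"member_id": " ab123 ", "plan_code": "P01"},
    {"member_id": "cd456", "plan_code": ""},
]
cleaned = [standardize_member_record(r) for r in rows]
```

Keeping the transformation out of notebook cells is the design choice that makes pre-deployment automation possible: the CI job imports the function and runs assertions without spinning up a cluster.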
Posted 1 day ago
3.0 - 7.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future. What We Offer: Location: Bangalore, IND At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities: Supports the design and development of program methods, processes, and systems to consolidate and analyze structured and unstructured, diverse "big data" sources.
Interfaces with internal customers for requirements analysis and compiles data for scheduled or special reports and analysis Supports project teams to develop analytical models, algorithms and automated processes, applying SQL understanding and Python programming, to cleanse, integrate and evaluate large datasets. Supports the timely development of products for manufacturing and process information by applying sophisticated data analytics. Able to quickly understand the requirement and translate it into executive-level presentation slides. Participates in the design, development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. used to drive key business decisions. Strong business & financial (P&L) acumen. Able to understand key themes, financial terms and data points to create appropriate summaries. Works with the business intelligence manager and other staff to assess various reporting needs. Analyzes reporting needs and requirements, assesses current reporting in the context of strategic goals and devises plans for delivering the most appropriate reporting solutions to users. Qualification: Bachelor's/Master's degree and 7-12 years of relevant experience as a data analyst Required technical skills in SQL, Azure, Python, Databricks, Tableau (good to have) PowerPoint and Excel expertise Experience in the Supply Chain domain. Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline and basic knowledge of related disciplines. Business Expertise: Has knowledge of best practices and how own area integrates with others; is aware of the competition and the factors that differentiate them in the market. Leadership: Acts as a resource for colleagues with less experience; may lead small projects with manageable risks and resource requirements. Problem Solving: Solves complex problems; takes a new perspective on existing solutions; exercises judgment based on the analysis of multiple sources of information.
Impact: Impacts a range of customer, operational, project or service activities within own team and other related teams; works within broad guidelines and policies. Interpersonal Skills: Explains difficult or sensitive information; works to build consensus. Additional Information: Time Type: Full time Employee Type: Assignee / Regular Travel: Yes, 20% of the Time Relocation Eligible: Yes Applied Materials is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
Posted 1 day ago
8.0 - 13.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Who We Are: Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips, the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world, like AI and IoT. If you want to work beyond the cutting edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible a Better Future. What We Offer: Location: Bangalore, IND At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied. Key Responsibilities: Provide technical support for applications built using .Net as well as Angular, React and other open-source technologies. Troubleshoot and resolve issues related to the front end, APIs and backend services. Collaborate with development teams to understand and resolve technical issues Assist in the deployment and maintenance of software applications.
Ensure the performance, quality, and responsiveness of applications and apply permanent fixes to critical and recurring issues Help maintain code quality, organization, and automation. Perform design reviews with the respective development teams for critical applications and provide inputs Document support processes and solutions for future reference. Stay up-to-date with the latest industry trends and technologies. Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 8+ years of experience in software development and support. Strong proficiency in .Net, Angular and React; proficient in Python for backend support Familiarity with the Hadoop ecosystem as well as Databricks Experience with RESTful APIs and web services. Solid understanding of front-end technologies, including HTML5, CSS3, and JavaScript, as well as Azure and AWS Strong background in SQL Server and other relational databases Familiarity with version control systems (e.g., Git) as well as Atlassian products for software development and code deployment mechanisms/DevOps Best practices in hosting applications on containerized platforms like OCP (on-prem and cloud), etc. Experience with open-source projects and contributions. Strong problem-solving skills and attention to detail. Excellent communication and teamwork skills. Certifications in relevant areas, especially Microsoft, will be a plus Functional Knowledge: Demonstrates conceptual and practical expertise in own discipline; knowledge of the semiconductor industry is nice to have Interpersonal Skills: Explains difficult or sensitive information; works to build consensus Additional Information: Time Type: Full time Employee Type: Assignee / Regular Travel: Yes, 10% of the Time Relocation Eligible: Yes Applied Materials is an Equal Opportunity Employer.
Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
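Much of the troubleshooting work this support role describes starts with application logs. As a hedged illustration (the log format, severity levels, and component names below are assumptions for the sketch, not taken from the posting), a small Python snippet that counts ERROR entries per component to surface recurring issues:

```python
import re
from collections import Counter

# Assumed log line format: "<date> <time> <LEVEL> <Component> <message>"
LOG_PATTERN = re.compile(r"^\S+ \S+ (?P<level>[A-Z]+) (?P<component>\S+) (?P<message>.*)$")

def summarize_errors(log_lines):
    """Count ERROR entries per component to spot recurring issues."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return counts

# Illustrative sample; component names are invented
sample = [
    "2024-01-15 10:32:01 ERROR OrderService Timeout calling payment API",
    "2024-01-15 10:32:05 INFO OrderService Retry succeeded",
    "2024-01-15 10:33:10 ERROR AuthService Token validation failed",
    "2024-01-15 10:34:22 ERROR OrderService Timeout calling payment API",
]
print(summarize_errors(sample))  # Counter({'OrderService': 2, 'AuthService': 1})
```

A summary like this is a common first step before filing or triaging the "critical and recurring issues" the posting mentions.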
Posted 1 day ago
6.0 - 11.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Bachelor's degree (Computer Science), master's degree, Technical Diploma or equivalent
At least 8 years of experience in a similar role
At least 5 years of experience on AWS and/or Azure
At least 5 years of experience on Databricks
At least 5 years of experience on multiple Azure and AWS PaaS solutions: Azure Data Factory, MSSQL, Azure Storage, AWS S3, Cognitive Search, CosmosDB, Event Hub, AWS Glue
Strong knowledge of AWS and Azure architecture design best practices
Knowledge of ITIL & Agile methodologies (certifications are a plus)
Experience working with DevOps tools such as Git, CI/CD pipelines, Ansible, Azure DevOps
Knowledge of Airflow and Kubernetes is an added advantage
Solid understanding of networking/security and Linux
Business-fluent English is required
Curious to continuously learn and explore new approaches/technologies
Able to work under pressure in a multi-vendor and multi-cultural team
Flexible, agile and adaptive to change
Customer-focused approach
Good communication skills
Analytical mindset
Innovation
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
As a Data Engineer, you will ensure the smooth functioning of our applications and data systems. Your expertise in data ingestion, release management, monitoring, incident review, Databricks, Azure Cloud, and data analysis will be instrumental in maintaining the reliability, availability, and performance of our applications and data pipelines. You will collaborate closely with cross-functional teams to support application deployments, monitor system health, analyze data, and provide timely resolutions to incidents. The ideal candidate should have a strong background in Azure DevOps, Azure Cloud (especially ADF), Databricks, and AWS Cloud.
List of Key Responsibilities:
Implement and manage data ingestion processes to acquire data from various sources and ensure its accuracy and completeness in our systems.
Collaborate with development and operations teams to facilitate the release management process, ensuring successful and efficient deployment of application updates and enhancements.
Monitor the performance and health of applications and data pipelines, promptly identifying and addressing any anomalies or potential issues.
Respond to incidents and service requests in a timely manner, conducting thorough incident reviews to identify root causes and implementing effective solutions to prevent recurrence.
Utilize Databricks and monitoring tools to analyze application logs, system metrics, and data to diagnose and troubleshoot issues effectively.
Analyze data-related issues, troubleshoot data quality problems, and propose solutions to optimize data workflows.
Utilize Azure Cloud services to deploy and manage applications and data infrastructure efficiently.
Document incident reports, resolutions, and support procedures for knowledge sharing and future reference.
Continuously improve support processes and workflows to enhance efficiency, minimize downtime, and improve the overall reliability of applications and data systems.
Stay up-to-date with the latest technologies and industry best practices related to application support, data analysis, and cloud services.
Technical Knowledge:
Technology | Level of expertise | Priority
Azure Cloud | Senior | Must
Databricks | Senior | Must
ADF | - | Must
Scala | - | Nice to have
Spark | - | Nice to have
AWS Cloud | - | Nice to have
Python | - | Nice to have
Rstudio/Rconnect | Junior | Nice to have
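The ingestion responsibility above, "ensure its accuracy and completeness", is essentially batch-level data validation. A minimal sketch in plain Python (the column names and sample records are illustrative assumptions; on Databricks the same checks would typically run in PySpark):

```python
def completeness_report(rows, required_columns):
    """Check row count and per-column missing values for an ingested batch.

    rows: list of dicts, one per record (e.g. parsed from a landing file).
    Returns the total row count and the number of null/empty values per column.
    """
    report = {"row_count": len(rows), "missing": {}}
    for col in required_columns:
        missing = sum(1 for r in rows if r.get(col) in (None, ""))
        report["missing"][col] = missing
    return report

# Illustrative batch with two incomplete records
batch = [
    {"id": 1, "customer": "acme", "amount": 120.0},
    {"id": 2, "customer": "", "amount": 75.5},
    {"id": 3, "customer": "globex", "amount": None},
]
print(completeness_report(batch, ["id", "customer", "amount"]))
# {'row_count': 3, 'missing': {'id': 0, 'customer': 1, 'amount': 1}}
```

In practice a report like this would gate the pipeline: a non-zero missing count on a required column triggers an incident or quarantines the batch rather than loading it downstream.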
Posted 2 days ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Immediate Openings for DotNet Developer, Bangalore (Contract)
Skill: DotNet Developer
Notice Period: Immediate
Employment Type: Contract
Job Description
Bachelor's degree in Computer Science, Information Systems, or other relevant subject area, or equivalent experience
8-10+ years of experience with .NET Framework, .NET Core, ASP.NET, VB.NET, HTML, web services, Web API, SharePoint, Power Automate, Microsoft apps, MySQL, SQL Server
Experience with client and server architecture; maintaining a code base via GitHub would be an added benefit
Robust SQL knowledge, including complex nested queries, procedures and triggers
Good-to-have skills from a data-tools perspective: PySpark, Athena, Databricks, AWS Redshift technologies to analyse and bring data into a Data Lake; knowledge of building reports and Power BI
Good knowledge of business processes, preferably knowledge of related modules and strong cross-modular skills incl. interfaces
Expert application and customizing knowledge of the standard software used and other regional solutions in the assigned module
Ability to absorb sophisticated technical information and communicate effectively to both technical and business audiences
Knowledge of applicable data privacy practices and laws
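The "complex nested queries" requirement above can be made concrete with a self-contained sketch using Python's built-in sqlite3 module (the table, columns, and data are invented for the example; the same SQL pattern applies in SQL Server or MySQL):

```python
import sqlite3

# In-memory database with a toy orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (2, "acme", 300.0), (3, "globex", 150.0), (4, "initech", 50.0)],
)

# Nested query: customers whose total spend exceeds the average per-customer total.
# The derived table in FROM aggregates per customer; the scalar subquery in WHERE
# computes the average of those per-customer totals.
query = """
SELECT customer, total
FROM (SELECT customer, SUM(amount) AS total FROM orders GROUP BY customer)
WHERE total > (
    SELECT AVG(t) FROM (SELECT SUM(amount) AS t FROM orders GROUP BY customer)
)
ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('acme', 400.0)]
```

Here the per-customer totals are 400, 150, and 50, so the average is 200 and only acme exceeds it; interviews for roles like this often probe exactly this kind of derived-table plus scalar-subquery construction.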
Posted 2 days ago