16.0 - 21.0 years
4 - 8 Lacs
Kolkata
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 16 years full time education
Cloud Database Engineer HANA
Required Skills: SAP HANA Database Administration - knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability; proficiency in monitoring and maintaining the health and performance of high-availability systems; experience with public cloud platforms such as GCP, AWS, or Azure; strong troubleshooting skills and the ability to provide effective resolutions for technical issues.
Desired Skills: Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop, or Postgres; growth and product mindset and a strong focus on automation; working knowledge of Kubernetes for container orchestration and scalability.
Activities: Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives. Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments. Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed. Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners.
Requirements: Bachelor's degree in computer science, engineering, or a related field. Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems. Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes. Strong troubleshooting skills and the ability to provide effective resolutions for technical issues. Familiarity with public cloud platforms such as GCP, AWS, or Azure. Understands Agile principles and methodologies.
Qualification: 16 years full time education
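As an illustration of the monitoring work described above, here is a minimal, hypothetical SAP HANA system-replication health check using SAP's hdbcli Python driver. The host, port, and credentials are placeholders, and the M_SERVICE_REPLICATION column names may vary across HANA versions; treat this as a sketch, not a definitive implementation.

```python
# Hypothetical HANA replication health check; host and credentials are placeholders.
from hdbcli import dbapi  # SAP's Python driver for HANA


def replication_status(host: str, port: int, user: str, password: str):
    """Return (host, secondary site, status) rows from HANA's monitoring view."""
    conn = dbapi.connect(address=host, port=port, user=user, password=password)
    try:
        cur = conn.cursor()
        # M_SERVICE_REPLICATION exposes system replication state per service;
        # column names assumed here and may differ by HANA version.
        cur.execute(
            "SELECT HOST, SECONDARY_SITE_NAME, REPLICATION_STATUS "
            "FROM SYS.M_SERVICE_REPLICATION"
        )
        return cur.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for host, site, status in replication_status("hana-primary", 30015, "MONITOR", "secret"):
        if status != "ACTIVE":
            print(f"ALERT: {host} -> {site} replication is {status}")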
Posted 5 days ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
The role involves performing Big Data Administration and Engineering activities on multiple open-source platforms such as Hadoop, Kafka, HBase, and Spark. The successful candidate will possess strong troubleshooting and debugging skills. Other responsibilities include effective root cause analysis of major production incidents and the development of learning documentation. The person will identify and implement high-availability solutions for services with a single point of failure, and will plan and perform capacity expansions and upgrades in a timely manner to avoid scaling issues and bugs. This includes automating repetitive tasks to reduce manual effort and prevent human errors. The successful candidate will tune alerting and set up observability to proactively identify issues and performance problems, and will work closely with Level-3 teams in reviewing new use cases and cluster hardening techniques to build robust and reliable platforms. The role involves creating standard operating procedure documents and guidelines on effectively managing and utilizing the platforms, and leveraging DevOps tools and disciplines (incident, problem, and change management) in day-to-day operations. The individual will ensure that the Hadoop platform can effectively meet performance and service level agreement requirements, perform security remediation, automation, and self-healing as required, and concentrate on developing automations and reports to minimize manual effort through tools such as shell scripting, Ansible, or Python scripting, or any other programming language.
Job location: Bangalore, Chennai, Hyderabad, Pune
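As an illustration of the kind of repetitive check such a role automates, here is a minimal, hypothetical HDFS health probe against the NameNode's JMX endpoint. The hostname, port (9870 is the Hadoop 3 default; Hadoop 2 uses 50070), and alert threshold are placeholders.

```python
# Hypothetical HDFS health probe via the NameNode JMX endpoint.
# Hostname, port, and thresholds are assumptions for illustration.
import json
import urllib.request

NAMENODE = "http://namenode.example.com:9870"
USED_PCT_ALERT = 80.0


def fs_state() -> dict:
    """Fetch the FSNamesystemState bean, which carries capacity and node counts."""
    url = f"{NAMENODE}/jmx?qry=Hadoop:service=NameNode,name=FSNamesystemState"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["beans"][0]


if __name__ == "__main__":
    state = fs_state()
    used_pct = 100.0 * state["CapacityUsed"] / state["CapacityTotal"]
    if state["NumDeadDataNodes"] > 0:
        print(f"ALERT: {state['NumDeadDataNodes']} dead DataNode(s)")
    if used_pct > USED_PCT_ALERT:
        print(f"ALERT: HDFS {used_pct:.1f}% full, plan capacity expansion")
```

A cron job or Ansible scheduled task could run this probe and feed the output into the alerting pipeline the posting describes.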
Posted 5 days ago
2.0 - 5.0 years
5 - 9 Lacs
Chennai
Work from Office
About The Role
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Alteryx
Good to have skills: Hadoop Administration
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in brainstorming sessions to explore innovative solutions, ensuring that the applications align with business objectives and enhance user experience. Additionally, you will participate in testing and validation processes to ensure the applications function as intended, while also providing support and guidance to team members throughout the development lifecycle.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation for application designs and specifications.
Professional & Technical Skills:
- Must have skills: Proficiency in Alteryx.
- Good to have skills: Experience with Hadoop Administration.
- Strong analytical skills to assess business requirements and translate them into technical specifications.
- Experience in application design methodologies and best practices.
- Familiarity with data integration and transformation processes.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Alteryx.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 5 days ago
5.0 - 10.0 years
7 - 11 Lacs
Pune, Chennai, Bengaluru
Work from Office
Hadoop admin support experience is a key criterion; every resource is expected to have at least 2 years of support experience. Visa's Hadoop clusters span thousands of nodes per cluster, and senior resources are expected to have 3-5 years of Hadoop Admin support experience on clusters of at least 500+ nodes.
Key skills: Hadoop administration; automation (Ansible, shell scripting, or Python scripting); DevOps skills (should be able to code in at least one language, preferably Python).
Program/Project Overview: The role is part of the PRE-Big Data team responsible for managing Hadoop platforms. The resource will work during IND hours, and it is a hybrid role. Candidates will focus on improving the performance, reliability, and efficiency of Big Data platforms.
Engagement Deliverable(s): The role involves performing Big Data Administration and Engineering activities on multiple open-source platforms such as Hadoop, Kafka, HBase, and Spark. The successful candidate will possess strong troubleshooting and debugging skills. Other responsibilities include effective root cause analysis of major production incidents and the development of learning documentation. The person will identify and implement high-availability solutions for services with a single point of failure, and will plan and perform capacity expansions and upgrades in a timely manner to avoid scaling issues and bugs. This includes automating repetitive tasks to reduce manual effort and prevent human errors. The successful candidate will tune alerting and set up observability to proactively identify issues and performance problems, and will work closely with Level-3 teams in reviewing new use cases and cluster hardening techniques to build robust and reliable platforms. The role involves creating standard operating procedure documents and guidelines on effectively managing and utilizing the platforms, and leveraging DevOps tools and disciplines (incident, problem, and change management) in day-to-day operations. The individual will ensure that the Hadoop platform can effectively meet performance and service level agreement requirements, perform security remediation, automation, and self-healing as required, and concentrate on developing automations and reports to minimize manual effort through tools such as shell scripting, Ansible, or Python scripting, or any other programming language.
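As one concrete flavour of the alerting automation described above, here is a minimal, hypothetical sketch that flags under-replicated Kafka partitions by shelling out to the stock kafka-topics.sh CLI; the script path and broker address are placeholders.

```python
# Hypothetical check for under-replicated Kafka partitions using the stock
# kafka-topics.sh tool; broker address and script path are placeholders.
import subprocess


def under_replicated(bootstrap: str = "broker1:9092") -> list[str]:
    out = subprocess.run(
        ["/opt/kafka/bin/kafka-topics.sh",
         "--bootstrap-server", bootstrap,
         "--describe", "--under-replicated-partitions"],
        capture_output=True, text=True, check=True,
    )
    # Each non-empty output line describes one under-replicated partition.
    return [line for line in out.stdout.splitlines() if line.strip()]


if __name__ == "__main__":
    problems = under_replicated()
    if problems:
        print(f"ALERT: {len(problems)} under-replicated partition(s)")
        for line in problems:
            print(line)
```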
Posted 5 days ago
4.0 - 8.0 years
0 Lacs
Haryana
On-site
As a Hadoop Admin, you will be responsible for managing and supporting Hadoop clusters and various components such as HDFS, HBase, Hive, Sentry, Hue, YARN, Sqoop, Spark, Oozie, ZooKeeper, Flume, and Solr. With a minimum of 4 years of experience in Hadoop administration, you will play a crucial role in installing, configuring, maintaining, troubleshooting, and monitoring these clusters to ensure their efficient functioning in production support projects. Your primary duties will include integrating analytical tools like Datameer, Paxata, DataRobot, H2O, MRS, Python, R-Studio, SAS, and Dataiku-Bluedata with Hadoop, along with conducting job-level troubleshooting for components such as YARN, Impala, and others. Proficiency in Unix/Linux and scripting is essential for this role, and you should also have experience with tools like Talend, MySQL Galera, Pepperdata, Autowatch, Netbackup, Solix, UDeploy, and RLM. Additionally, you will be tasked with troubleshooting application issues across various environments and operating platforms to ensure smooth operations. The ideal candidate for this position should have 4 to 6 years of relevant experience, strong knowledge of Hadoop administration, and the ability to excel in a fast-paced and dynamic work environment. Our hiring process consists of screening conducted by the HR team, followed by two technical rounds, and culminating in a final HR round. If you are passionate about Big Data and possess the required skills and experience for this role, we invite you to join our team as a Hadoop Admin and contribute to our exciting projects in Gurgaon, Bangalore, and Hyderabad.
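For the job-level troubleshooting mentioned above, here is a minimal, hypothetical sketch that pulls recently failed applications from the YARN ResourceManager REST API; the RM address is a placeholder.

```python
# Hypothetical triage helper listing recently failed YARN applications via the
# ResourceManager REST API; the RM address is a placeholder.
import json
import urllib.request

RM = "http://resourcemanager.example.com:8088"


def failed_apps(limit: int = 10) -> list[dict]:
    """Return failed application records from the cluster apps endpoint."""
    url = f"{RM}/ws/v1/cluster/apps?states=FAILED&limit={limit}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = json.load(resp)
    return (body.get("apps") or {}).get("app") or []


if __name__ == "__main__":
    for app in failed_apps():
        # 'diagnostics' usually carries the first-line failure reason.
        print(app["id"], app["name"], "--", app.get("diagnostics", "")[:120])
```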
Posted 1 week ago
16.0 - 21.0 years
4 - 8 Lacs
Kolkata
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 16 years full time education
Cloud Database Engineer HANA
Required Skills: SAP HANA Database Administration - knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability; proficiency in monitoring and maintaining the health and performance of high-availability systems; experience with public cloud platforms such as GCP, AWS, or Azure; strong troubleshooting skills and the ability to provide effective resolutions for technical issues.
Desired Skills: Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop, or Postgres; growth and product mindset and a strong focus on automation; working knowledge of Kubernetes for container orchestration and scalability.
Activities: Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives. Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments. Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed. Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners.
Requirements: Bachelor's degree in computer science, engineering, or a related field. Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems. Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes. Strong troubleshooting skills and the ability to provide effective resolutions for technical issues. Familiarity with public cloud platforms such as GCP, AWS, or Azure. Understands Agile principles and methodologies.
Qualification: 16 years full time education
Posted 1 month ago
3.0 - 6.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set: technically strong and hands-on; self-driven; good client communication skills; able to work independently and a good team player; flexible to work in PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.
Posted 1 month ago
3.0 - 6.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set: technically strong and hands-on; self-driven; good client communication skills; able to work independently and a good team player; flexible to work in PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.
Posted 1 month ago
5.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
About the Role
As an SRE (Big Data) Engineer at PhonePe (5 to 7 years of experience), you will be responsible for ensuring the stability, scalability, and performance of distributed systems operating at scale. You will collaborate with development, infrastructure, and data teams to automate operations, reduce manual effort, handle incidents, and continuously improve system reliability. This role requires strong problem-solving skills, operational ownership, and a proactive approach to mentoring and driving engineering excellence.
Roles and Responsibilities
- Ensure the ongoing stability, scalability, and performance of PhonePe's Hadoop ecosystem and associated services.
- Manage and administer Hadoop infrastructure including HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, Pinot, and Druid.
- Automate BAU operations through scripting and tool development.
- Perform capacity planning, system tuning, and performance optimization.
- Set up, configure, and manage Nginx in high-traffic environments (see the sketch after this listing).
- Administer and troubleshoot Linux and Big Data systems, including networking (IP, iptables, IPsec).
- Handle on-call responsibilities, investigate incidents, perform root cause analysis, and implement mitigation strategies.
- Collaborate with infrastructure, network, database, and BI teams to ensure data availability and quality.
- Apply system updates and patches, and manage version upgrades in coordination with security teams.
- Build tools and services to improve observability, debuggability, and supportability.
- Participate in Kerberos and LDAP administration.
- Perform capacity planning and performance tuning of Hadoop clusters.
- Work with configuration management and deployment tools like Puppet, Chef, Salt, or Ansible.
Skills Required
- Minimum 1 year of Linux/Unix system administration experience.
- Over 4 years of hands-on experience in Hadoop administration.
- Minimum 1 year of experience managing infrastructure on public cloud platforms like AWS, Azure, or GCP (optional).
- Strong understanding of networking, open-source tools, and IT operations.
- Proficiency in scripting and programming (Perl, Golang, or Python).
- Hands-on experience maintaining and managing Hadoop ecosystem components such as HDFS, YARN, HBase, and Kafka.
- Strong operational knowledge of systems (CPU, memory, storage, OS-level troubleshooting).
- Experience administering and tuning relational and NoSQL databases.
- Experience configuring and managing Nginx in production environments.
- Excellent communication and collaboration skills.
Good to Have
- Experience designing and maintaining Airflow DAGs to automate scalable and efficient workflows.
- Experience in ELK stack administration.
- Familiarity with monitoring tools like Grafana, Loki, Prometheus, and OpenTSDB.
- Exposure to security protocols and tools (Kerberos, LDAP).
- Familiarity with distributed systems like Elasticsearch or similar high-scale environments.
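For the Nginx management duties above, a minimal, hypothetical safe-reload wrapper: it runs nginx -t and reloads only when the configuration test passes. It assumes nginx is on PATH and the script runs with sufficient privileges.

```python
# Hypothetical safe-reload wrapper for Nginx: validate the config first,
# reload only if the test passes. Assumes nginx is on PATH and privileges allow it.
import subprocess
import sys


def safe_reload() -> int:
    test = subprocess.run(["nginx", "-t"], capture_output=True, text=True)
    if test.returncode != 0:
        print("Config test failed, NOT reloading:", test.stderr, file=sys.stderr)
        return test.returncode
    subprocess.run(["nginx", "-s", "reload"], check=True)
    print("nginx reloaded with validated configuration")
    return 0


if __name__ == "__main__":
    sys.exit(safe_reload())
```

Gating the reload on the config test is a standard guard against pushing a broken configuration to a high-traffic proxy tier.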
PhonePe Full Time Employee Benefits (not applicable for intern or contract roles):
- Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy
Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
Posted 1 month ago
8.0 - 10.0 years
15 - 27 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Please find below the detailed JD for Bigdata Administrator.
Key Responsibilities:
- Lead CDP platform upgrades and migrations, with strong hands-on execution and documentation from planning to go-live.
- Administer and tune Hadoop ecosystem services. Core: HDFS, YARN, Hive, Hue, Impala, Sqoop, Oozie. Streaming: Apache Kafka (broker/topic ops), Apache Flink (streaming jobs). NoSQL/Query: HBase, Phoenix. Security: Kerberos, Ranger, LDAP, TLS.
- Manage Cribl Stream deployments: build, configure, secure, and optimize data routing pipelines.
- Monitor and optimize platform performance using Cloudera Manager, New Relic, BigPanda, Prometheus, Grafana, or other observability tools.
- Design and implement backup, recovery, HA, and DR strategies for critical data infrastructure.
- Automate platform operations using Python, Bash/Shell, Scala, and CI/CD workflows.
- Work cross-functionally with Data Engineers, DevOps, InfoSec, and Cloud Engineering teams to support data pipeline reliability and scalability.
- Manage deployments using Docker, Kubernetes, Jenkins, Bitbucket, and optionally Ansible or GitOps practices.
- Support and maintain cloud-native or hybrid deployments, especially in GCP (Anthos) environments.
- Produce and maintain robust architecture documentation, runbooks, and operational SOPs.
Required Qualifications:
- 7+ years of experience in Big Data infrastructure, administration, and operations.
- Proven Cloudera CDP (7.x) experience, including production-grade migrations (7.1.6 to 7.1.9+).
- Deep expertise in: Apache Spark job tuning and executor/resource optimization; Apache Kafka security (SASL_SSL, GSSAPI), scaling, and topic lifecycle management (see the connectivity sketch after this listing); Apache Flink real-time stream processing in HA environments; Cribl Stream full-lifecycle management and observability integration; HBase and Phoenix schema evolution, read/write tuning, and replication.
- Scripting and automation: proficient in Python and Shell (Bash), optionally Scala.
- Security-first mindset: working knowledge of Kerberos, Ranger policies, LDAP integration, and TLS configuration.
- DevOps experience: hands-on with Docker, Kubernetes, Jenkins, Bitbucket, and monitoring tools like Grafana/Prometheus.
- Comfortable supporting large-scale, multi-tenant environments and production on-call rotations.
Preferred Qualifications:
- Cloudera Certified Administrator (CCA) or equivalent industry certification.
- Experience with Big Data on-prem, cloud, and hybrid data infrastructure, particularly Google Cloud Platform (GCP) and Anthos clusters.
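For the Kafka security expertise named above, a minimal, hypothetical connectivity smoke test for a Kerberized (SASL_SSL/GSSAPI) cluster using the kafka-python client. Broker addresses, the CA bundle path, and a valid Kerberos ticket (plus the gssapi package) are assumptions.

```python
# Hypothetical smoke test for a Kerberized Kafka cluster; broker names and
# the CA bundle path are placeholders, and a kinit'ed ticket is assumed.
from kafka import KafkaConsumer


def list_topics() -> set[str]:
    consumer = KafkaConsumer(
        bootstrap_servers=["broker1.example.com:9093"],
        security_protocol="SASL_SSL",
        sasl_mechanism="GSSAPI",                 # Kerberos
        sasl_kerberos_service_name="kafka",
        ssl_cafile="/etc/security/ca-chain.pem",
    )
    try:
        # A metadata round-trip proves that TLS and Kerberos auth both work.
        return consumer.topics()
    finally:
        consumer.close()


if __name__ == "__main__":
    print(sorted(list_topics()))
```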
Posted 1 month ago
6.0 - 11.0 years
8 - 15 Lacs
Noida
Work from Office
We are hiring for the position "Hadoop Admin".
Skill Set: Hadoop, Cloudera, Big Data, Spark, Hive, HDFS, YARN, Kafka, SQL Database, Ranger
Experience: 7 years
Location: Noida, Sector-135
Work Mode: Work from Office
Budget: 14-15 LPA
Posted 1 month ago
5.0 - 8.0 years
4 - 8 Lacs
Kolkata
Work from Office
We are seeking a highly skilled and experienced Hadoop Administrator to join our dynamic team. The ideal candidate will have extensive experience in managing and optimizing Hadoop clusters, ensuring high performance and availability. You will work with a variety of big data technologies and play a pivotal role in managing data integration, troubleshooting infrastructure issues, and collaborating with cross-functional teams to streamline data workflows.
Key Responsibilities:
- Install, configure, and maintain Hadoop clusters, ensuring high availability, scalability, and performance.
- Manage and monitor various Hadoop ecosystem components, including HDFS, YARN, Hive, Impala, and other related technologies.
- Oversee the integration of data from Oracle Flexcube and other source systems into the Cloudera Data Platform.
- Troubleshoot and resolve complex issues related to Hadoop infrastructure, performance, and applications.
- Collaborate with cross-functional teams including data engineers, analysts, and architects to optimize data workflows and processes.
- Implement and manage data backup, recovery plans, and disaster recovery strategies for Hadoop clusters (a snapshot-based sketch follows this listing).
- Perform regular health checks on the Hadoop ecosystem, including managing logs, capacity planning, and system updates.
- Develop, test, and optimize scripts to automate system maintenance and data management tasks.
- Ensure compliance with internal security policies and industry best practices for data protection.
- Provide training and guidance to junior team members and help in knowledge sharing within the team.
- Create and maintain documentation related to Hadoop administration processes, system configurations, troubleshooting steps, and best practices.
- Stay updated with the latest trends in Hadoop technologies and suggest improvements and new tools as necessary.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of hands-on experience in Hadoop administration, with a preference for candidates from the banking or financial sectors.
- Strong knowledge of Oracle Flexcube, Cloudera Data Platform, Hadoop, Hive, Impala, and other big data technologies.
- Proven experience in managing and optimizing large-scale Hadoop clusters, including cluster upgrades and performance tuning.
- Expertise in configuring and tuning Hadoop-related services (e.g., HDFS, YARN, MapReduce).
- Strong understanding of data security principles and implementation of security protocols within Hadoop.
- Excellent analytical, troubleshooting, and problem-solving skills.
- Strong communication and interpersonal skills with the ability to work collaboratively within cross-functional teams.
- Ability to work independently, manage multiple priorities, and meet deadlines.
- Certification in Hadoop administration or related fields is a plus.
- Experience with scripting languages such as Python, Shell, or Perl is desirable.
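As a sketch of the backup responsibility above, a minimal, hypothetical snapshot-based HDFS backup with simple retention. The path and retention count are placeholders, and the directory must already be snapshottable (enabled via hdfs dfsadmin -allowSnapshot).

```python
# Hypothetical snapshot-based HDFS backup with retention; the path is a
# placeholder and must already be snapshottable.
import subprocess
from datetime import datetime, timezone

PATH = "/data/critical"
KEEP = 7  # retain the seven most recent snapshots


def run(*args: str) -> str:
    """Run an 'hdfs dfs' subcommand and return its stdout."""
    return subprocess.run(["hdfs", "dfs", *args],
                          capture_output=True, text=True, check=True).stdout


if __name__ == "__main__":
    name = "backup-" + datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    run("-createSnapshot", PATH, name)
    # Snapshot names are the last path component under <dir>/.snapshot.
    snaps = sorted(line.split("/")[-1]
                   for line in run("-ls", f"{PATH}/.snapshot").splitlines()
                   if "/.snapshot/" in line)
    for old in snaps[:-KEEP]:
        run("-deleteSnapshot", PATH, old)
```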
Posted 1 month ago
5.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
We are looking for a skilled Hadoop Administrator with 5 to 7 years of experience in Hadoop Engineering, working with Python, Ansible, and DevOps methodologies. The ideal candidate will have extensive experience in CDP/HDP cluster and server builds, including control nodes, worker nodes, edge nodes, and data copy from cluster to cluster.
Roles and Responsibilities:
- Design and implement scalable and efficient data processing systems using Hadoop technologies.
- Develop and maintain automation scripts using Python, Ansible, and other DevOps tools.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Troubleshoot and resolve complex technical issues related to Hadoop clusters.
- Ensure high-quality standards for data processing and security.
- Participate in code reviews and contribute to the improvement of the overall codebase.
Job Requirements:
- Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, and YARN.
- Experience with the Linux operating system and scripting languages such as Bash or Python.
- Proficiency in shell scripting and YAML configuration files.
- Good technical design, problem-solving, and debugging skills.
- Understanding of CI/CD concepts and familiarity with GitHub, Jenkins, and Ansible.
- Hands-on development of solutions using industry-leading cloud technologies.
- Working knowledge of GitOps and DevSecOps.
- Agile proficient and knowledgeable in other agile methodologies, ideally certified.
- Strong communication and networking skills.
- Ability to work autonomously and take accountability to execute and deliver on goals.
- Strong commitment to high-quality standards.
- Good communication skills and a sense of ownership to work as an individual contributor.
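Since the role calls out data copy from cluster to cluster, here is a minimal, hypothetical wrapper around Hadoop's stock DistCp tool; namenode addresses, paths, and the mapper count are placeholders.

```python
# Hypothetical cluster-to-cluster copy wrapper around Hadoop's DistCp tool;
# namenode addresses and paths are placeholders.
import subprocess


def distcp(src: str, dst: str, mappers: int = 20) -> None:
    # -update copies only missing/changed files; -m caps parallel map tasks.
    subprocess.run(
        ["hadoop", "distcp", "-update", "-m", str(mappers), src, dst],
        check=True,
    )


if __name__ == "__main__":
    distcp("hdfs://old-cluster-nn:8020/warehouse/events",
           "hdfs://new-cluster-nn:8020/warehouse/events")
```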
Posted 1 month ago
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
- Experience in SQL and an understanding of ETL best practices.
- Good hands-on experience in ETL/Big Data development.
- Extensive hands-on experience in Scala.
- Experience with Spark/YARN and troubleshooting Spark, Linux, and Python (see the log-retrieval sketch after this list).
- Setting up a Hadoop cluster; backup, recovery, and maintenance.
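For the Spark/YARN troubleshooting item, a minimal, hypothetical helper that pulls aggregated container logs for an application with the stock yarn logs CLI and surfaces likely error lines; the application ID is a placeholder, and YARN log aggregation must be enabled.

```python
# Hypothetical helper that fetches aggregated YARN container logs for a Spark
# application; the application ID is a placeholder.
import subprocess
import sys


def fetch_app_logs(app_id: str) -> str:
    out = subprocess.run(["yarn", "logs", "-applicationId", app_id],
                         capture_output=True, text=True, check=True)
    return out.stdout


if __name__ == "__main__":
    app = sys.argv[1] if len(sys.argv) > 1 else "application_1700000000000_0001"
    # Surface likely root-cause lines first when triaging a failed job.
    for line in fetch_app_logs(app).splitlines():
        if "ERROR" in line or "Exception" in line:
            print(line)
```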
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Big Data (Hadoop and Spark) skills. Programming languages: Python, Scala.
Job requirement: This position is for a mid-level data engineer with development experience who will focus on creating new capabilities in the Risk space while maturing our code base and development processes.
Qualifications:
- 3 or more years of work experience with a bachelor's degree, or more than 2 years of work experience with an advanced degree (e.g. Master's, MBA, JD, MD).
- Experience in creating data-driven business solutions and solving data problems using a wide variety of technologies such as Hadoop, Hive, Spark, MongoDB, and NoSQL, as well as traditional data technologies like RDBMS and MySQL, is a plus.
- Ability to program in one or more scripting languages such as Perl or Python, and one or more programming languages such as Java or Scala.
- Experience with data visualization and business intelligence tools like Tableau is a plus.
- Experience with or knowledge of Continuous Integration & Development and automation tools such as Jenkins, Artifactory, Git, etc.
- Experience with or knowledge of Agile and Test-Driven Development methodology.
- Strong analytical skills with excellent problem-solving ability.
Posted 1 month ago
5.0 - 10.0 years
11 - 15 Lacs
Ahmedabad
Work from Office
Project Role: Business Process Architect
Project Role Description: Design business processes, including characteristics and key performance indicators (KPIs), to meet process and functional requirements. Work closely with the Application Architect to create the process blueprint and establish business process requirements to drive out application requirements and metrics. Assist in quality management reviews, ensure all business and design requirements are met. Educate stakeholders to ensure a complete understanding of the designs.
Must have skills: Data Analytics, Data Warehouse ETL Testing, Big Data Analysis Tools and Techniques, Hadoop Administration
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: Specific undergraduate qualifications, i.e. engineering or computer science
Summary: Experienced Data Engineer with a strong background in Azure data services and broadcast supply chain ecosystems. Skilled in OTT streaming protocols, cloud technologies, and project management.
Roles & Responsibilities:
- Proven experience as a Data Engineer or in a similar role.
- Lead and provide expert guidance to the Principal - Solutions & Integration.
- Track and report on project progress using internal applications.
- Transition customer requirements to on-air operations with proper documentation.
- Scope projects and ensure adherence to budgets and timelines.
- Generate design and integration documentation.
Professional & Technical Skills:
- Strong proficiency in Azure data services (Azure Data Factory, Azure Databricks, Azure SQL Database).
- Experience with SQL, Python, and big data tools (Hadoop, Spark, Kafka).
- Familiarity with data warehousing, ETL techniques, and microservices in a cloud environment.
- Knowledge of broadcast supply chain ecosystems (BMS, RMS, MAM, Playout, MCR/PCR, NLE, Traffic).
- Experience with OTT streaming protocols, DRM, and content delivery networks.
- Working knowledge of cloud technologies (Azure, Docker, Kubernetes, AWS basics, GCP basics).
- Basic understanding of AWS Media Services (MediaConnect, Elemental, MediaLive, MediaStore, Media2Cloud, S3, Glacier).
Networking: Apply basic networking knowledge including TCP/IP, UDP/IP, IGMP, DHCP, DNS, and LAN/WAN technologies to support video delivery systems.
Additional Information:
- Minimum of 5 years' experience in Data Analytics disciplines.
- Good presentation and documentation skills.
- Excellent interpersonal skills.
- Undergraduate qualifications in engineering or computer science.
Highly Desirable: Experience in defining technical solutions with over 99.999% reliability.
Qualification: Specific undergraduate qualifications, i.e. engineering or computer science
Posted 2 months ago
8.0 - 12.0 years
14 - 15 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.
We are currently seeking an experienced professional to join our team in the role of Marketing Title. In this role, you will:
- Bring 8-12 years of experience in Hadoop administration, with working experience in Python, Ansible and DevOps methodologies.
- As Hadoop Admin, be responsible for building different kinds of solutions on the Big Data platform (CDP and ODP environments).
- Build Python and Ansible automations and make DevSecOps contributions.
- Use Agile/DevOps methodology to deliver quality software.
- Guide team members in arriving at and delivering the right solutions.
- Monitor and improve the performance of Hadoop platforms.
- Bring CDP/ODP migration and upgrade experience.
Requirements
To be successful in this role, you should meet the following requirements:
- Big data ecosystem and Hadoop administration knowledge; also knowledgeable about Active Directory and Centrify.
- Working knowledge of Python, Ansible and CI/CD tools.
- Coordinating with vendors and business teams during environment outages.
- Development/coding experience (Java, Python, Groovy, shell scripting).
- Comfortable dealing with frequent testing and incremental releases.
- Understanding of Ops challenges and how they can be addressed during design and development.
- Soft skills for better collaboration across the team.
Posted 2 months ago
16.0 - 21.0 years
4 - 8 Lacs
Kolkata
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP HANA DB Administration, PostgreSQL Administration, Hadoop Administration, Ansible on Microsoft Azure
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 16 years full time education
Cloud Database Engineer HANA
Required Skills: SAP HANA Database Administration - knowledge of clustering, replication, and load balancing techniques to ensure database availability and reliability; proficiency in monitoring and maintaining the health and performance of high-availability systems; experience with public cloud platforms such as GCP, AWS, or Azure; strong troubleshooting skills and the ability to provide effective resolutions for technical issues.
Desired Skills: Understanding of Cassandra, Ansible, Terraform, Kafka, Redis, Hadoop, or Postgres; growth and product mindset and a strong focus on automation; working knowledge of Kubernetes for container orchestration and scalability.
Activities: Collaborate closely with cross-functional teams to gather requirements and support SAP teams in executing database initiatives. Automate the provisioning and configuration of cloud infrastructure, ensuring efficient and reliable deployments. Provide operational support to monitor database performance, implement changes, and apply new patches and versions when required and previously agreed. Act as the point of contact for escalated technical issues with our Engineering colleagues, demonstrating deep troubleshooting skills to provide effective resolutions to unblock our partners.
Requirements: Bachelor's degree in computer science, engineering, or a related field. Proven experience in planning, deploying, supporting, and optimizing highly scalable and resilient SAP HANA database systems. Ability to collaborate effectively with cross-functional teams to gather requirements and convert them into measurable scopes. Strong troubleshooting skills and the ability to provide effective resolutions for technical issues. Familiarity with public cloud platforms such as GCP, AWS, or Azure. Understands Agile principles and methodologies.
Qualification: 16 years full time education
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to resolve, maintain and manage clients' software/hardware/network based on the service requests raised from the end-user, as per the defined SLAs, ensuring client satisfaction.
Do
- Ensure timely response to all the tickets raised by the client end user.
- Solution service requests while maintaining quality parameters.
- Act as a custodian of the client's network/server/system/storage/platform/infrastructure and other equipment to keep track of their proper functioning and upkeep.
- Keep a check on the number of tickets raised (dial home/email/chat/IMS), ensuring the right solutioning as per the defined resolution timeframe.
- Perform root cause analysis of the tickets raised and create an action plan to resolve the problem, to ensure client satisfaction.
- Provide acceptance and immediate resolution to high-priority tickets/service requests.
- Install and configure software/hardware requirements based on service requests.
- 100% adherence to timeliness as per the priority of each issue, to manage client expectations and ensure zero escalations.
- Provide application/user access as per client requirements and requests to ensure timely solutioning.
- Track all the tickets from acceptance to resolution stage, as per the resolution time defined by the customer.
- Maintain timely backup of important data/logs and management resources to ensure the solution is of acceptable quality to maintain client satisfaction.
- Coordinate with the on-site team for complex problem resolution and ensure timely client servicing.
- Review the logs which chat BOTs gather and ensure all the service requests/issues are resolved in a timely manner.
Deliver
No. | Performance Parameter | Measure
1 | 100% adherence to SLA/timelines | Multiple cases of red time; zero customer escalation; client appreciation emails
Mandatory Skills: Hadoop Admin.
Posted 2 months ago
7.0 - 12.0 years
9 - 14 Lacs
Hyderabad
Work from Office
We are seeking a skilled Hadoop/Cloudera Administrator to provide technical support for data integration and visualization platforms. The ideal candidate will also have exposure to Snowflake and AWS administration.
- Provide technical support to customers and internal teams for data integration and visualization platforms, primarily focused on Hadoop/Cloudera administration. Additional knowledge/experience in Snowflake and AWS administration is a plus.
- Investigate and troubleshoot software and system issues reported by users; perform root cause analysis and implement long-term solutions.
- Collaborate closely with development and QA teams to test and validate fixes and system enhancements.
- Debug application-level issues and provide effective resolutions or temporary workarounds as needed.
- Create and maintain comprehensive documentation for support processes, known issues, and resolution procedures.
- Maintain and update Standard Operating Procedures (SOPs) and the Known Error Database (KEDB) with accurate and actionable information.
- Participate in problem management by identifying patterns in recurring incidents and driving root cause analysis and permanent fixes.
- Participate in on-call rotations to support critical production systems outside of standard business hours.
- Proactively monitor system performance and identify opportunities to enhance platform reliability, scalability, and user experience.
Skills: Hadoop Administration, AWS, Cloudera (Hadoop).
Posted 2 months ago
4.0 - 9.0 years
5 - 8 Lacs
Gurugram
Work from Office
RARR Technologies is looking for a HADOOP ADMIN to join our dynamic team and embark on a rewarding career journey. The role is responsible for managing day-to-day administrative tasks and provides support to employees, customers, and visitors.
Responsibilities:
1. Manage incoming and outgoing mail, packages, and deliveries.
2. Maintain office supplies and equipment, and ensure that they are in good working order.
3. Coordinate scheduling and meetings, and make arrangements for travel and accommodations as needed.
4. Greet and assist visitors, and answer and direct phone calls as needed.
Requirements:
1. Experience in an administrative support role, with a track record of delivering high-quality work.
2. Excellent organizational and time-management skills.
3. Strong communication and interpersonal skills, with the ability to interact effectively with employees, customers, and visitors.
4. Proficiency with Microsoft Office and other common office software, including email and calendar applications.
Posted 2 months ago
8.0 - 13.0 years
22 - 37 Lacs
Pune
Hybrid
Role & responsibilities Role - Hadoop Admin + Automation Experience 8+ yrs Grade AVP Location - Pune Mandatory Skills : Hadoop Admin, Automation (Shell scripting/ any programming language Java/Python), Cloudera / AWS/Azure/GCP Good to have : DevOps tools Primary focus will be on candidates with Hadoop admin & Automation experience,
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: Unix Shell Scripting, Hadoop Administration, PySpark
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Develop and implement efficient and scalable application solutions.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide constructive feedback to team members.
- Stay updated on industry trends and best practices to enhance application development processes.
- Assist in troubleshooting and resolving application-related issues.
Professional & Technical Skills:
- Must have skills: Proficiency in Ab Initio.
- Good to have skills: Experience with Unix Shell Scripting, Hadoop Administration, and PySpark.
- Strong understanding of ETL processes and data integration.
- Experience in developing and optimizing data pipelines.
- Knowledge of data warehousing concepts and methodologies.
- Familiarity with database technologies and SQL queries.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 2 months ago
3.0 - 8.0 years
3 - 8 Lacs
Noida
Work from Office
We are hiring for the position "Hadoop Admin".
Skill Set: Hadoop, Cloudera, Big Data, Spark, Hive, HDFS, YARN, Kafka, SQL Database, Ranger
Experience: 3 years
Location: Noida, Sector-135
Work Mode: Work from Office
Budget: 8 LPA
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Looking for a Hadoop Administrator to manage, monitor, and optimize Hadoop clusters. Responsibilities include deployment, upgrades, performance tuning, and security. Requires 3+ years of experience with Hadoop ecosystem tools and Linux systems.
Required Candidate Profile: Notice period of immediate to 30 days maximum.
Posted 3 months ago