Big Data/Hadoop Administrator


Posted: 2 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

At ClearTrail, work is more than ‘just a job’. Our calling is to develop solutions that empower those dedicated to keeping their people, places and communities safe. For over 23 years, law enforcement & federal agencies across the globe have trusted ClearTrail as their committed partner in safeguarding nations & enriching lives. We are envisioning the future of intelligence gathering by developing artificial intelligence and machine learning-based lawful interception & communication analytics solutions that solve the world’s most challenging problems.


Role – Big Data/Hadoop Administrator

Location – Indore, MP

Experience Required – 3 to 5 Years


What is your Role?

You will work in a multi-functional role combining expertise in System and Hadoop administration. You will work in a team that regularly interacts with customers on technical support for the deployed system. You will be deputed at customer premises to assist customers with issues related to System and Hadoop administration, and you will interact with the QA and Engineering teams to coordinate issue resolution within the SLA promised to the customer.


What will you do?

  • Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
  • Installing the Linux operating system and configuring networking.
  • Writing Unix shell and Ansible scripts for automation.
  • Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
  • Managing the day-to-day operation of Hadoop clusters using Ambari, Cloudera Manager, or other monitoring tools, ensuring the clusters are up and running at all times.
  • Maintaining HBase clusters and performing capacity planning.
  • Maintaining the Solr cluster and performing capacity planning.
  • Working closely with the database, network, and application teams to ensure all big data applications are highly available and performing as expected.
  • Managing the KVM virtualization environment.
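
As an illustration of the shell/Ansible automation described above, a routine cluster health check might be sketched as a playbook like the one below. This is a hypothetical example only: the inventory group name `hadoop_nodes`, the systemd service name `zookeeper`, and the exact text of `hdfs dfsadmin -report` output are assumptions and vary by deployment and Hadoop version.

```yaml
# Hypothetical sketch of a routine Hadoop health-check playbook.
# Assumes cluster hosts are grouped under "hadoop_nodes" in the inventory.
- name: Routine Hadoop cluster health check
  hosts: hadoop_nodes
  become: true
  tasks:
    - name: Collect HDFS capacity and DataNode status
      command: hdfs dfsadmin -report
      register: hdfs_report
      changed_when: false   # read-only check, never reports a change

    - name: Fail if the report shows dead DataNodes
      # Output format differs across Hadoop versions; adjust the match string.
      assert:
        that: "'Dead datanodes (0)' in hdfs_report.stdout"
        fail_msg: "One or more DataNodes appear to be down"

    - name: Ensure the ZooKeeper service is running
      service:
        name: zookeeper
        state: started
```

A playbook like this would typically be scheduled via cron or an AWX/Ansible Tower job alongside the Ambari or Cloudera Manager alerting mentioned above, not in place of it.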


Must-Have Skills:

  • Technical domain: Linux administration, Hadoop infrastructure and administration, Solr, configuration management (e.g., Ansible).
  • Linux administration
  • Experience in Python and shell scripting
  • Deploying and administering the Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems
  • Knowledge of Hadoop core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, and Spark
  • Knowledge of HBase clusters
  • Working knowledge of Solr and Elasticsearch


Good-to-Have Skills:

  • Experience with networking concepts
  • Experience with virtualization technologies (KVM, OLVM)
