
Posted: 1 month ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

Job Summary:

We are looking for an experienced Big Data Administrator with strong Linux and AWS infrastructure experience to join our growing data team. The ideal candidate will have deep knowledge of Big Data platforms, hands-on experience managing large clusters in production, and a solid foundation in scripting and automation. You will play a crucial role in maintaining, optimizing, and scaling our data infrastructure to meet business needs.

Must-Have Skills:

- Strong understanding of Linux OS, networking, and security fundamentals.
- Proven experience with the AWS Cloud Platform: infrastructure, services, and architecture.
- Expertise with Infrastructure as Code (IaC) tools such as Terraform or Ansible.
- Hands-on experience managing large Big Data clusters (at least one of Cloudera, Hortonworks, or EMR).
- Strong experience in observability for Big Data platforms using tools such as Prometheus, InfluxDB, Dynatrace, Grafana, or Splunk.
- Expert-level understanding of the Hadoop Distributed File System (HDFS) and Hadoop YARN.
- Familiarity with Hadoop file formats: ORC, Parquet, Avro, etc.
- Deep knowledge of compute engines such as Hive (Tez, LLAP), Presto, and Apache Spark.
- Ability to interpret query plans and optimize performance for complex SQL queries on Hive and Spark.
- Experience supporting Spark with Python (PySpark) and R (sparklyr, SparkR).
- Strong scripting skills in at least one of shell scripting or Python.
- Experience collaborating with Data Analysts and Data Scientists and supporting tools such as SAS, RStudio, JupyterHub, and H2O.
- Ability to read and understand code written in Java, Python, R, and Scala.

Nice-to-Have Skills:

- Experience with workflow orchestration tools such as Apache Airflow or Oozie.
- Familiarity with analytical libraries such as Pandas, NumPy, SciPy, PyTorch, etc.
- Experience with DevOps tools such as Packer, Chef, or Jenkins.
- Knowledge of Active Directory, Windows OS, and VDI platforms such as Citrix or AWS Workspaces.
Key Responsibilities:

- Administer and manage Big Data infrastructure in a hybrid cloud environment.
- Ensure high availability, scalability, and security of Big Data platforms.
- Collaborate with DevOps, Data Engineering, and Data Science teams to support data initiatives.
- Monitor, troubleshoot, and optimize platform performance.
- Automate routine tasks using scripts and IaC tools.
- Provide support and guidance for analytical tools and platforms.
- Participate in capacity planning and architectural reviews.





