Posted: 8 hours ago
Work from Office
Full Time
Education Qualification: BE/B.Tech
Minimum Years of Experience: 7-10 years
Type of Employment: Permanent
Requirement: Immediate or max 15 days

Job Description: Big Data Developer (Hadoop/Spark/Kafka)

- This role is ideal for an experienced Big Data developer who is confident taking complete ownership of the software development life cycle, from requirement gathering to final deployment.
- The candidate will engage with stakeholders to understand use cases, translate them into functional and technical specifications (FSD and TSD), and implement scalable, efficient big data solutions.
- A key part of this role involves working across multiple projects, coordinating with QA/support engineers on test case preparation, and ensuring deliverables meet high quality standards.
- Strong analytical skills are necessary for writing and validating SQL queries, along with developing optimized code for data processing workflows.
- The ideal candidate should also be capable of writing unit tests and maintaining documentation to ensure code quality and maintainability.
- The role requires hands-on experience with the Hadoop ecosystem, particularly Spark (including Spark Streaming), Hive, Kafka, and shell scripting.
- Experience with workflow schedulers such as Airflow is a plus, and working knowledge of cloud platforms (AWS, Azure, GCP) is beneficial.
- Familiarity with Agile methodologies will help in collaborating effectively in a fast-paced team environment.
- Job scheduling and automation via shell scripts, and the ability to optimize performance and resource usage in a distributed system, are critical.
- Prior experience in performance tuning and writing production-grade code will be valued.
- The candidate must demonstrate strong communication skills to coordinate effectively with business users, developers, and testers, and to manage dependencies across teams.
Key Skills Required:

- Must Have: Hadoop, Spark (core and streaming), Hive, Kafka, shell scripting, SQL, TSD/FSD documentation.
- Good to Have: Airflow, Scala, cloud platforms (AWS/Azure/GCP), Agile methodology.

This role is both technically challenging and rewarding, offering the opportunity to work on large-scale, real-time data processing systems in a dynamic, agile environment.
Maimsd Technology