Impetus Technologies is seeking a skilled Senior Engineer with expertise in Java and Big Data technologies. As a Senior Engineer, you will be responsible for designing, developing, and deploying scalable data processing applications using Java and Big Data frameworks. Your role will involve collaborating with cross-functional teams to gather requirements, developing high-quality code, and optimizing data processing workflows. You will also mentor junior engineers and contribute to architectural decisions to enhance the performance and scalability of our systems.
Key Responsibilities:
- Design, develop, and maintain high-performance applications using Java and Big Data technologies.
- Implement data ingestion and processing workflows using frameworks such as Hadoop and Spark.
- Collaborate with the data architecture team to define data models and ensure efficient data storage and retrieval.
- Optimize existing applications for performance, scalability, and reliability.
- Mentor and guide junior engineers, providing technical leadership and fostering a culture of continuous improvement.
- Participate in code reviews and ensure best practices for coding, testing, and documentation are followed.
- Stay current with technology trends in Java and Big Data, and evaluate new tools and methodologies to enhance system capabilities.
Skills and Tools Required:
- Strong proficiency in Java, with experience building complex applications.
- Hands-on experience with Big Data technologies such as Apache Hadoop, Apache Spark, and Apache Kafka.
- Understanding of distributed computing concepts and technologies.
- Experience with data processing frameworks and libraries, including MapReduce and Spark SQL.
- Familiarity with storage systems such as HDFS, NoSQL databases (such as Cassandra or MongoDB), and SQL databases.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Knowledge of version control systems such as Git, and familiarity with CI/CD pipelines.
- Excellent communication and teamwork skills to collaborate effectively with peers and stakeholders.
- A bachelor's or master's degree in Computer Science, Engineering, or a related field is preferred.
About the Role:
- You will be responsible for designing and developing scalable Java applications to handle Big Data processing.
- Your role will involve collaborating with cross-functional teams to implement innovative solutions that align with business objectives.
- You will also play a key role in ensuring code quality and performance through best practices and testing methodologies.
About the Team:
- You will work with a diverse team of skilled engineers, data scientists, and product managers who are passionate about technology and innovation.
- The team fosters a collaborative environment where knowledge sharing and continuous learning are encouraged.
- Regular brainstorming sessions and technical workshops will provide opportunities to enhance your skills and stay updated with industry trends.
You Are Responsible For:
- Developing and maintaining high-performance Java applications that process large volumes of data efficiently.
- Implementing data integration and processing frameworks using Big Data technologies such as Hadoop and Spark.
- Troubleshooting and optimizing existing systems to improve performance and scalability.
To succeed in this role, you should have the following:
- Strong proficiency in Java and experience with Big Data technologies and frameworks.
- Solid understanding of data structures, algorithms, and software design principles.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Familiarity with cloud platforms and distributed computing concepts is a plus.