Posted: 1 day ago
Work from Office
Full Time
Job Description:

Skill/Tech Stack: Data Engineer
Location: Bangalore
Experience: 5 to 8 years
Work mode: Hybrid; work from the office three days a week

Job Overview:

The ideal candidate will:
- Work with the team to define high-level technical requirements and architecture for the back-end services, data components, and data monetization components
- Develop new application features and enhance existing ones
- Develop relevant documentation and diagrams
- Work with other teams on deployment, testing, training, and production support
- Integrate with Data Engineering teams
- Ensure that development, coding, privacy, and security standards are adhered to
- Write clean, quality code
- Be ready to work on new technologies as the business demands
- Bring strong communication skills and a strong work ethic

Core/must-have skills:
- A minimum of 5+ years of professional experience in Python development, with a focus on data-intensive applications
- Proven experience with Apache Spark and PySpark for large-scale data processing
- Solid understanding of SQL, experience with relational databases (e.g., Oracle) and Spark SQL, and query optimization
- Experience with the SDLC, particularly in applying software development best practices and methodologies
- Experience creating and maintaining unit tests, integration tests, and performance tests for data pipelines and systems
- Experience with the Databricks big data platform
- Experience building data-intensive applications and data products, with a good understanding of data pipelines (feature data engineering, data transformation, data lineage, data quality); see the sketch after this posting
- Experience with cloud platforms such as AWS for data infrastructure and services is preferred
- This is a hands-on developer position within a small, elite development team that moves very fast; the role will evolve into tech leadership for the Data Initiative

Good-to-have skills:
- Knowledge of the FX business / capital markets domain is a plus
- Knowledge of data formats such as Avro and Parquet, and experience working with complex data types
- Experience with Apache Kafka for real-time data streaming and Kafka Streams for processing data streams
- Experience with Airflow for orchestrating complex data workflows and pipelines
- Expertise or interest in Linux
- Exposure to data governance and security best practices in data management
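For context only, here is a minimal sketch of the kind of PySpark work the core skills above point to: a small, testable transformation with a basic data quality filter. It is illustrative, not part of the role or any existing codebase; the function and column names (add_notional_bucket, trades, notional_usd, notional_bucket) are hypothetical.

# Minimal illustrative PySpark sketch: a testable transformation with a
# basic data quality check. All names here are hypothetical examples.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def add_notional_bucket(trades: DataFrame) -> DataFrame:
    """Drop rows failing a simple quality rule, then derive a feature column."""
    return (
        trades
        # Data quality: keep only rows with a positive, non-null notional.
        .filter(F.col("notional_usd").isNotNull() & (F.col("notional_usd") > 0))
        # Data transformation: derive a simple categorical feature.
        .withColumn(
            "notional_bucket",
            F.when(F.col("notional_usd") >= 1_000_000, "large").otherwise("small"),
        )
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("example").getOrCreate()
    df = spark.createDataFrame(
        [("t1", 2_500_000.0), ("t2", 50_000.0), ("t3", None)],
        ["trade_id", "notional_usd"],
    )
    add_notional_bucket(df).show()
    spark.stop()

Because the function takes and returns a DataFrame, it can be covered by the unit and integration tests the posting asks for, without depending on any particular table or cluster.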
Anlage Infotech