Job Description
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

REQUIREMENTS:
- Total experience of 3+ years.
- Hands-on working experience in big data technologies, particularly in the context of Azure Databricks, including Apache Spark for distributed data processing.
- In-depth knowledge of Azure Databricks for data engineering tasks, including data transformations, ETL processes, and job scheduling.
- Strong SQL skills for data manipulation and retrieval, along with the ability to optimize queries for performance in Snowflake.
- Expertise in designing and implementing ETL processes to move and transform data between systems, utilizing tools and frameworks available in Azure Databricks.
- Experience integrating diverse data sources into a cohesive and usable format, ensuring data quality and integrity.
- Knowledge of programming languages such as Python and PySpark for scripting and extending the functionality of Azure Databricks notebooks.
- Familiarity with version control systems, such as Git, for managing code and configurations in a collaborative environment.
- Ability to monitor data pipelines, identify bottlenecks, and optimize performance in Azure Data Factory.
- Experience with AWS or Azure, specifically in building data pipelines.
- Experience with Azure Data Factory and AWS Glue for orchestrating data pipelines.
- Competency with relational databases such as PostgreSQL and MySQL.
- Knowledge of BI tools such as Tableau and Power BI.
- Proficiency with Git, including branching, merging, and pull requests.
- Experience implementing continuous integration and delivery for data workflows using tools like Azure DevOps.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the client's business use cases and technical requirements, and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions to requirements and translating them for developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for non-functional requirement (NFR) considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on aspects such as extensibility, scalability, security, design patterns, user experience, and NFRs, and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code reviews through exhaustive, systematic analysis of the root cause, and being able to justify the decisions taken.
- Carrying out POCs to make sure the suggested design and technologies meet the requirements.