Posted: 1 day ago
On-site
Full Time
This data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines, and constructing data stores (SQL and NoSQL), big data tooling (Hadoop, Kafka), and integration tools that connect sources and other databases.
Translate functional specifications and change requests into technical specifications
Translate business requirement documents, functional specifications, and technical specifications into code
Develop efficient code with unit testing and code documentation
Ensure the accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
Set up the development environment and configure development tools
Communicate project status to all project stakeholders
Manage, monitor, and ensure the security and privacy of data to satisfy business needs
Contribute to the automation of modules wherever required
Be proficient in written, verbal, and presentation communication (English)
Coordinate with the UAT team
Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.; illustrated in the SQL sketch after this list)
Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
Knowledgeable in Shell / PowerShell scripting
Knowledgeable in relational databases, non-relational databases, data streams, and file stores
Knowledgeable in performance tuning and optimization
Experience in data profiling and data validation
Experience in requirements gathering, documentation processes, and unit testing
Understanding and implementing QA and related testing processes in the project
Knowledge of any BI tool is an added advantage
Sound aptitude, outstanding logical reasoning, and analytical skills
Willingness to learn and take initiatives
Ability to adapt to fast-paced Agile environment
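For illustration only, a minimal SQL sketch of two of the concepts named above: an analytical (window) function and a Type 2 slowly changing dimension update. The tables orders, dim_customer, and stg_customer are hypothetical and not part of this posting.

-- Analytical (window) function: running spend per customer over a hypothetical orders table
SELECT
    customer_id,
    order_date,
    order_amount,
    SUM(order_amount) OVER (
        PARTITION BY customer_id
        ORDER BY order_date
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
    ) AS running_spend
FROM orders;

-- Type 2 slowly changing dimension: close out the current row when a tracked attribute changes
-- (dim_customer is the assumed dimension table, stg_customer the assumed staging table)
UPDATE dim_customer d
SET    is_current = FALSE,
       valid_to   = CURRENT_DATE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  (d.email <> s.email OR d.city <> s.city);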
• Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring that data from diverse sources is transformed and loaded effectively into the data warehouse or data lake (a minimal model sketch follows this list).
• Implement and manage data models in DBT, ensuring accurate data transformation and alignment with business needs.
• Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
• Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
• Establish DBT best practices to improve performance, scalability, and reliability.
• Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
• Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
• Migrate legacy transformation code into modular DBT data models.
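A minimal sketch of the kind of modular, incremental DBT model this work involves, assuming Snowflake as the warehouse; the model and column names (stg_orders, fct_orders, order_ts, amount_usd) are hypothetical and not taken from the posting.

-- models/marts/fct_orders.sql  (hypothetical model)
-- Incremental DBT model: builds an orders fact table on Snowflake from a staging model.
{{ config(materialized='incremental', unique_key='order_id') }}

with stg_orders as (
    -- ref() records the dependency on the upstream staging model
    select * from {{ ref('stg_orders') }}
)

select
    order_id,
    customer_id,
    order_ts,
    cast(order_ts as date) as order_date,
    amount_usd
from stg_orders

{% if is_incremental() %}
-- on incremental runs, process only rows newer than what is already in the target table
where order_ts > (select max(order_ts) from {{ this }})
{% endif %}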
ResourceTree Global Services Pvt Ltd
Chennai, Tamil Nadu, India
Experience: Not specified
Salary: Not disclosed