- Hyderabad, India
- Technology
- In-Office
- 11047
Job Description
Job Purpose
The Property Data Engineer is responsible for developing and maintaining data conversion programs that transform raw property assessment data into standardized formats based on specifications from Property Data Analysts and Senior Analysts. This role requires not only advanced programming and ETL skills but also a deep understanding of the structure, nuances, and business context of assessment data. Even with clear and well-documented conversion instructions, engineers without prior exposure to this domain often face significant challenges in interpreting and transforming the data accurately. The Property Data Engineer plays a critical role in ensuring the accuracy, efficiency, and scalability of the data processing pipelines that support Assessor Operations.
Responsibilities
Depending on the specific team and role, the Property Data Engineer may be responsible for some or all of the following tasks:
- Develop and maintain data conversion programs using C#, Python, JavaScript, and SQL.
- Implement ETL workflows using tools such as Pentaho Kettle, SSIS, and internal applications.
- Collaborate with Analysts and Senior Analysts to interpret conversion instructions and translate them into executable code.
- Troubleshoot and resolve issues identified during quality control reviews.
- Recommend and implement automation strategies to improve data processing efficiency.
- Perform quality checks on converted data and ensure alignment with business rules and standards.
- Contribute to the development of internal tools and utilities to support data transformation tasks.
- Maintain documentation for code, workflows, and processes to support team knowledge sharing.
Programming (Skill Level: Advanced to Expert)
- Create and maintain conversion programs in SQL and in Visual Studio using C#, Python, or JavaScript.
- Use JavaScript within Pentaho Kettle workflows and SSIS for data transformation.
- Build and enhance in-house tools to support custom data processing needs.
- Ensure code is modular, maintainable, and aligned with internal development standards.
- Ensure code quality through peer reviews, testing, and adherence to development standards.
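As a sketch of what a conversion program in this role might look like, the Python fragment below renames raw assessor export columns to standardized names and normalizes a numeric field. The column names and mapping are hypothetical; in practice the mapping comes from analyst-written conversion instructions.

```python
import csv
import io

# Hypothetical mapping from a conversion instruction sheet:
# raw assessor export column -> standardized output column.
FIELD_MAP = {
    "PARCEL_NO": "parcel_id",
    "OWNER_NM": "owner_name",
    "ASSD_VAL": "assessed_value",
}

def convert_record(raw: dict) -> dict:
    """Rename fields and normalize types for one assessment record."""
    out = {std: raw.get(src, "").strip() for src, std in FIELD_MAP.items()}
    # Assessed values often arrive as strings with thousands separators;
    # normalize to an integer for downstream use.
    out["assessed_value"] = int(out["assessed_value"].replace(",", "") or 0)
    return out

def convert_file(raw_text: str) -> list:
    """Convert a raw CSV export into a list of standardized records."""
    reader = csv.DictReader(io.StringIO(raw_text))
    return [convert_record(row) for row in reader]
```

Keeping the field mapping in a single data structure, rather than scattered through the code, is one way to keep such programs modular and reviewable as instructions change.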
ETL Execution (Skill Level: Advanced to Expert)
- Execute and troubleshoot ETL processes using Pentaho Kettle, SSIS, and proprietary tools.
- Input parameters, execute jobs, and perform quality checks on output files.
- Troubleshoot ETL failures and optimize performance.
- Recommend and implement automation strategies to improve data processing efficiency and accuracy.
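For Kettle specifically, jobs are commonly executed from the command line via kitchen.sh, with job parameters passed using the -param:NAME=value convention. The sketch below wraps that invocation in Python; the job file name and parameter names are illustrative, not taken from any actual pipeline.

```python
import subprocess

def build_kitchen_command(job_file: str, params: dict, log_level: str = "Basic") -> list:
    """Build a Pentaho Kettle (kitchen.sh) command line for an ETL job.

    Uses Kettle's -file, -level, and -param:NAME=value flags; the job
    file and parameter names passed in are illustrative.
    """
    cmd = ["kitchen.sh", f"-file={job_file}", f"-level={log_level}"]
    # Sort for a deterministic, diff-friendly command line.
    cmd += [f"-param:{name}={value}" for name, value in sorted(params.items())]
    return cmd

def run_etl_job(job_file: str, params: dict) -> bool:
    """Execute the job and report success; Kettle exits 0 on success."""
    result = subprocess.run(build_kitchen_command(job_file, params))
    return result.returncode == 0
```

Wrapping execution this way makes it straightforward to script batch runs and to attach automated checks on exit codes and output files.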
Data File Manipulation (Skill Level: Advanced to Expert)
- Work with a wide variety of file formats (CSV, Excel, TXT, XML, etc.) to prepare data for conversion.
- Apply advanced techniques to clean, merge, and structure data.
- Develop scripts and tools to automate repetitive data preparation tasks.
- Ensure data is optimized for downstream ETL and analytical workflows.
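Two such preparation steps, whitespace normalization and a left-style merge of supplemental records keyed on parcel ID, can be sketched as follows. The key and field names are hypothetical; real layouts come from each source's export specification.

```python
def clean_value(value: str) -> str:
    """Trim a raw field value and collapse internal runs of whitespace."""
    return " ".join(value.split())

def merge_on_parcel(base: list, extra: list) -> list:
    """Left-merge two lists of record dicts on a shared 'parcel_id' key.

    Every base record is kept; matching extra fields are layered on top.
    The 'parcel_id' key name is illustrative.
    """
    extra_by_id = {row["parcel_id"]: row for row in extra}
    merged = []
    for row in base:
        combined = dict(row)
        combined.update(extra_by_id.get(row["parcel_id"], {}))
        merged.append(combined)
    return merged
```

Indexing the supplemental file by key before merging keeps the pass linear, which matters when county extracts run to millions of rows.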
Data Analysis (Skill Level: Supportive – Applied)
- Leverage prior experience in data analysis to independently review and interpret source data when developing or refining conversion programs.
- Analyze data structures, field patterns, and anomalies to improve the accuracy and efficiency of conversion logic.
- Use SQL queries, Excel tools, and internal utilities to validate assumptions and enhance the clarity of analyst-provided instructions.
- Collaborate with Analysts and Senior Analysts to clarify ambiguous requirements and suggest improvements based on technical feasibility and data behavior.
- Conduct targeted research using public data sources (e.g., assessor websites) to resolve data inconsistencies or fill in missing context during development.
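As one way to validate assumptions about source data with SQL, the sketch below profiles a column for empty and distinct values, using an in-memory SQLite table as a stand-in for whatever staging database is actually in use. The table schema and column names are illustrative.

```python
import sqlite3

def profile_column(rows: list, column: str) -> dict:
    """Profile one column of staged data: row count, empties, distinct values.

    'rows' is a list of (parcel_id, land_use) tuples; the schema is a
    hypothetical example. 'column' is interpolated directly into the SQL,
    so it must come from trusted code, never from external input.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (parcel_id TEXT, land_use TEXT)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
    total, empties, distinct = conn.execute(
        f"SELECT COUNT(*), "
        f"SUM(CASE WHEN {column} IS NULL OR {column} = '' THEN 1 ELSE 0 END), "
        f"COUNT(DISTINCT {column}) FROM staging"
    ).fetchone()
    conn.close()
    return {"total": total, "empty": empties, "distinct": distinct}
```

A quick profile like this often surfaces anomalies (unexpected codes, blank runs) worth raising with the analyst before conversion logic is finalized.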
Quality Control (Skill Level: Engineer-Level)
- Perform initial quality control on converted data outputs before formal review by Associates, Analysts, or Senior Analysts.
- Validate that the program output aligns with conversion instructions and meets formatting and structural expectations.
- Use standard scripts, ad-hoc SQL queries, and internal tools to identify and correct discrepancies in the data.
- Address issues identified during downstream QC reviews by updating conversion logic or collaborating with analysts to refine requirements.
- Ensure that all deliverables meet internal quality standards prior to release or further review.
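An initial QC pass of this kind can be sketched as a set of per-field format rules applied to every converted record. The rules below are hypothetical examples for illustration, not the actual internal standards.

```python
import re

# Illustrative QC rules: each required output field maps to a regex its
# value is expected to match before the record goes to formal review.
QC_RULES = {
    "parcel_id": re.compile(r"^\d{3}-\d{2}$"),
    "assessed_value": re.compile(r"^\d+$"),
}

def qc_record(record: dict) -> list:
    """Return a list of QC issue descriptions for one converted record."""
    issues = []
    for field, pattern in QC_RULES.items():
        value = str(record.get(field, ""))
        if not value:
            issues.append(f"{field}: missing")
        elif not pattern.match(value):
            issues.append(f"{field}: bad format '{value}'")
    return issues

def qc_batch(records: list) -> dict:
    """Map record index -> issues for every record that fails a rule."""
    report = {}
    for i, record in enumerate(records):
        issues = qc_record(record)
        if issues:
            report[i] = issues
    return report
```

Running such a rule set over every batch catches formatting regressions mechanically, so downstream reviewers can focus on interpretation questions rather than structural defects.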
Knowledge and Experience
- Minimum Education: Bachelor’s degree in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related technical field; or equivalent practical experience in software development or data engineering.
- Preferred Education: Bachelor’s degree (as above) plus additional coursework or certifications in:
  - Data Engineering
  - ETL Development
  - Cloud Data Platforms (e.g., AWS, Azure, GCP)
  - SQL and Database Management
  - Programming (C#, Python, JavaScript)
- 4+ years of experience in software development, data engineering, or ETL pipeline development.
- Expert-level proficiency in SQL and in C#, Python, and JavaScript (e.g., within Visual Studio).
- Experience with ETL tools such as Pentaho Kettle, SSIS, or similar platforms.
- Strong understanding of data structures, file formats (CSV, Excel, TXT, XML), and data transformation techniques.
- Familiarity with relational databases and SQL for data querying and validation.
- Ability to read and interpret technical documentation and conversion instructions.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.
- Familiarity with property assessment, GIS, tax, or public property records data.
Preferred Skills
- Experience developing and maintaining data conversion programs in Visual Studio.
- Experience with property assessment, GIS, tax, or public records data.
- Experience building internal tools or utilities to support data transformation workflows.
- Knowledge of version control (e.g., Git), issue tracking (e.g., Jira), and agile development practices.
- Exposure to cloud-based data platforms or services (e.g., Azure Data Factory, AWS Glue).
- Ability to troubleshoot and optimize ETL performance and data quality.
- Strong written and verbal communication skills for cross-functional collaboration.