Posted: 5 days ago
Work from Office
Full Time
You'll be a senior contributor in our Data Engineering team, working across various projects to spot patterns in how we build our Snowflake Data Warehouse. You'll help us minimise our cloud costs, drive best practices across all our data disciplines, and scale and automate our data governance.

Enable VXBS (Visa Cross-Border Solutions) to Make Better Decisions, Faster

Working in a multi-disciplinary data engineering team, you will:
- Support the building of robust data models downstream of backend services (mostly in Snowflake) that support internal reporting, financial, and regulatory use cases.
- Focus on optimisation of our Data Warehouse, spotting opportunities to reduce complexity and cost.
- Help define and manage best practices for our Data Warehouse. This may include payload design of source data, logical data modelling, implementation, metadata, and testing standards.
- Set standards and ways of working with data across VXBS, working collaboratively with others to make it happen.
- Take established best practices and standards defined by the team and apply them within other areas of the business.
- Investigate and work effectively with colleagues from other disciplines to monitor and improve data quality within the warehouse.
- Contribute to prioritisation of data governance issues.

This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.

Basic Qualifications:
- 8+ years of relevant work experience with a Bachelor's Degree, or at least 5 years of experience with an Advanced Degree (e.g. Masters, MBA, JD, MD), or 2 years of work experience with a PhD, or 11+ years of relevant work experience.

You Should Apply If:
- You have experience with, and a passion for, data modelling, ETL projects, and Big Data as a developer or engineer.
- You have good experience in Python, Java, or similar languages.
- You have proven experience with AWS.
- SQL and data modelling are second nature to you.
- You are comfortable with general data warehousing concepts.
- You strive for improvement in your work and that of others, proactively identifying issues and opportunities.
- You have experience building robust and reliable data sets requiring a high level of control.
- You have proven experience with streaming technologies such as Kafka, Kinesis, or Pulsar.

Preferred Qualifications:
- 9 or more years of relevant work experience with a Bachelor's Degree, or 7 or more years of relevant experience with an Advanced Degree (e.g. Masters, MBA, JD, MD), or 3 or more years of experience with a PhD.
- Any experience working within a finance function, or knowledge of accounting.
- Experience working in a highly regulated environment (e.g. finance, gaming, food, health care).
- Knowledge of regulatory reporting and treasury operations in retail banking.
- Have previously used dbt, Databricks, or similar tooling.
- Experience working with IaC tools such as Terraform, AWS CloudFormation, or Ansible.
- Experience working with orchestration frameworks such as Airflow or Prefect.
- Design and implementation knowledge of stream processing frameworks such as Flink or Spark Streaming.
- Used to Agile ways of working (Kanban, Scrum).
Visa