Posted: 2 weeks ago
Hybrid | Full Time
As a Senior Data Engineer and database specialist you will design, create and manage the cloud databases and data pipelines that underpin our decoupled cloud architecture and API-first approach. You have proven expertise in database design, data ingestion, transformation, data writing, scheduling and query management within a cloud environment.

You will have proven experience and expertise in working with AWS Cloud Infrastructure Engineers, Software/API Developers and Architects to design, develop, deploy and operate data services and solutions that underpin a cloud ecosystem. You will take ownership and accountability of functional and non-functional design and work within a team of Engineers to create innovative solutions that unlock value and modernise technology designs.

You will role-model a continuous improvement mindset in the team, and in your project interactions, by taking technical ownership of key assets, including roadmaps and the technical direction of data services running on our AWS environments.

See yourself in our team

The Business Banking Technology Domain works in an Agile methodology with our Business Banking business to plan, prioritise and deliver on high-value technology objectives with key results that meet our regulatory obligations and protect the community.
You will work within the VRM Crew, which is working on initiatives such as a Gen AI-based cash flow coach, to provide relevant data to our regulators.

To achieve our objectives, you will use your deep understanding of data modelling and data quality, and your extensive experience with SQL to access relational databases such as Oracle and Postgres, to identify, transform and validate data required for complex business reporting requirements.

You will use your experience in designing and building reliable and efficient data pipelines, preferably using modern cloud services on AWS such as S3, Lambda, Redshift and Glue, to process large volumes of data efficiently.

Experience with data-centric frameworks such as Spark, with programming knowledge in Scala or Python, is highly advantageous. So is experience working on Linux with shell and automation frameworks to manage code and infrastructure in a well-structured and reliable manner. Experience with Pega workflow software as a source or target for data integration is also highly regarded.
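As a flavour of the SQL-driven validation work described above, here is a minimal sketch of running data-quality checks against a staging table before data is loaded onward. The table, columns and checks are all hypothetical, and Python's built-in sqlite3 stands in for Oracle or Postgres so the example is self-contained; it is an illustration of the general pattern, not CommBank tooling.

```python
import sqlite3

def validate_staging(conn: sqlite3.Connection) -> dict:
    """Run simple SQL data-quality checks and return a count of violations per check."""
    checks = {
        # Rows missing a mandatory business key.
        "null_account_id": "SELECT COUNT(*) FROM staging WHERE account_id IS NULL",
        # Surplus rows sharing a business key that should be unique.
        "duplicate_account_id": """
            SELECT COALESCE(SUM(n - 1), 0)
            FROM (SELECT COUNT(*) AS n FROM staging GROUP BY account_id)
            WHERE n > 1
        """,
        # Values outside a plausible business range.
        "negative_balance": "SELECT COUNT(*) FROM staging WHERE balance < 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    # In-memory stand-in for a real staging database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (account_id TEXT, balance REAL)")
    conn.executemany(
        "INSERT INTO staging VALUES (?, ?)",
        [("A1", 100.0), ("A1", 50.0), (None, 10.0), ("A2", -5.0)],
    )
    print(validate_staging(conn))
    # → {'null_account_id': 1, 'duplicate_account_id': 1, 'negative_balance': 1}
```

In a real pipeline the same checks would run against the source database via its own driver, with non-zero counts failing the load or raising an alert before bad rows reach reporting.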
We're interested in hearing from people who:

• Can design and implement databases for data integration in the enterprise
• Can performance-tune applications from a database code and design perspective
• Can automate data ingestion and transformation processes using scheduling tools, and monitor and troubleshoot data pipelines to ensure reliability and performance
• Have experience weighing performance and scaling approaches, such as horizontal scaling designs versus database tuning
• Can design application logical database requirements and implement physical solutions
• Can collaborate with business and technical teams to design and build critical databases and data pipelines
• Can advise business owners on strategic database direction and application solution design

Tech skills
We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but having significant experience and exposure with some of these (or equivalents) will set you up for success in this team.

• AWS data products such as AWS Glue and AWS EMR
• Oracle and AWS Aurora RDS databases such as PostgreSQL
• AWS S3 ingestion, transformation and writing to databases
• Proficiency in programming languages like Python, Scala or Java for developing data ingestion and transformation scripts
• Strong knowledge of SQL for writing, optimizing, and debugging queries
• Familiarity with database design, indexing, and normalization principles
• Understanding of data formats (JSON, CSV, XML) and techniques for converting between them, and the ability to handle data validation, cleaning, and transformation
• Proficiency in automation tools and scripting (e.g., bash scripting, cron jobs) for scheduling and monitoring data processes
• Experience with version control systems (e.g., Git) for managing code and collaboration

Working with us

Whether you're passionate about customer service, driven by data, or called by creativity, a career with CommBank is for you.

Our people bring their diverse backgrounds and unique perspectives to build a respectful, inclusive, and flexible workplace with flexible work locations. One where we're driven by our values, and supported to share ideas, initiatives, and energy. One where making a positive impact for customers, communities and each other is part of our every day.

Here, you'll thrive. You'll be supported when faced with challenges and empowered to tackle new opportunities. We're hiring engineers from across all of Australia and have opened technology hubs in Melbourne and Perth. We really love working here, and we think you will too.

We support our people with the flexibility to balance where work is done with at least half their time each month connecting in office. We also have many other flexible working options available, including changing start and finish times, part-time arrangements and job share, to name a few. Talk to us about how these arrangements might work in the role you're interested in.

If this sounds like the role for you then we would love to hear from you. Apply today!