Description:
This is a Data Platform Engineer role with one of the leading companies in AU right now -- Iress -- with an amazing team. They are continuing to grow rapidly. This is the chance to join right as the company takes off.
## More About the Role at Iress
As a Data Platform Engineer within our Data Foundations team, your primary responsibility will be to design, develop, and maintain robust data infrastructure and architecture to support teams across Iress that produce and consume data and build Data Products. You will play a crucial role in ensuring the availability, integrity, and efficiency of data for informed decision-making. The role requires a deep understanding of data systems, ETL processes, and collaboration with cross-functional teams to deliver actionable insights.

## Some of the awesome things you’ll be involved with:
- Help manage and maintain the organisation's Databricks/AWS data platform to ensure its stability, reliability, and scalability.
- Work closely with the Data Product Team to identify their technology needs and provide the right solutions, including selecting appropriate data storage and processing technologies and patterns that align with product requirements.
- Contribute to best practices for data governance and maintain high standards of data quality and integrity across the platform.
- Continuously monitor and optimise data processes to ensure high performance and scalability, accommodating growing data volumes and user demands.

## What you will bring:
- High proficiency in best-practice use and operation of Databricks, and in designing reusable solutions with engineering teams.
- High proficiency in using and maintaining AWS services as infrastructure as code that is reusable and extensible.
- Strong communication skills with the ability to collaborate effectively with cross-functional teams.
- A platform mindset to identify opportunities to build out the data foundation for use across the business.
- A practice of keeping up with developments in Databricks and AWS, and of identifying and communicating opportunities.
- High proficiency in Python for scripting and automation.
- Strong proficiency in a CI/CD DevOps approach to implementing data solutions (experience with Terraform is desirable).
- Strong proficiency in SQL, data modelling, and database management systems.
- Excellent knowledge and understanding of AWS; familiarity with other cloud technologies (Azure, GCP) is desirable.
- Experience with cloud data warehouse/lakehouse platforms (e.g. Databricks, Snowflake, Amazon Redshift, Google BigQuery).
- A problem-solving mindset with the ability to troubleshoot and resolve issues efficiently.
- Demonstrated commitment to staying updated on industry trends, emerging technologies, and best practices in data engineering.
- Ability to work well under pressure and at speed, focused on delivering business value.
- Experience with Jira and an agile working environment.
If you don’t think you're a perfect fit, you should still sign up to Hatch and create a profile; we'll match you to other roles that suit you.
Hatch exists to level the playing field for people as they discover a career that’s right for them. We model this in our hiring process for our partners like Iress.
✅ Applying here is the first step in the hiring process for this role at Iress.
We do not discriminate on the basis of gender identity, sexual orientation, cultural identity, disability, age, or any other non-merit factors. To put it simply, Hatch is for everyone.