A prominent provider of comprehensive facilities services across ANZ.
Your new role
You will join a high-impact project implementing a new data lakehouse environment, and will be involved in the creation, management and maintenance of data pipelines, data warehousing, and data quality analysis and interpretation in accordance with business requirements.
In this role you will:
Design and oversee the implementation of end-to-end data architecture solutions.
Work with Azure Databricks for large-scale data processing.
Assist in developing and implementing robust governance to ensure accurate database automation and interpretation. This includes policy and procedure development, regular audits, data error management, internal data control processes, and data extraction and transformation for reporting.
Apply a strong understanding of data modelling and data warehousing.
Manage and optimise database performance and monitor data quality to ensure accuracy and consistency.
Use programming languages such as Python and SQL proficiently.
Apply data governance, data quality, and data security best practices.
Maintain the Power BI tenant and gateway.
Drive a high-performing, engaged team culture with peers and project team members.
What you'll need to succeed
A minimum of 5 years' experience in Data Engineering.
Advanced SQL skills for query optimisation and performance tuning in Databricks Notebooks.
Hands-on experience with Azure Databricks for large-scale data processing.
Proficiency in Azure Data Factory, Databricks, SQL Server Integration Services (SSIS), and SQL Server Analysis Services (SSAS).
Expertise in Azure Data Factory for orchestrating and automating data workflows.
Experience with Azure DevOps, including setting up CI/CD pipelines.
Strong proficiency in Python for data transformation, automation, and pipeline development.
Knowledge of Medallion architecture for structuring data lakes with bronze, silver, and gold layers.
Familiarity with data modelling best practices for analytics and business intelligence.
Demonstrated experience in application and infrastructure automation/orchestration.
Solid understanding of cloud security principles and practices.
Familiarity with CI/CD processes and best practices, contributing to a robust DevOps culture.
Understanding of cloud infrastructure networking concepts, with hands-on experience in implementation and support.
Exposure to Databricks, PySpark and Azure services is highly desirable.
Certifications in Azure Engineering or Databricks are desirable.
What you'll get in return
The chance to work in an esteemed organisation that develops and grows careers.
A long-term opportunity with flexible work arrangements. Do not miss out!
What you need to do now
If you're interested in this role, or know someone who may be, please click ‘Apply Now’ or forward an updated CV to Gopalakrishnan.subramanian@hays.com.au