Health Demand and Supply Utilisation Patterns Planning (HeaDS UPP) tool

The HeaDS UPP tool is an analysis tool used to examine policy scenarios for redistributing the health workforce, based on the distances patients could reasonably be expected to travel for health services. The tool will provide a single, integrated, quality source of health workforce data, improving the evidence used to inform policy development and program decisions. This report is for the Beta assessment of the HeaDS UPP tool.


The Health Demand and Supply Utilisation Patterns Planning (HeaDS UPP) tool is an Australian Government initiative to provide an interactive tool that draws on a range of data sources — Medicare Benefits Schedule (MBS) data, GP training data and Royal Flying Doctor Service program data — and maps them by geographical region (using the Australian Statistical Geography Standard), including General Practitioner (GP) catchment areas.

There is currently no centralised location for this data. Each organisation must request data from each source individually in order to compile the health workforce statistics requested by internal Health teams and by external departments and organisations.

The HeaDS UPP tool will use newly defined geographic catchments that reflect where people live and where they access health services, combined with data on where health practitioners are located and providing services.

The tool will enable collaboration across the health network and increase confidence in government spending.

Criterion 1: Understand user needs

The main user base consists of staff from the Department of Health and selected external organisations with a health workforce planning role, such as Rural Workforce Agencies, Primary Health Networks, Local Health Districts/Networks, the Royal Flying Doctor Service, medical colleges, state and territory departments and other Commonwealth departments.

The project team leveraged various opportunities over 18 months to gain insight into user needs for the functionality and visual aspects of the tool, including 3 workshops with external users and 10 workshops, at various stages, with users from the internal team who manage requests for information on a daily basis. The project team also gathered feedback through user forums, discussions, direct engagement with potential users, and conferences where the prototype was demonstrated. A user reference group was established for users external to Health.

Feedback from users informed enhancements to the prototype, revealed issues not considered in the initial development, and enabled improvements to the stability and security of the tool.

The results of user feedback formed the basis of the HeaDS UPP functional and non-functional requirements documents for the initial prototype and the ongoing development of the tool.

Epics, features and user stories have also been developed and clearly documented for the HeaDS UPP tool.

The latest feedback received from users has been positive in relation to how the workforce planning prototype would assist their policy decision making.

Criterion 2: Have a multidisciplinary team

The team is collaborative and positive, with a good mix of experience and skills in the areas of project management, business analysis, application development, technical architecture, user interface and user experience, and testing. The product owner is fully engaged and empowered, with decisions being made at the correct level, making the decision-making process quick and efficient. Senior executive support is available.

The team has well-defined processes for sharing knowledge and user research findings with existing and new team members. New starters receive a walkthrough from project subject matter experts (SMEs) and take part in peer reviews, sprint reviews and daily stand-ups. System design is documented, and the team uses a range of collaboration tools such as Enterprise Architect, Octane and Word documents managed in TRIM.

There was a good amount of sharing and cross-skilling apparent, with the product and delivery managers inviting experts from the wider department to support the team’s project work, such as the Security Team and Testing Team.

The team is highly motivated, multi-disciplinary and performing very well in adhering to the Digital Service Standard. The team proactively identifies potential skills gaps as development progresses using a skills and experience matrix. It is proficient in acquiring new resources and has a well-established process for on-boarding new team members.

Criterion 3: Agile and user-centric process

The project team developed their process based on wide consultation with different user groups and stakeholders.

The team works in an agile manner, using three-week sprints incorporating backlog grooming, user story refinement, sprint planning, daily stand-ups, sprint reviews and sprint retrospectives, which enables continuous improvement of their processes.

The team follows the agreed agile methodology and uses agile tools, including Kanban boards and Octane as an agile development, delivery and management tool.

The Minimum Viable Product (MVP) is clearly documented and being developed. A workshop and progressive review sessions with the UX designer resulted in a number of user screens being re-designed to improve usability and accessibility of the end product. This has now progressed to the current beta 'release candidate' stage.

Hypothesis statements and validation approaches have been captured as part of the confirmation process to ensure a user-centric design can be delivered.

Next steps will include:

  • further refinement of the HeaDS UPP tool in line with user feedback
  • additional features to enhance the MVP before reaching Live status
  • additional testing before any new features are released.

Criterion 4: Understand tools and systems

Functional and non-functional requirements informed the solution architecture, which was endorsed by the Health Architecture Governance Committee. The team is re-using and leveraging existing infrastructure and middleware, such as WebSphere, Oracle, GeoServer, ESRI and Splunk. Users must authenticate via the Health Data Portal's AusKey capability before they can access the tool.

The HeaDS UPP team also engaged with the department’s Security and IT Services Branch to ensure products, services and licensing comply with enterprise solutions and with DTA, Whole-of-Government and ISM standards.

Criterion 5: Make it secure

The team has developed a project security strategy providing a framework for the delivery of a secure solution. External users of the HeaDS UPP tool must be authenticated using AusKey prior to gaining access to the tool. The Data Request Assessment Panel (DRAP) approved the use of datasets for beta Release 1a in January 2019. Information security measures for the HeaDS UPP tool will conform to Australian Government standards as set out in the Australian Government Information Security Manual (ISM).

The security accreditation process, in accordance with Health’s IT Security Accreditation framework, commenced in November 2018. The team produced an IT Risk Register and a Security Risk Management Plan to support an application for an Interim Authority to Operate (IATO), which is under consideration by Health’s IT Security Advisor (ITSA). It is expected that security accreditation of the HeaDS UPP tool will be achieved by June 2019 and that the solution will be subject to a penetration test prior to its release.

A Privacy Impact Assessment (PIA) was completed in late 2018.

Criterion 6: Consistent and responsive design

The team’s UX/UI specialist reviewed functionality and adapted designs to improve usability. Interface components are styled using the Australian Government Design System to deliver a consistent customer experience. The map-based tool is optimised for 1920x1080, 1366x768, 1440x900 and 1280x800 screen resolutions. It is not fully mobile responsive: the map-based tool is not intended for small screens, and user research showed that the user base is controlled due to privacy requirements and that desktops will be used to interact with the system. However, responsive design is used for data presented as text and for navigational elements.

The interface is simple and has a clear flow, providing the user with clarity on their journey. Branding is consistent with the new beta design, creating a recognisable and consistent environment for users.

Criterion 7: Use open standards and common platforms

The service is re-using and leveraging a technology stack used successfully in other areas of Health and other government departments. For example, AusKey is used for user authentication via the Health Data Portal, so that only authorised users can access the data. Where possible, existing government data sets have been used within the solution (e.g. the Australian Statistical Geography Standard (2016 edition), Statistical Local Areas and Postal Areas).

During development, the project team actively adhered to open standards of development and common platforms where this delivers a secure and acceptable solution (e.g. Java Enterprise Edition (JEE) standards, JavaScript and Cascading Style Sheets (CSS) standards outlined by W3C).

Criterion 8: Make source code open

The project is reusing and sharing code and documentation from other projects within the Department of Health.

The code has not been made public, but is stored in a Bitbucket repository and may be shared with other departments as required or requested, noting that the data sets contain sensitive information which can only be shared with approved, trusted users.

Health IT Security has concerns that releasing the code might reveal weaknesses allowing malicious actors to attack the system; this has been evidenced in recent security assessments of other systems.

Criterion 9: Make it accessible

The service follows the standard Health governance process for ensuring adherence to WCAG 2.0 level AA in both development and design. The layout is designed to improve the keyboard and visual experience. The interface incorporates the usability and accessibility standards set by the Digital Service Standard by utilising the Australian Government Design System. Content is written in plain language. The host website has been designed to work effectively with assistive technologies, including screen readers.

A variety of users have tested the end-to-end service, and accessibility testing has been performed by internal accessibility specialists; some issues have been addressed, and recommendations continue to be incorporated into the system. Further accessibility testing is planned prior to go-live. Test scenarios can be provided to demonstrate end-to-end user journeys.

There was initial concern that the use of a geospatial data map would cause accessibility issues. However, several improvements were made to the initial prototype: colour contrast was improved, a search capability was added, and textual data was organised in accessible tables. The export function also allows the data to be presented as text, improving accessibility and enabling use with assistive technologies.
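The colour-contrast improvements mentioned above are the kind of change that WCAG 2.0 level AA measures with a defined contrast-ratio formula: the relative luminance of the lighter colour plus 0.05, divided by that of the darker colour plus 0.05. As an illustrative sketch only (not part of the HeaDS UPP codebase), a minimal check against the AA thresholds could look like this:

```python
def _linearise(channel):
    """Convert an 8-bit sRGB channel to its linear-light value (WCAG 2.0)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour, per the WCAG 2.0 definition."""
    r, g, b = (_linearise(ch) for ch in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, ranging from 1:1 to 21:1."""
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text=False):
    """WCAG 2.0 level AA: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

# Black on white gives the maximum 21:1 ratio; mid-grey (#777) on white
# falls just short of the 4.5:1 threshold for normal text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
print(passes_aa((119, 119, 119), (255, 255, 255)))
```

A check like this can be run over a palette during design review to flag text/background pairs that fail before they reach user testing.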

Criterion 10: Test the service

The team has a comprehensive, multi-layered test strategy in place. It covers usability, accessibility, unit (developer), system and user acceptance testing. The tool will be tested from the perspective of each identified user group. Because of the sensitivity of the data, a strict process is followed when granting access to new user groups, including access controls that differ between internal and external users.

There is an end-to-end testing environment in which code is promoted from Development to Test to Acceptance to Production, with testing conducted in each environment under formal change management procedures.

A brief private beta was conducted with internal Department users, prior to moving to a limited public beta with key external users. The quick progression to a limited public beta was due to external users being the main target audience for the tool.

Security testing will be undertaken via penetration testing, with load testing also planned before go-live.

Criterion 11: Measure performance

The team has a number of performance measurement targets in progress.

As user access is limited due to the sensitivity of the data, user registrations are required, and as such digital take-up metrics can be easily tracked.

User satisfaction with the HeaDS UPP tool will be measured through user surveys and other feedback, e.g. a five-minute pop-up survey prior to user sign-in. Audit logs, digital take-up and completion rate will be measured using Google Analytics. If appropriate, Google Analytics will provide the raw transaction data used to calculate the cost per transaction.

There is not yet a Performance Dashboard for this service. With supplementary advice from the DTA, Health will explore establishing one.

Criterion 12: Don’t forget the non-digital experience

The non-digital experience is well catered for: the export functionality, together with collaboration tools to be implemented in the service in future, will allow users who cannot access the tool directly to obtain the required data far more easily than they currently can. Current processes for requesting data will be leveraged.

The project team has prepared factsheets with key data, which are available online as PDFs.

Criterion 13: Encourage everyone to use the digital service

The tool will be hosted on the Health Data Portal and will have its own website linking into the tool. No digital equivalent has existed before, so there is no baseline to reference. Digital take-up is almost guaranteed, since only a digital platform can deliver the HeaDS UPP tool.

Significant resources are being allocated to the production of appropriate support materials so that the tool can be showcased and promoted at stakeholder forums and various conferences around the country. Examples include the Primary Health Network forum held in Canberra last November, and the upcoming National Rural Health Conference, Hobart, in March.

The prototype was popular with stakeholders. Take-up targets are being finalised, but are estimated at 80% of all authorised targeted organisations during the first 12 months after release.

The assessor panel congratulates the service team on very good progress on, and adherence to, the Digital Service Standard. The team is encouraged to:

  • continue ongoing user research, especially with diverse users, which will help inform improvements and the future direction of the service
  • perform additional user research for the remaining user cohorts as planned
  • continue to develop a skills and experience matrix for the team members
  • share their experiences and work practices across the department to build internal capability on implementing the Digital Service Standard.

Assessment against the Digital Service Standard

Criterion Result
1 Pass
2 Pass
3 Pass
4 Pass
5 Pass
6 Pass
7 Pass
8 Pass
9 Pass
10 Pass
11 Pass
12 Pass
13 Pass