Special Access Scheme — Alpha assessment

The Special Access Scheme (SAS) is a regulatory framework that enables health practitioners to access unapproved therapeutic goods.

Most therapeutic goods are required to undergo an evaluation for quality, safety and efficacy and be included on the Australian Register of Therapeutic Goods (ARTG) before they can be supplied in Australia. In recognition that there are circumstances where patients need access to therapeutic goods that are not included on the ARTG, the TGA manages the SAS and other access pathways.

The Special Access Scheme provides for the import and/or supply of an unapproved therapeutic good for a single patient, on a case-by-case basis. There are three SAS pathways in Australia, and it is the treating health practitioner’s decision which pathway to use, based on a number of factors.

The TGA is introducing an online system to assist Australian registered health practitioners to use the SAS. The online system will streamline business practices for health practitioners, support timely patient care and reduce burdensome paper-based processes.

Prior to redeveloping the service in an electronic environment, health practitioners reported they were confused about the process, in particular which pathway should be used to obtain unapproved goods. This often caused delays in application processing times and high volumes of rework. The new service addresses these pain points by digitising and simplifying the application process for health practitioners.

Following changes in legislation and associated regulations to improve the accessibility and timeliness of access to unapproved therapeutic goods for health practitioners, the TGA began applying the Digital Service Standard to better understand and meet the needs of their users through an improved digital solution.

Assessment detail

Criterion 1: Understand user needs

The service team demonstrated a strong willingness to engage and learn about their users’ needs. User research interviews were held across eight hospitals with doctors, pharmacists and hospital procurement teams along with TGA staff. As prototypes were developed, the team undertook usability testing, with the results informing two rounds of iterations to the product design.

Though not all members of the team were able to attend interviews in person, the results were synthesised in a team setting, providing opportunities for all team members to build empathy and understanding of their users. Insights were categorised into major pain points and have informed decision making on product improvements.

Though the team have performed well against this criterion, they are encouraged to undertake further research to consider the experiences of a wider range of user groups, including users with diverse needs. This could be achieved through further interviews with users in rural or remote locations, those from culturally and linguistically diverse backgrounds, and those with accessibility needs.

Ongoing usability testing and research will help the team continue to perform well in this area as they move into beta. The team is committed to ongoing user research and engagement throughout further releases. A number of interstate trips have been organised to demonstrate the system and seek advice from users on how the TGA may best support uptake.

Criterion 2: Have a multidisciplinary team

The team is collaborative and positive, with a good mix of specialists and digital professionals. With a number of the team having backgrounds similar to their users (e.g. medical professionals and pharmacists), they will need to continue to be mindful not to introduce bias to the research or design of solutions. The creation of a team charter and supportive management appear to have had a positive impact on forming a successful multidisciplinary team.

The product manager is empowered, and decisions are made appropriately at the team level. Senior executive support is available yet unobtrusive; this governance approach is commendable.

There was a good amount of sharing and cross-skilling apparent, with the product and delivery managers inviting experts from the wider department to support team work, for example in areas of user research, accessibility and enterprise architecture. The team is encouraged to explore the skills and roles needed for beta, and scale the team to meet the needs of the service.

Criterion 3: Agile and user-centred process

The team is effectively using agile (Scrum) processes and is aligning work appropriately under the Service Design and Delivery Process.

The team is encouraged to acknowledge failure as an opportunity for learning; only two iterations of the prototypes were shared, so it was unclear whether other iterations were developed and discarded. However, the team did share extensive documentation on issues identified during user research and solutions planned or executed during alpha, so it is clear they are taking a user-centred design approach.

Criterion 4: Understand tools and systems

The team has undertaken research and planning into technology decisions and is engaging with the enterprise architecture team.

Criterion 9: Make it accessible

Content design of the prototype has been considered, with some simplification of the content already completed (e.g. removal of departmental category identifiers). Additional usability testing with diverse audiences may yield further improvements in this space.

Criterion 11: Measure performance

The team has worked collaboratively across their department, for instance with the call centre support team and invested business areas, and is using data to measure performance improvements. This includes metrics such as the number and types of enquiries and application processing times, which will support calculations of cost per transaction for both internal and external stakeholders.

The assessor panel congratulates the service team on their good performance to date and looks forward to seeing this work progress into beta. The panel offers the following recommendations:

  • Ongoing user research, especially with diverse users, will help inform improvements and the future direction of the service.
  • The team is encouraged to share their experiences and work practices across their department to build internal capability on implementing the Digital Service Standard.

Assessment against the Digital Service Standard

Criterion Result
1 Pass
2 Pass
3 Pass
4 On track
5 On track
6 On track
7 On track
8 On track
9 On track
10 On track
11 On track
12 On track
13 On track