People Centred Connected Care — Alpha assessment

Delivering an easier and better way for people to manage their access to outpatient and community-based health services across the ACT.

Department / Agency: ACT Health
Date of Assessment: 6 May 2016
Assessment type: Digital Transformation Agency-led
Assessment stage: Alpha
Result of Assessment: Pass
Lead Assessor: Nathan Wall
Service Manager: Denise Lamb

Areas of good performance

  • The team has engaged a wide range of users in their research and used a variety of research techniques.

  • Everyone on the team has been continuously involved in research activities and the whole team joins in playback sessions to ensure insights are shared as widely as possible.

  • The team recognised that a ‘typical’ agile team wouldn’t give them all the capabilities they needed so they embedded specialist resources that streamlined connections back into the rest of the Health portfolio. These specialists ensured privacy requirements were integrated into the team’s work as early as possible.

  • Instead of just researching and refining a single concept design, the team delivered multiple versions of several different prototypes, each exploring alternative ways of enabling their target users to complete tasks. Doing this enabled the team to identify and address areas of complexity, and to narrow down their focus to define their MVP.

Examples of prototypes

Caption: Welcome screen of the ACT Health prototype

Caption: Health check screenshot example

On the path to Beta

During the Alpha and the weekly in-flight check-ins, assessors provided recommendations that the team needs to consider during the Beta stage.

  • There are some gaps in the demographic profile of users included in the research. The team has established a plan to address this and will target additional users with accessibility and other assisted digital needs in their Beta.

  • The design of plain English content is a significant challenge for the team. Further planning and research is needed in Beta to ensure there is time to iterate on different variations of content and validate that users understand it.

  • Some thinking around how to scale the service has already started. This included the team developing a solid understanding of the number and variety of health services that could be included in the service. During their Beta, the team must continue to make decisions with wider scalability of the service in mind. Open sourcing their code and building reusable, responsive design patterns will support this.

  • The team needs to do more accessibility testing, and do it earlier. It should apply accessible design patterns and regularly review likely issues as each feature is released. When prototyping in HTML it is possible to automate a range of tests. This helps reduce ‘basic’ issues like colour contrast failures, and embeds accessibility into the entire service early.

  • The team identified that their target audience had a very strong preference for using mobile devices, so the early focus on a small-screen interface was very appropriate. The team needs to regularly check that the design patterns they are using scale up effectively to desktop screens. The challenge is working out how to build an interface that adapts to and meets the needs of users on a range of devices. The team didn’t produce any prototypes for this during their Alpha.
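As an illustration of the kind of ‘basic’ check that can be automated when prototyping in HTML, the sketch below computes the WCAG 2.0 colour contrast ratio between two colours. This is a hypothetical example, not taken from the team’s codebase; it assumes six-digit hex colours and uses the standard WCAG relative-luminance formula.

```python
# Minimal sketch of an automated WCAG 2.0 colour-contrast check.
# Hypothetical example -- not from the ACT Health team's codebase.

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a six-digit hex colour, per WCAG 2.0 (sRGB)."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearise(c: float) -> float:
        # Undo the sRGB gamma curve before weighting the channels.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours; WCAG AA needs >= 4.5:1 for body text."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

if __name__ == "__main__":
    # Black on white is the maximum possible ratio, 21:1.
    print(round(contrast_ratio("#000000", "#FFFFFF"), 1))  # 21.0
```

A check like this can run in a test suite on every build, failing early when a prototype’s palette drops below the AA threshold.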

What we’ve learnt

Delivery team

  • Having a way to ensure the service and team members had achieved the desired quality and capability was welcomed. With everyone participating in the weekly check-ins, the team as a whole could discuss where the project was at, and understand where further work was needed.

  • The conversational style made it easy to learn what criteria in the Standard needed to be addressed. This will make the transition to ongoing self-assessments in the future much easier.

  • The Standard team were helpful, friendly and provided good advice on how to work through the criteria. The word ‘assessment’ felt like a test. The word ‘review’ was more positively received.

Assessment team

  • Meeting with the team weekly and discussing progress helped us develop a better understanding of the project’s work.

  • The weekly check-ins were a relaxed, conversational way to ‘nudge’ the team in the right direction. Running the sessions in the team space surrounded by their artefacts was extremely valuable and helped assessors understand how the team was prioritising features. You can’t do better than ‘seeing the thing’.

  • It was a delight to see the team modify their processes and ceremonies to find their rhythm; this also greatly helped them respond to the user needs they were discovering.

Digital Service Standard Scorecard

Assessment result for each criterion

Criterion   Result
1           Pass
2           Pass
3           Pass
4           On track
5           On track
6           On track
7           On track
8           On track
9           On track
10          On track
11          On track
12          On track
13          On track