business.gov.au — Beta assessment

Assessment summary

User research for the Single Business Service business case raised concerns about the existing business.gov.au service. Customers:

  • “Don’t know what they don’t know”
  • have difficulty navigating the maze of programs and information
  • struggle to determine when/how/why information is relevant to them
  • expect service providers to proactively target information to them.

The new business.gov.au service aims to address these concerns by:

  • simplifying navigation and providing improved tools and cues to aid discovery
  • leveraging customers’ characteristics and behaviours to better personalise and target information
  • utilising analytics, insights and customer feedback to inform continual, rapid improvement.

How the Digital Service Standard applies to this assessment

business.gov.au is an existing information service, and the redesign work assessed in this report began before the Digital Service Standard came into effect. The service was therefore not in scope of the Digital Service Standard when it was assessed and did not need to meet the Standard to continue development.

The Department of Industry, Innovation and Science voluntarily assessed the team’s redesign work against the Digital Service Standard in a pragmatic way. The assessment panel focused on how well the service redesign succeeded at:

  • meeting user needs
  • being simple, safe and secure
  • being able to be rapidly iterated.

Future substantial redesign work on business.gov.au is likely to be in the scope of the Digital Service Standard and will require full assessment.

Assessment report

Outcome of service assessment: Pass

The business.gov.au team has shown sufficient evidence of meeting critical criteria in the Digital Service Standard. The team should proceed with the beta development, continuing to address Standard criteria and taking account of the comments and recommendations outlined in this report.

Reasons

  • The team demonstrated a strong focus on user needs – showing the evolution of “finding grants and assistance” from a pain point to positive user feedback on the beta site.
  • The team demonstrated the ability to rapidly iterate the service, with a Scrum-based agile approach resulting in a public beta release every 4 weeks.
  • The team demonstrated how the chosen technologies are directly contributing to usability testing and rapid iteration.

Areas of good performance against the Standard

User needs, user-centred (Criteria 1 and 3)

Personas and journey maps constructed during initial Single Business Service user-centred design activity were validated by the team in early user workshops. These artefacts drove initial user story development and prioritisation.

The team provided a good example of their user-centred design (UCD) approach. Sketch concepts for finding business grants and assistance were “guerrilla tested” at various locations (for example, coffee shops). Interactive wireframes were then published online to broaden user testing. A refined concept was released to the public on beta.business.gov.au and promoted via multiple channels to encourage further user feedback. User feedback captured via beta.business.gov.au, coupled with the findings from the user research and testing activities, was then incorporated into the beta service backlog as input to sprint planning.

Whilst all members of the team play a role in UCD and user testing, the team includes members with specific UCD and user experience skills.

Over 400 users have contributed to user-centred design activities, either through direct engagement or via indirect methods. Appropriate representation of the identified personas was evident.

Independent user testing prior to go-live will use the functioning beta website for in-person usability evaluations, confirming the work done by the team.

Agile (Criterion 3)

The team are following a Scrum-based agile methodology, with a sprint cadence of 2 weeks and a public beta release every 4 weeks. Continuous integration and deployment ensure that user testing can be conducted as soon as a user story has been developed.

Data, tools and systems (Criterion 4)

A cloud-based repository provides clear visibility of the service backlog (user stories), team velocity, upcoming sprint goals, test plans, test execution results and source code.

Products including achecker, BrowserStack and Clarity Grader are used to regularly validate service responsiveness, accessibility and usability (simplicity).

The team ably demonstrated a number of features of the delivery technology which directly address the service aims, notably:

  • content targeting to specific user personas and profiles
  • analytics to identify struggle patterns and preferred content variants (both are illustrated in the sketch after this list).
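
As an illustration only, the following TypeScript sketch shows how persona-based content targeting and struggle-pattern logging of this kind might fit together. All names (Persona, ContentVariant, selectVariant, logStruggle) and the persona labels are assumptions made for the sketch, not the team’s actual implementation.

    // Hypothetical sketch only: names, persona labels and logic are
    // assumptions, not the business.gov.au implementation.
    type Persona = "starting" | "running" | "growing" | "exiting";

    interface ContentVariant {
      id: string;
      personas: Persona[]; // empty list = generic fallback content
      body: string;
    }

    // Choose the variant targeted at the visitor's persona, falling back
    // to generic content when no targeted variant exists.
    function selectVariant(variants: ContentVariant[], persona: Persona): ContentVariant {
      return (
        variants.find((v) => v.personas.includes(persona)) ??
        variants.find((v) => v.personas.length === 0) ??
        variants[0]
      );
    }

    // Record a "struggle" signal (for example, repeated searches or rapid
    // back-navigation) so analytics can surface pain points for the backlog.
    function logStruggle(page: string, signal: string): void {
      console.log(JSON.stringify({ event: "struggle", page, signal, at: Date.now() }));
    }

    // Usage: a visitor profiled as "starting" sees the targeted variant.
    const variant = selectVariant(
      [
        { id: "grants-new", personas: ["starting"], body: "Grants for new businesses" },
        { id: "grants-all", personas: [], body: "All grants and assistance" },
      ],
      "starting",
    );
    logStruggle("/grants", "repeated-search");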

Recommendations

  1. Analyse the beta.business.gov.au user base to determine broad usage by personas. This may aid expansion of user research with specific cohorts of users (Criterion 1).
  2. Involve subject matter experts to provide further quality assurance; for example, Departmental staff with relevant website expertise, such as accessibility (Criterion 5).
  3. Confirm that whole-of-government obligations (for example, AGLS minimum metadata, WCAG 2.0 requirements) are marked for inclusion in the MVP; an illustrative metadata sketch follows this list (Criteria 4 and 5).
  4. Establish baseline data for user-pathway completion times. This will allow for better analysis and tracking of transaction costs and continual improvement (Criterion 6).
  5. Determine the critical metrics for the service and incorporate relevant automated analytic reports into the beta site to underpin public performance reporting (Criterion 6).
  6. Seek opportunities to share the code repository, standards and data. DTO’s technical community is an appropriate channel for sharing. In particular, packaged solutions around assistance and events searching may be useful to other departments (Criteria 7 and 8).
  7. Work closely with other channels (for example, contact centre) to ensure information and user support is consistent (Criterion 12).
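
In relation to recommendation 3, the sketch below shows one way AGLS metadata could be emitted. The DCTERMS property names follow the AGLS Metadata Standard’s conventions, but the values and the rendering helper are illustrative assumptions; the required property set should be confirmed against current AGLS guidance.

    // Illustrative only: values are placeholders and the helper is not part
    // of any real library. Confirm the mandatory properties against the
    // AGLS Metadata Standard before relying on this set.
    const aglsMeta: Record<string, string> = {
      "DCTERMS.title": "Grants and assistance finder",
      "DCTERMS.creator": "Department of Industry, Innovation and Science",
      "DCTERMS.date": "2016-01-01", // placeholder date
      "DCTERMS.description": "Find government grants and assistance programs.",
    };

    // Render the pairs as <meta> elements for the page <head>.
    const metaTags = Object.entries(aglsMeta)
      .map(([name, content]) => `<meta name="${name}" content="${content}" />`)
      .join("\n");

    console.log(metaTags);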

Assessment against the Digital Service Standard

Criterion   Result
1           Pass
2           Pass
3           On track
4           Pass
5           On track
6           On track
7           On track
8           On track
9           Pass
10          On track
11          Pass
12          Not assessed
13          On track
14          Not assessed