Starting a business — Alpha assessment

This new service will reduce confusion for users about whether or not they are “in business” and help them comply with their obligations. It will give people more certainty, helping them discover whether they meet the government’s definition of “being in business”. Then - regardless of their status - the wizard guides the user through the transition of starting a business, or through complying with government requirements as a sole trader.

Assessment report

The ‘Starting a Business’ team are looking to progress from alpha stage of the service design and delivery process to beta.

Outcome of service assessment - Pass

The project has shown sufficient progress and evidence of meeting the first 3 criteria of the Digital Service Standard. The team are on track for the other criteria, and they should proceed to beta stage, taking account of the recommendations outlined in this report.

The service meets the required elements of the Digital Service Standard at the alpha stage. The assessment panel noted the entire team has been involved in user research, and the team has spent a significant amount of time, and used a number of appropriate approaches and techniques, to get a deep understanding of user needs. Team processes, such as weekly project reviews, have helped the project retain its agility and manage external stakeholders and dependencies.

The panel recommends that the team spend time in beta ensuring they better understand the broader context of their service in the user experience. For example, the team needs to ensure that users of the starting a business wizard can be successfully handed off to the next steps in the process, and that users can complete the entire end-to-end transaction. Up to the conclusion of alpha, the scope of the project has been constrained by a wider program of work in the Department of Industry, Innovation and Science (DIIS). It is vital the insights gained in this project continue to be shared with other projects in the DIIS program, to ensure integration and a seamless end-to-end user experience.

The panel recommends the team consider the aspirational goals of the product, and resolve the broader vision and purpose of the service. These considerations will play an important part in determining a backlog of work for the product once it has gone beyond beta, and will inform the composition of the team required to execute the vision.


User needs research (Criterion 1)

  • The team demonstrated that they have conducted research with a broad range of users throughout discovery and alpha, using several research techniques, including in-depth interviews and workshops.
  • The team showed how the user needs they identified during discovery had informed their prototypes during alpha.
  • They have developed a solid understanding of user needs and pain points. These have been documented in a detailed and highly visual way - the “empathy wall” and mapping of user demographics were particularly impressive.

Multidisciplinary team (Criterion 2)

  • The team comprised largely departmental staff, and specific steps were taken to ensure expertise from specialist resources was shared across the team.
  • The panel notes that churn in a project is always a challenge - in an ideal world, key resources being ‘swapped out’ would have had time to hand over, but shadowing within the team reduced the impact of this.
  • The project review process compensated for external dependencies impacting the project, and internal support from team members’ home agencies was strong.

Service design and delivery process (Criterion 3)

  • The scope of the solution has been constrained by the broader program of work - this has led the project to focus on creating a content solution rather than a fully transactional service. The user need, and the pain point caused by not currently having this information clearly presented, remain very clear.
  • The team appears to have taken a narrow focus early during discovery, and perhaps has missed some other pain points or problems in the starting a business space. The narrow focus has possibly limited the team’s ability to understand the true context of the user experience - the ‘wizard’ is great - and users say that it’s useful - but can they actually do the next step?
  • The use of a decision register to trace the flow of decisions was a useful artefact, especially since the team created a number of radically different prototypes quickly, rather than a gradual iteration over the entire phase of the project.

The assessment panel makes the following recommendations.

User needs research (Criterion 1)

  • The panel notes that the team attempted to include a wide and diverse demographic of users, but that a number of planned sessions did not take place. There are some clear gaps in the profile of users included in user research to date. In beta, the team should ensure they speak to users who have:
    • various types and degrees of disability
    • culturally diverse backgrounds, including Indigenous users
    • lower levels of tech savviness
    • lower levels of ‘business readiness’ or awareness
    • lower literacy levels
  • The panel also encourages the team to:
    • use a broader variety of research recruitment sources. AirTasker proved successful at finding a specific ‘type’ of user during alpha, but over-use could introduce an unintended bias
    • not become dependent on secondary research or on the activities of external projects
    • ensure user needs truly drive the detailed design of the service - in this case making the content as clear and simple as possible is critical

Multidisciplinary team (Criterion 2)

  • In beta, the team should develop a plan early on for managing the governance of the product once it’s ‘Live’ and staff return to their various home agencies and roles.
  • The team should consider having a legal/policy expert embedded in the project, to accelerate decisions around content.
  • The design of content for this project will remain the most challenging task. Adding extra content design resources to work with designers, researchers and subject matter experts will enable further refinements.

Service design and delivery process (Criterion 3)

  • Articulate more clearly the broader vision of how this service will integrate with the wider user experience.
  • Explore each concept within the broader user context. During beta the team should focus on the end-to-end experience for users who need to proceed to formally set up a business, to ensure they can successfully continue through the rest of the overall process. Look at how this tool will handoff to (or integrate with) the next part of the process, and how a user might get back to the wizard. What’s their overall experience?
  • Regular design changes could be more iterative than revolutionary.
  • The panel encourages the team to document user stories, rather than assumptions, to articulate and bring more focus to user needs during beta.

Open standards and open source (Criterion 7 and 8)

  • The panel noted the team is already thinking about how they can open source aspects of their code, even though the technology stack is not an open source product. The panel strongly encourages the team to pursue this.

Accessibility and assisted digital (Criterion 10)

  • Consider the format of the information being presented to users. The ability to export the results of the wizard in PDF format is not particularly mobile-friendly. A variety of other, more accessible techniques and approaches already exist.
  • Some accessibility testing, even limited testing, would help flag possible problem spots. The team could have made better use of inclusivity expertise and advice available during discovery and alpha. A plan for conducting accessibility testing in beta would have been particularly useful to complete in alpha.
  • Beta must include research with more diverse users, users with specific disabilities, users with lower levels of technical literacy, and users from a variety of cultural and non-English speaking backgrounds.

Areas of good performance against the Standard

User research

The diverse research techniques, the creativity in user recruitment, and the visual presentation of the demographics the team interacted with were excellent.

The decision register, which documents assumptions tested during alpha, was a great way of enabling traceability. It could be further improved by documenting validated assumptions as user needs.

It was fantastic to see everyone in the team involved in research, and the team’s “empathy wall” was an excellent artefact to keep insights front-of-mind throughout the alpha phase.

Collaboration across government

The ability of the team to work across the APS, and map out the wider landscape, ensured they could identify and manage the right stakeholders, and identify effective collaboration opportunities.

Use of technology to support a geographically distributed team

Having a person chaperone meetings to support remote team members - such a small thing, but such a big impact - was a significant factor in how well the team communicated internally within the project.

Upskilling in user research across the team ensured they leveraged the expertise available. It wasn’t possible for the user research ‘expert’ to conduct all research sessions, so the team established specific processes to grow and support that capability across the wider project. Effective use of tools like Google Hangouts ensured the team stayed connected to the research.

Assessment against the Digital Service Standard

Criterion    Result
1            Pass
2            Pass
3            Pass
4            On track
5            On track
6            On track
7            On track
8            On track
9            On track
10           On track
11           Not assessed
12           Not assessed
13           Not assessed
14           Not assessed