Starting a business — Beta assessment

The Hobby or Business tool is for people who are earning money, or intend to earn money, from creative works and are unsure whether that activity is a hobby or a business. The tool guides them through their government obligations.

Department / Agency: Department of Industry, Innovation and Science
Date of Assessment: 26 May 2016
Assessment type: Digital Transformation Agency-led
Assessment stage: Beta
Result of Assessment: Pass
Lead Assessor: Nathan Wall
Service Manager: Clive Rossiter

Business or Hobby Tool

Based on the information provided by the team throughout the in-flight assessment, the Starting a business service has met the criteria for the Beta assessment.

Areas of good performance

The team had some early difficulties but managed to narrow the focus of their minimum viable product: a service that helps people who earn money from things they’ve created (creatives, artists and makers) understand whether they need to report income generated from their activity, and whether they should consider formalising their hobby into a business.

Insights from user research clearly demonstrated the common pain points. The team worked very well with stakeholders, particularly the Australian Taxation Office, to reach a shared understanding and a common approach for providing guidance and clarity, working towards ‘certainty’ as the ideal outcome from a user perspective.

The team diversified their research recruitment sources, using ‘maker-spaces’, hubs and other networks to reach a broad range of ‘makers’. The team were able to demonstrate how they have iterated the questions in the ‘Hobby or Business?’ tool based on user feedback. The team had to use further initiative to find users in the maker cohort who also had accessibility needs.

A specialist tax resource joined the team and made a considerable positive impact, removing a number of external dependencies. This increased the pace at which the team could make decisions. Removing external dependencies from work streams always helps maintain momentum.

Later rounds of user research confirmed that early iterations of the Beta product didn’t quite meet user needs. Users said the language didn’t sound ‘governmenty’ and that the plain language was easier to understand; however, many users indicated the information didn’t give them enough certainty about their circumstances.

The team also integrated their Beta product into existing non-digital channels really well. They included the business.gov.au contact centre in their overall research, and conducted end-to-end testing that directed users through the non-digital channel. As a result, contact centre scripts were adapted to ensure users could get the same ‘answer’ from government, regardless of the channel they used.

On the path to Live

The Alpha assessment identified some gaps in the user demographics included in the research. To a large extent, these gaps were closed during the development of the Beta, but not completely.

Accessibility testing was left quite late in the schedule. Having to find niche users with a wide range of assisted digital needs added to the challenge.

Now that the project has returned to its home agency, the team need to continue to work closely with stakeholders. The content duplication and fragmentation that currently exists across other information sources needs to be removed, so that users can be seamlessly ‘handed off’ to the relevant services further along their journey.

The ongoing challenge for the team is how to improve this endpoint while still retaining the agreement of their stakeholders. The team has been working well with agencies such as the ATO.

Having a tax specialist embedded on the project has already seen significant positive improvement to the cross-agency communication and collaboration.

The team must also continue to work within complex legislative and policy constraints that will take some time to overcome.

What we’ve learnt

Delivery team

  • Involving the entire team in user research is essential for building empathy with users.

  • Involving stakeholders in the user research process is essential for stakeholder support and trust of the project team.

  • Investing time to build trust with stakeholders provides the team with an environment where it’s safe to ‘fail’.

  • Recruitment of users to engage in research can be time consuming and should be started as early as possible in the project (ideally, before the project even formally begins).

  • Access needs of users are varied (such as dyslexia, autism, mobility issues, language issues) and this needs to be kept in mind throughout the process to ensure our solution addresses the range of needs.

  • The cross-agency team was a challenge at times, with differing stakeholders and processes, but it was essential to addressing user needs.

  • A strong support structure for the project team is essential to delivering within the 20-week timeframe.

  • Adjusting the balance of skills within the agile team is sometimes necessary and extra effort should be taken to ensure new members are inducted swiftly into the team.

  • Involve your non-digital channels in usability testing.

  • Re-using existing capabilities and ICT platforms allows for a smoother transition back into the Department.

Assessment team

  • When running in-flight check-ins it’s essential to have as many of the team present as possible.

  • If possible, attend some of the team’s agile ceremonies during the sprint; you gain valuable insights into the wider issues the team is managing.

  • The conversation is important, but artefacts are the only evidence you have to know for sure that the project is working well, and the service standard is likely to be met. Always ask to ‘see the thing’.

  • The conversation and planning around how to open source the code took a long time. Given the mind-shift this criterion asks agencies to make, it would have been advantageous to start the conversation earlier. The user pathway / decision-tree flow the project has built could easily be adapted by other services, which will be simpler now that the code is open for other agencies to use.

  • Recruitment for accessibility testing, and the testing itself, probably occurred too late in the timeline. Recruiting people from the target audience who also use assistive technology was extremely challenging and time-consuming. The team did get creative, tapping into a wide range of networks to find research participants. Assessors would like to have seen more accessibility testing done earlier in the project: insights gained early allow changes to the design and interface to be made with confidence, ahead of any last-minute polishing.

Keeping up with the “Hobby or Business?” tool

Visit the Hobby or Business tool.

Code

This service’s code is hosted in an open source GitHub repository: https://github.com/AUSGov/hobby-business-guidance-tool
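The tool’s user pathway is described elsewhere in this report as a decision-tree flow that other services could adapt. As a minimal sketch of that idea, a yes/no questionnaire can be modelled as a small node structure; the questions and outcomes below are hypothetical, not the tool’s actual content or code.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    """One step in a yes/no decision tree.

    Interior nodes carry a question and two branches;
    leaf nodes carry an outcome instead.
    """
    question: Optional[str] = None
    yes: Optional["Node"] = None
    no: Optional["Node"] = None
    outcome: Optional[str] = None

def traverse(node: Node, answers: List[bool]) -> str:
    """Walk the tree with a sequence of yes/no answers until a leaf is reached."""
    for answer in answers:
        if node.outcome is not None:
            break
        node = node.yes if answer else node.no
    if node.outcome is None:
        raise ValueError("ran out of answers before reaching a leaf")
    return node.outcome

# Hypothetical questions -- illustrative only, not the tool's real content.
tree = Node(
    question="Do you intend to make a profit from the activity?",
    yes=Node(
        question="Do you carry out the activity in a repeated, business-like way?",
        yes=Node(outcome="Your activity may be a business."),
        no=Node(outcome="Consider seeking further guidance."),
    ),
    no=Node(outcome="Your activity may be a hobby."),
)

print(traverse(tree, [True, True]))  # -> Your activity may be a business.
```

A structure like this keeps the question content (which agencies iterate on through user research) separate from the traversal logic, which is one reason such a flow is straightforward for other services to adapt.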

Metrics

Performance dashboard: The Starting a business dashboard.

The team provided a number of metrics to be considered within the HEART framework (Happiness, Engagement, Adoption, Retention, Task success) throughout the Beta in-flight assessments.
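As a sketch of how HEART-style metrics might be organised: the category names come from the HEART framework itself, but the example signals below are hypothetical and are not the team’s actual metrics.

```python
# Hypothetical mapping of service signals to the five HEART categories.
# Illustrative only -- these are not the project's real metrics.
heart_metrics = {
    "Happiness": ["post-task satisfaction rating"],
    "Engagement": ["questions answered per visit"],
    "Adoption": ["first-time visitors who complete the tool"],
    "Retention": ["users returning after formalising as a business"],
    "Task success": ["completion rate", "time to reach an answer"],
}

HEART_CATEGORIES = ["Happiness", "Engagement", "Adoption", "Retention", "Task success"]

def coverage(metrics: dict) -> float:
    """Share of HEART categories with at least one signal defined."""
    covered = sum(1 for category in HEART_CATEGORIES if metrics.get(category))
    return covered / len(HEART_CATEGORIES)

print(coverage(heart_metrics))  # -> 1.0
```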

The team have informed the assessors that they have collected baseline data and performance metrics to support public reporting of performance KPIs and other metrics appropriate to the service.

Benefits

The expected benefits of this service are to provide greater clarity to users about their government obligations and whether their activity is defined as a hobby or a business in the eyes of government.

This will help users to avoid penalties for unintentionally doing the wrong thing and can save them time by making the process of getting answers from government about their circumstances more efficient.

The first release of the service provides certainty in some circumstances to a limited user group; however, the project has also exposed an area of concern to users, prompting a number of agencies to begin working together to address it.

Subsequent releases of the service will provide more certainty to a greater range of users and will also integrate with existing and upcoming government services.

Assessment against the Digital Service Standard

Assessment result for each criterion

Criterion   Result
1           Pass
2           Pass
3           Pass
4           Pass
5           Pass
6           Pass
7           Pass
8           Pass
9           Pass
10          Pass
11          Pass
12          Pass
13          Pass