DHS Website Reform Project — Beta assessment
The Website Reform Project is transforming the department’s public website, humanservices.gov.au.
The project will improve the readability of content, update the overall design and interactivity, and develop writing and style standards for all future content.
The website will adopt a user interface based on sharing code, designs and approaches to reduce duplication of effort by Commonwealth departments. It includes a new visual design and information architecture that better assists users to find information and complete tasks online.
This report considers only the adoption of the new interface design on humanservices.gov.au, as this is the only component of the project ready to progress to Public Beta at this stage.
Areas of good performance
The Website Reform project has met criteria 1 to 13 of the Digital Service Standard. This report recommends the service proceed from Private Beta to Public Beta.
The team has conducted extensive user research throughout the Discovery, Alpha and Private Beta stages and can demonstrate a clear line of sight between user insights and design decisions.
Criterion 1: Understand user needs
The team have extensively tested and iterated the Private Beta product based on user feedback, completing a wide range of user testing and fast, prioritised iterative design cycles. Five primary user engagement activities were undertaken over the Private Beta period:
- A restricted Private Beta testing period with a selected internal business area. 150 employees tested how the beta site performed when completing business-as-usual tasks and provided general feedback on the look and feel. Twenty-two feedback emails were received, and a number of improvement opportunities and bugs were identified and fixed.
- Lab-based usability testing. The team undertook lab-based usability testing with 30 users, focused on task completion for the 17 major tasks that comprise 85% of website traffic. The research sample included users with particular needs and backgrounds, including users with low literacy, low digital literacy and vision impairment, users aged over 65, speakers of English as a second language, and new migrants. The sample also included users from a range of metropolitan and regional locations across Australia.
- Expanded Private Beta. The team extended testing to an additional 400 staff members representing a horizontal slice of the organisation, ensuring all key stakeholder groups were represented. A significant amount of feedback was received through this testing, and the team generally prioritised and actioned improvements within a day or two of receipt.
- Accessibility testing. Accessibility testing was undertaken with two legally blind users using assistive technology, along with a heuristic review by a usability expert using a screen reader. Automated accessibility testing was also undertaken, and the site has been reviewed against, and is compliant with, WCAG 2.0 AA.
- Remote area user testing and research. The team visited remote locations in South Australia and New South Wales with the department’s Remote Service Mobile Unit and gained a wide range of new insights into how customers in these locations view and interact with the department and its website.
In all, an extensive range of user research was undertaken during Private Beta, leading to significant refinements to the user interface within a relatively short time.
Criterion 2: Have a multidisciplinary team
The team have been preparing for the project’s transition to business as usual from 1 July 2017, when several key design resources roll off the project. In the lead-up to the transition, Digital Transformation Agency (DTA) team members have been preparing support materials to assist the remaining team after they leave.
Department of Human Services (DHS) team members have performed multiple roles over the course of the project and are confident they can successfully operate and improve the service throughout Public Beta.
Criterion 3: Agile and user-centred process
The team have continued to follow agile rituals in a fortnightly sprint cadence. During Private Beta they also established a daily process for reviewing, prioritising and actioning improvement opportunities arising from user feedback. This allowed the team to regularly release updated code throughout Private Beta.
All team members demonstrated an understanding of user needs and could clearly show how the website design has been iterated in response to them. The team meticulously logged user insights and decisions in GovDex to allow full traceability of all changes made.
Criterion 4: Understand tools and systems
The team are operating the service on the GovCMS content management system and hosting arrangement, which has been in use for over 12 months. This choice was well researched and evaluated by the service team, and has been endorsed by the DHS Chief Information Officer.
The user interface layer adopted the UI Toolkit, which was partially developed by the DTA and recommended for adoption by all government agencies. The team understood the capability of the UI Toolkit and worked with the DTA and other government departments to continue its development and, where required, to determine the additional components necessary for the DHS website. The core UI Toolkit was upgraded by the DTA to version two during the Alpha stage, and the team adapted well, adjusting their setup to align with the new standards.
The team made the necessary adjustments to the GovCMS platform to accommodate the UI Toolkit and implemented other user interface changes. The team also implemented a health check component to monitor the functional performance of the website.
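The report does not describe how the health check component works. As a rough sketch only, assuming a hypothetical `/health` endpoint that returns a JSON payload (the endpoint name and payload fields are assumptions for illustration, not the team’s actual design):

```python
"""Minimal sketch of a functional health check for a website,
assuming a hypothetical /health endpoint returning JSON."""
import json
from urllib.request import urlopen


def evaluate_health(payload: dict, max_response_ms: int = 2000) -> bool:
    """Return True if the (assumed) health payload indicates a functioning site:
    a status of "ok" and a response time within the allowed threshold."""
    return (
        payload.get("status") == "ok"
        and payload.get("response_ms", max_response_ms + 1) <= max_response_ms
    )


def check_site(base_url: str) -> bool:
    """Fetch the hypothetical health endpoint and evaluate its payload."""
    with urlopen(base_url + "/health", timeout=5) as resp:
        return evaluate_health(json.load(resp))
```

In practice such a check would run on a schedule and alert the team when `evaluate_health` returns False for consecutive polls.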
Criterion 5: Make it secure
The site provides information only and carries no confidential user data. GovCMS has undergone extensive security testing and has previously been endorsed by DHS ICT Cyber Security. The team has chosen to further enhance security by undertaking penetration testing, scheduled for the next sprint.
Criterion 6: Consistent and responsive design
The new user interface has been built with a ‘mobile first’ orientation and is fully responsive across a wide range of device and browser types. The team specifically tested the responsiveness and suitability of the user interface on different devices; for example, they made a number of changes to the tablet layout, which did not display well in initial user testing.
The user interface is consistent across the entire DHS website, and is based on sharing code, designs and approaches to reduce duplication of effort by Commonwealth departments/agencies.
Criterion 7: Use open standards and common platforms
The site is built on GovCMS, an open platform using Drupal, developed by the Department of Finance and recommended for adoption by all government agencies. The site aligns as closely as possible with the DTA design model by utilising and building on the UI Toolkit, an open-source set of tools developed by the DTA to provide a consistent online experience for users.
Criterion 8: Make source code open
The team has iterated the UI Toolkit and developed new components to suit the DHS context. The team has been proactive in sharing lessons and development options outside DHS; for example, it collaborated with the Department of Health on components specific to that department’s website.
The vast majority of what they have developed has been made open via the UI Toolkit and GovCMS to enable consumption and re-use by the broader government community.
Criterion 9: Make it accessible
As detailed in Criterion 1, extensive accessibility testing was undertaken during Private Beta. This included testing with a wide range of users with diverse needs, and took into consideration age, literacy, digital literacy, cultural background, physical and cognitive ability, and geographical location.
As a result of testing with remote and Indigenous users, for example, the team discovered issues with browser compatibility for older devices and limited network connectivity. Accessing the website was challenging for these users, as they did not have access to newer devices and the browsers they relied on were no longer technically supported by the site. The team used this understanding to optimise the website for better backward compatibility and improved support for low-bandwidth networks.
Criterion 10: Test the service
The team have undertaken extensive testing of the website, including comparing users’ ability to complete tasks on the new and existing sites.
The team created separate Alpha and Beta development and testing environments, which replicated the existing website content and allowed end-to-end user testing to be conducted. In the Beta site, the team developed the capability to record every change and replicate it across the rest of the site.
The team provided evidence of extensive usability and accessibility testing, detailed in Criteria 1 and 9.
The existing version of the website will be kept as a rollback option should there be any unexpected issues with the new service.
Criterion 11: Measure performance
The team have invested significant effort in identifying relevant KPIs to track the performance of the new website. The following metrics have been identified and will be tracked by the team from the beginning of Public Beta through to live:
- Cost per transaction. The team are working with the DTA Dashboard team to define this metric. One solution being considered is to divide the total website budget by the number of annual sessions.
- Digital take up. The team will utilise existing data from the department’s website reporting dashboard on items such as page views and sessions.
- User satisfaction. User satisfaction will be gauged through short surveys on the Beta site, as well as qualitatively through face-to-face user testing sessions and by comparing web analytics data.
- Task completion. Task completion is difficult to track in the unauthenticated space, so web analytics will be employed where possible, in addition to a question on task completion on Beta site surveys.
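The cost-per-transaction option described above amounts to simple division. A minimal sketch, using invented figures (the budget and session numbers below are illustrative assumptions, not DHS data):

```python
def cost_per_transaction(total_budget: float, annual_sessions: int) -> float:
    """One candidate definition under consideration: total website budget
    divided by the number of annual sessions."""
    if annual_sessions <= 0:
        raise ValueError("annual_sessions must be positive")
    return total_budget / annual_sessions


# Hypothetical figures: a $2m annual budget across 40 million sessions.
print(cost_per_transaction(2_000_000, 40_000_000))  # 0.05 (i.e. 5 cents per session)
```

Whatever definition is adopted, keeping it as an explicit, versioned formula makes the metric reproducible on the Performance Dashboard.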
The team will also analyse the effectiveness of the new website in reducing calls to the department by tracking and comparing a range of leading indicators, such as the volume of downloads of a given type of form, and lagging indicators, such as the volume of specific calls about that type of form. The team are also working on embedding questions about the effectiveness of the new website in call centre surveys.
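One way to operationalise that comparison is as a ratio of calls to downloads, tracked over time. The sketch below uses invented counts and a hypothetical helper name; it is an illustration of the approach, not the team’s analytics implementation:

```python
def calls_per_download(form_downloads: int, related_calls: int) -> float:
    """Ratio of the lagging indicator (calls about a form type) to the
    leading indicator (downloads of that form type)."""
    if form_downloads == 0:
        return float("inf")
    return related_calls / form_downloads


# Hypothetical before/after counts for one form type.
before = calls_per_download(form_downloads=10_000, related_calls=2_500)  # 0.25
after = calls_per_download(form_downloads=12_000, related_calls=1_800)   # 0.15
print(after < before)  # a falling ratio suggests the website is reducing calls
```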
Criterion 12: Don’t forget the non-digital experience
The team have worked with customer-facing areas to ensure they are ready to support customers using the new website. This includes the development of a staff familiarisation learning and development product and an information sheet. The website is print-optimised, so customer-facing staff can print customisable brochure versions of content on request.
Criterion 13: Encourage everyone to use the digital service
The team has developed a comprehensive communication strategy to promote awareness of the Public Beta and encourage user participation. The website serves the whole community, and the strategy will employ a variety of media and channels to target different demographics, including social media announcements, news articles and website alerts. This is in addition to ongoing promotion of the website by staff in phone and service centres. The department will also engage key stakeholders, such as Non-Government Organisations and Third Party Authorities, who work with DHS customers in the community.
Recommendations
Criterion 2: Have a multidisciplinary team
By the end of Public Beta, the team will need to develop a resource plan to fill design gaps and demonstrate they can continue to operate and improve the service when it goes live.
Criterion 3: Agile and user-centred process
As they progress to Public Beta, the team will need to review the feedback management process to ensure it is sustainable given the anticipated increase in feedback volumes.
Criterion 11: Measure performance
The team have defined metrics for user satisfaction and cost per transaction. By Public Beta, the team should be ready to report their performance on the Performance Dashboard. However, given this is the first information service of its kind to measure performance, the team will continue to work with the DTA Dashboard team to identify appropriate metrics for the remaining two: digital take-up and completion rate.