DHS Website Reform Project — Live assessment
The Website Reform Project is transforming the department’s public website, humanservices.gov.au. The project will improve the readability of content, update the overall design and interactivity, and develop writing and style standards that all future content must meet.
The website has adopted a user interface based on sharing code, designs and approaches to reduce duplication of effort by Commonwealth departments. It includes a new visual design and information architecture that better assists users to find information and complete tasks online.
This report only considers the adoption of the new interface design on humanservices.gov.au because this is the only component of the project ready to progress to Live at this stage.
Areas of good performance
The website user interface has met criteria 1 to 13 of the Digital Service Standard. This report recommends the service proceed from Beta to Live.
The team was able to demonstrate their ability to keep improving the user interface once it goes Live, based on user feedback, analytics and ongoing user research.
Criterion 1: understand user needs
Throughout Beta the team has worked to further understand user needs by undertaking extensive testing with users. This ranged from interviewing and observing users in their context as they navigate the website using their own device, to online surveys and in-page polling feedback collected from users as they interact with the website. The team has actively sought to include users with diverse needs in all research activities.
During the Public Beta stage the team received over 1,800 pieces of feedback from about 57,000 users. The team has shown how they prioritised findings into a list of work (a backlog) and iterated the website user interface in response. The team has also shown how they will continue to update and work through the backlog when the website goes Live.
The team has put a plan in place for continuing user research once the website goes Live. In-page polling will continue to highlight where the user interface is working well and where it can be improved. The research the team undertook in Discovery, Alpha and Beta highlighted areas for improvement beyond the user interface. The team has factored this into their forward planning: future user interface improvements will be made in conjunction with other elements, for example content and experience pathways.
Criterion 2: have a multidisciplinary team
The team has shown a good understanding of the roles and capabilities required to support the website user interface. The team has put a lot of effort into cross-skilling and developing team support materials, which has enabled them to overcome fluctuations in size. The core team has been involved with the service since the beginning of Discovery, and will continue to operate and iterate the website after it goes Live.
The core team continuing with the service includes the Product Manager, Service Designer, Technical Lead, Senior Developer and Delivery Manager. Support from the Graphic Design team has replaced the dedicated Visual Designer, and this new arrangement has been working well. The team has developed empathy building documents and a user research toolkit to share with new members and other teams as required.
Criterion 3: agile and user-centred process
The team has followed the service design and delivery process by launching a Private Beta followed by a Public Beta prototype.
The team has logged all testing and user feedback in a central repository that every member of the multidisciplinary team can access. The team identified and improved on three key issues during Public Beta: the relevance of search results, the suitability of icon design, and the availability of navigation hints.
The team has shown how they iterated the user interface in response to user research and usability testing. They have kept comprehensive records of this, from how needs were identified to how solutions were tested and refined. One example is the top navigational menu, which was identified as a pain point. The team has been able to show the various iterations of improving the functionality throughout Alpha and Beta as well as how they were shaped by user needs and usability testing.
The team has operated on a weekly deployment cycle, which will continue once the service goes live.
Criterion 4: understand tools and systems
The team has continued to mature their knowledge and skills in using the content management and hosting platforms for the website. The team has shown a deep understanding of how to operate, iterate and maintain the service by:
- developing system relationship documents,
- assembling instructions and scripts for managing the open source content management framework,
- customising the system to enable auto deployment, and
- being invited by other government departments to share their experience.
The team has also secured ongoing funding for establishing automated browser testing.
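As an illustration of what automated browser testing of this kind might involve, the sketch below uses Python with Selenium to smoke-test the public site. The specific checks are hypothetical assumptions for illustration, not the team’s actual test suite.

```python
# Illustrative sketch only: a minimal Selenium smoke test of the kind an
# automated browser testing setup might include. The checks below are
# hypothetical placeholders, not the team's actual tests.
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://www.humanservices.gov.au"  # public site under test


def test_homepage_renders():
    driver = webdriver.Chrome()  # any WebDriver-supported browser works
    try:
        driver.get(BASE_URL)
        # Core interface elements should be present on the page;
        # find_element raises an exception if an element is missing.
        driver.find_element(By.TAG_NAME, "header")
        driver.find_element(By.TAG_NAME, "footer")
        # The page title should identify the site (assumed check).
        assert "Human Services" in driver.title
    finally:
        driver.quit()


if __name__ == "__main__":
    test_homepage_renders()
    print("Smoke test passed")
```

A test like this would typically run in the weekly deployment pipeline, so a broken header, footer or page title blocks a release before it reaches users.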
Criterion 5: make it secure
The service the team has redesigned provides information only and carries no confidential user data. The site’s content is Unclassified, and stored on govCMS. govCMS has undergone extensive security testing, and has previously been endorsed by DHS ICT Cyber Security.
The team chose to further enhance security by undertaking independent penetration testing, which found no issues.
Criterion 6: consistent and responsive design
The team has shown how the user interface is consistent and responsive across the entire DHS website. The team has adopted an open source content management platform, and used its templates to assist in building consistent appearance and behaviour on elements such as the website header, footer, menu and log in options.
The team has shown how they tested each Beta release on a diverse range of devices, including older models that don’t support newer browsers. When they found interactions weren’t working as desired, they iterated the service to make it suitable on a broader range of browsers and devices, including older versions.
Criterion 7: use open standards and common platforms
The team has built the user interface on an open platform developed by the Department of Finance and recommended for adoption by all government agencies. The team has worked to align the user interface as closely as possible to the DTA’s user interface toolkit, an open source set of tools developed to give users a consistent online experience across government.
The team has actively sought to improve consistency of the users’ experience across government by sharing relevant research, technology, design feedback and recommendations with the DTA and other government agencies.
Criterion 8: make source code open
The team has continued to contribute the source code of the website user interface to the whole-of-government central repository, govCMS. This has enabled consumption and re-use by the broader government community. Additionally, the team has provided guidance and support to other agencies on how to apply it, both by engaging with agencies directly and by responding to questions on a cross-agency collaboration forum. The team has also developed a user research toolkit and shared it with other government departments.
Criterion 9: make it accessible
The team has shown how extensive usability testing was undertaken across a wide range of users, including those with diverse needs. This included people with low vision, physical and cognitive disabilities, low level digital skills, different cultural and linguistic backgrounds, and from regional and remote locations.
The team has shown how their service meets accessibility requirements, successfully passing testing conducted by an external, independent body. The service continues to be WCAG 2.0 AA compliant.
Criterion 10: test the service
The team has shown evidence of continued user testing of the website. Additionally, the team has conducted User Acceptance Testing internally, which included test plans for each component of the website, as well as detailed acceptance criteria.
The team has shown that they have considered different scenarios, through developing a release and implementation plan that includes a roll-back option. The roll-back option has been tested multiple times and can be completed quickly, without disrupting other services.
The team has also shown that the Beta site responds quickly. This is demonstrated by the results of testing conducted to measure how fast the site responds when accessed from various browsers.
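For illustration only, response times of this kind can be sampled with a short script. The sketch below assumes the Python requests library and a hypothetical threshold; it times the server’s response only, and does not capture full page rendering in different browsers.

```python
# Illustrative sketch only: timing how quickly the site responds, in the
# spirit of the speed testing described above. The threshold and sample
# size are assumptions for illustration, not the team's test criteria.
import statistics

import requests

URL = "https://www.humanservices.gov.au"
SAMPLES = 5
THRESHOLD_SECONDS = 2.0  # assumed acceptable response time


def measure_response_times(url, samples):
    times = []
    for _ in range(samples):
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        # elapsed measures the time between sending the request and
        # receiving the response headers.
        times.append(response.elapsed.total_seconds())
    return times


if __name__ == "__main__":
    times = measure_response_times(URL, SAMPLES)
    median = statistics.median(times)
    print(f"Median response time over {SAMPLES} requests: {median:.2f}s")
    assert median < THRESHOLD_SECONDS, "Site slower than assumed threshold"
```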
The team has shown how they are establishing automated testing for processes and platform performance. The team has renegotiated an arrangement that allows multiple sites to be set up in govCMS for different types of testing.
Criterion 11: measure performance
The team has acknowledged that this is the first information service of its kind to be assessed against the Digital Service Standard. This is because the service focuses on the user interface and navigation elements of the website rather than the content itself. Given the type of service, the team has published only two of the four minimum metrics on the DTA Performance Dashboard to date: customer satisfaction and cost per transaction. The team will continue to work closely with the DTA to determine how digital take-up and completion rate can be measured for this kind of service.
Through in-page polling, the team has been able to show an improvement in customer satisfaction when comparing the existing website with the Beta website.
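As a toy illustration of how such a comparison can be summarised, the sketch below averages in-page poll ratings for the two sites. The figures are made-up sample data, not the team’s polling results.

```python
# Illustrative sketch only: summarising in-page poll ratings to compare
# the existing site with the Beta site. The ratings below are made-up
# sample data, not the team's actual polling results.
from statistics import mean

# Hypothetical satisfaction ratings on a 1-5 scale from in-page polls.
existing_site_ratings = [3, 4, 2, 3, 4, 3, 2, 4]
beta_site_ratings = [4, 5, 4, 3, 5, 4, 4, 5]


def satisfaction_score(ratings):
    """Mean rating expressed as a percentage of the 5-point maximum."""
    return mean(ratings) / 5 * 100


existing = satisfaction_score(existing_site_ratings)
beta = satisfaction_score(beta_site_ratings)
print(f"Existing site: {existing:.0f}%  Beta site: {beta:.0f}%")
print(f"Improvement: {beta - existing:+.0f} percentage points")
```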
Criterion 12: don’t forget the non-digital experience
The team has continued to work with customer facing areas to ensure they are ready to support customers using the new website.
The team has developed support materials to help staff and users understand the website changes. For staff, they have developed a learning and development product and an information sheet that can also be shared with users. For users, the team has developed a ‘Website design changes’ guide that will be accessible from every page of the website. The guide compares the old site with the new look and feel for each component of the user interface, for example log in, homepage and icons.
The team has shown how the layout of the Contact Us page on the website has been improved to make it easier for users to find another channel if they need to.
Criterion 13: encourage everyone to use the digital service
The team has developed a comprehensive communication strategy to promote awareness of the website changes. The website serves the whole community, and the strategy employs a variety of media and channels to target different demographics. This has included social media announcements, news articles, and website alerts on every page. This is in addition to the ongoing promotion of the website by staff in phone and service centres. The department has also engaged key stakeholders such as Non-Government Organisations and Third Party Authorities who work with DHS customers in the community.
Recommendations
Criterion 11: measure performance
The team should continue to work with the DTA Dashboard team to identify appropriate metrics for measuring the performance of an information service.
Assessment against the Digital Service Standard
Criterion | Result
---|---
1 | Pass
2 | Pass
3 | Pass
4 | Pass
5 | Pass
6 | Pass
7 | Pass
8 | Pass
9 | Pass
10 | Pass
11 | Pass
12 | Pass
13 | Pass