ABS Digital Census – Beta Assessment
The Australian Bureau of Statistics (ABS) is Australia’s national statistical agency and an official source of independent, reliable information. It tells the real story of Australia, its economy and its people by bringing life and meaning to numbers.
The Census is a mandatory data collection that occurs every 5 years. It collects key demographic, social and economic data from all people in Australia on Census night, including overseas visitors and residents of Australian external territories; only foreign diplomats are excluded. It provides a rich snapshot of the nation at a point in time and informs government, community and business.
Having a Census that the public can complete online makes the experience easier and more accessible for a significant part of the population, as people can choose when and where to do their Census, at their own pace. It also reduces costs to the taxpayer and the environment by minimising the production, distribution, collection and secondary data capture of paper forms.
Past Censuses with digital options, including the eCensus (2006 and 2011), demonstrated that a majority of the population prefers to complete the Australian Census online. Continued development and improvement of that service is therefore required.
The next Census, on 10 August 2021, will be a digital-by-choice Census, reflecting that the public can choose whether to respond electronically or on paper.
The service passed the assessment because:
The Census Digital Service team has met all 13 criteria of the Digital Service Standard. This report recommends the service proceed from private Beta to Live, and that the team consider the recommendations outlined below. Due to the nature of the Census service, a public Beta stage is not applicable.
Throughout Discovery, Alpha and Beta the team has involved a wide range of participants in ongoing user research, and has built a service that worked well during the private Beta (operational readiness test).
They have sought to make the service as easy to use as possible for the widest possible audience, while balancing the need to deliver a secure and reliable service that maintains good data consistency with previous Censuses.
The team has a clear remit, is working collaboratively in an agile environment, and regularly reflects on how to continue to manage and grow the service.
Criterion 1: Understand user needs
The team has worked hard towards developing a deep understanding of its user needs and iterating the service accordingly through the Beta stage. While user research paused temporarily during the early days of the COVID-19 pandemic, the team rapidly shifted its processes to remote research so that it could continue to develop its understanding of user needs.
User research was conducted with a broad range of users, including people from Culturally and Linguistically Diverse (CALD) backgrounds, people with a range of access needs, and households of different compositions.
The team has undertaken regular usability testing sessions. At the time of this assessment, the team had conducted over 23 usability testing rounds with more than 420 participants, across 16 Australian cities and regional areas, including 2 accessibility-specific rounds.
The team has made multiple improvements based on user research and testing. Some examples include changes to login pages, logout buttons, and pre-filled information (such as language) based on prior selection. In another example, they changed the login flow for when a person may not yet have a Census number, so that the individual could still commence their Census, understanding that sometimes the planned steps of a service do not occur in the typical order.
Criterion 2: Have a multi-disciplinary team
The team successfully established and maintained a multi-disciplinary team to continue the design and delivery of the Digital Census Service as it moved through Beta. The team did not identify any capability gaps and spoke to its ability to ramp up particular capabilities as needed in response to what it was learning.
The team also worked more broadly across the organisation to ensure that the Digital Census Service was being delivered in an integrated way with the rest of the Census program of work.
It was clear there was strong cohesion within the team, and that they worked hard to maintain this during COVID lockdowns and increased remote working arrangements. The team spoke to the importance of the shared values they established when they started the service transformation, which enabled them to ensure new starters were onboarded with a clear understanding of those values.
Criterion 3: Agile and user-centred process
The team has continued to work in an agile and user-centred way as it has progressed through the Beta stage. While the team has shifted from a strict sprint cadence, it has continued to embrace the principles of agile by maintaining regular communication and agile ceremonies. This includes regular stand-ups and retrospectives, as well as maintaining a backlog and continuing to plan and prioritise around sprints. The change in approach is a result of the team reflecting on its ways of working and refining them to better support its needs.
The teams are working in a self-managed and autonomous way, which enables them to quickly address issues as they arise and to continue working through the user stories defined in the backlog. The team shared its well-developed user stories with the assessors and demonstrated that each user story was self-contained and had all the information required to ensure it was well understood and had a clear ‘definition of done’. Each story included elements such as the story itself, the story workflow, acceptance criteria, sign-off by the Product Manager, and any relevant updates such as content or design changes.
The team maintains 2 instances of JIRA between the ABS and its delivery partner. While this is not ideal, they evaluated moving to one shared instance of JIRA, but the effort involved would have outweighed the benefit for their stage of development. Instead, the team has worked hard to establish other processes to ensure there is no disconnect between the two environments, and that stories remain intact as they move through the workflow. The team has shared access to Confluence, which aids their communication.
The team has also ensured they remain connected with the broader Census service, enabling them to deliver a well-connected service that is designed and delivered in context of other aspects of the wider service. They are collaborating with other agencies who are providing call centre services as part of the Census and are ensuring what they learned through the private Beta (operational readiness test) flows into other parts of the service too.
Feedback obtained through the private Beta is triaged and added to the backlog. The team discussed its ability to quickly respond to feedback and defects; in one example, a defect was resolved and a fix deployed within 2 hours.
Criterion 4: Understand tools and systems
The team has designed and delivered a scalable solution using cloud-based services where appropriate. This ensures the required level of performance can be achieved, including under unusual or greater-than-expected load.
The solution is supported by deployment and testing pipelines which feed into the ongoing development of the Census. Supply chain security has been and continues to be considered as part of development practices.
The controls for the technical environments are appropriate to the service, with the Bureau of Statistics maintaining ownership of these environments and the support for them. While the cloud environment is supported by delivery partners, this support is within organisational boundaries.
Criterion 5: Make it secure
The team and the organisational stakeholders more broadly have demonstrated an extensive commitment to security and privacy. Change control procedures are well established and design documents have been reviewed by many government agencies as well as key technology partners.
The team has undertaken simulation of failures including “game days” in line with the Amazon Web Services (AWS) Well-Architected Framework to ensure effective and efficient response to incidents.
Supply chain security of software artefacts has been considered. Developers undergo secure development training, which is accompanied by static analysis of commits and open-source components. Static analysis is supplemented by automated vulnerability scanning of the testing version and regular reviews by security operations staff.
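As an illustration of the kind of automated supply chain gate this implies, the sketch below blocks a deploy when a dependency vulnerability scan finds issues. It assumes a Python service and the open-source pip-audit tool; the report does not name the team’s actual scanning products.

```python
# Minimal sketch of a CI supply-chain gate, assuming a Python service and the
# open-source pip-audit tool (hypothetical here; not the team's named tooling).
import subprocess
import sys


def audit_dependencies(requirements_file: str = "requirements.txt") -> None:
    """Fail the build if any pinned dependency has a known vulnerability."""
    result = subprocess.run(
        ["pip-audit", "--requirement", requirements_file],
        capture_output=True,
        text=True,
    )
    # pip-audit exits non-zero when vulnerabilities (or errors) are found.
    if result.returncode != 0:
        print(result.stdout, file=sys.stderr)
        raise SystemExit("Known vulnerabilities found; blocking the deploy.")


if __name__ == "__main__":
    audit_dependencies()
```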
End user authentication is a core part of the user journey and has received significant investment. Similarly, the transfer of data out of the cloud environment has been designed with high security demands in mind. Operational needs have been balanced with privacy concerns by redacting transactional logs at the earliest possible point.
Criterion 6: Consistent and responsive design
The team has taken a ‘mobile first’ approach to their design of the service. The design is based upon the ABS Form design standards, 2021 Census Visual Style Guide, and the Australian Government Design System to provide a consistent user experience that matches other Australian Government websites and services.
The design also draws on the GOV.UK and United States government design systems and their online Census services, to provide an updated and more intuitive user experience informed by international user research on services of this kind.
Usability testing with participants and crowd-sourced testing has included a range of mobile devices. Large-scale testing has also provided a baseline of the range of devices people use to complete a Census online. This has helped inform the team about which browsers and devices will be supported for the 2021 Census. Consideration has also been given to how to handle the rare devices that sit outside the 95th percentile.
Testing has included response and performance under load in a variety of locations and conditions to maximise speed where possible. At the time of the assessment, most pages were loading in 1.5 seconds or under, even at low bandwidth.
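As an illustration of the kind of load-time check this implies, the sketch below measures full response time against a budget taken from the 1.5 second figure reported above. The URL is a placeholder, not a real Census endpoint.

```python
# Minimal sketch of a response-time budget check; the URL is hypothetical and
# the 1.5 second budget comes from the figure reported in this assessment.
import time

import requests

BUDGET_SECONDS = 1.5


def check_load_time(url: str, budget: float = BUDGET_SECONDS) -> float:
    """Fetch a page and assert the full response arrives within the budget."""
    start = time.monotonic()
    response = requests.get(url, timeout=10)
    elapsed = time.monotonic() - start
    response.raise_for_status()
    assert elapsed <= budget, f"{url} took {elapsed:.2f}s (budget {budget:.1f}s)"
    return elapsed
```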
Criterion 7: Use open standards and common platforms
The team has considered the use of common government platforms where appropriate. The Australian Government Design System has been used and extended. Other government authentication solutions were considered, but a variety of technical and policy issues were identified that ultimately made them unsuitable for use.
The team also reached out more broadly, sharing lessons with overseas government online Census teams. This included discussing access pathways without pre-created credentials and how that affected take-up in other countries during the height of the pandemic.
The solution is based on Open Web Platform technologies including consideration of Progressive Web Applications and Responsive Design.
The outputs of the Census process will be released as open data. This has been long-standing practice for the ABS.
Criterion 8: Make source code open
The team is not yet at the point of contributing or providing newly developed work as open source. Where backend systems are re-used between the Census and other corporate systems, it may not be appropriate to release this information for security reasons, especially while the service is in active use. However, there are some novel solutions within the service that may be able to be shared and re-used within government.
Criterion 9: Make it accessible
Accessibility is a key goal in the 2021 Census Design Strategy, with WCAG 2.0 Level AA as the baseline while also testing against WCAG 2.1 Level AA. Along with the activities in the Accessibility Action Plan created by the team, usability testing helped the team to better understand how to ensure the service is accessible to a wide range of users. Automated accessibility testing is also part of the development cycle.
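As an illustration, automated accessibility checks of this kind can be wired into a development cycle roughly as sketched below. This assumes the open-source pa11y CLI and a hypothetical page URL; the report does not name the team’s actual automated tooling.

```python
# Minimal sketch of an automated WCAG 2.1 AA check in a test run, assuming the
# open-source pa11y CLI is installed; the team's actual tooling is not named.
import subprocess


def run_accessibility_check(url: str) -> None:
    """Fail the build if pa11y reports any WCAG 2.1 AA issues for the page."""
    result = subprocess.run(
        ["pa11y", "--standard", "WCAG2AA", url],
        capture_output=True,
        text=True,
    )
    # pa11y exits non-zero when accessibility issues are found.
    if result.returncode != 0:
        raise AssertionError(f"Accessibility issues found for {url}:\n{result.stdout}")


if __name__ == "__main__":
    run_accessibility_check("https://census.example.gov.au/")  # placeholder URL
```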
The team worked with an accessibility partner, who conducted 2 accessibility audits in September 2020 on the ABS Self-service and Contact Us Forms, and the Census Digital Form. The audit was conducted against WCAG 2.1 Level A and Level AA, which included testing on a range of assistive technologies, web browsers and user agents.
The team has also been conducting user research with people with a range of access needs, including low vision, blindness, dyslexia and other forms of neurodiversity. Participants with diverse access needs were included in all user research rounds.
In February and March 2021, 2 accessibility-specific usability testing rounds were conducted on the website, chatbot, the login process and the Census Digital Form. This also included assistive technologies such as JAWS, NVDA and Dragon NaturallySpeaking.
Criterion 10: Test the service
The team has demonstrated an extensive testing strategy focused on technical excellence and user feedback.
An authenticated private test site was made available for accessibility testing before public access was enabled.
The team routinely tests across the software development lifecycle, including cross-browser automated testing, functional testing (post-deploy, daily, weekly), penetration testing, integration testing with ABS systems, and smoke/availability testing of the production environment.
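A smoke/availability test of the kind described can be as small as the sketch below; the URLs and expected content are placeholders, not the real Census endpoints.

```python
# Minimal sketch of a smoke/availability check against key pages; the URLs and
# expected content below are placeholders, not the real Census endpoints.
import requests

SMOKE_PAGES = {
    "landing": "https://census.example.gov.au/",
    "login": "https://census.example.gov.au/login",
}


def smoke_test() -> None:
    """Confirm each key page responds and serves the expected content."""
    for name, url in SMOKE_PAGES.items():
        response = requests.get(url, timeout=10)
        assert response.status_code == 200, f"{name}: HTTP {response.status_code}"
        assert "Census" in response.text, f"{name}: expected content missing"


if __name__ == "__main__":
    smoke_test()
```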
The focus on scalability is evident throughout their performance testing. The team has utilised 2 approaches: an internal test harness with an Application Programming Interface (API) load-testing tool, and an external test tool driving the front end to ensure user experience performance is maintained.
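For the API load-testing side, the sketch below shows the general shape of such a test using the open-source Locust tool; the endpoints and traffic mix are hypothetical, and the report does not identify the team’s internal harness.

```python
# Minimal sketch of an API load test using the open-source Locust tool; the
# endpoints and traffic weights are hypothetical examples.
from locust import HttpUser, task, between


class CensusRespondent(HttpUser):
    # Simulate think time between form pages.
    wait_time = between(1, 5)

    @task
    def load_landing_page(self):
        self.client.get("/")

    @task(3)  # weight form submissions more heavily, as real traffic would
    def submit_form_page(self):
        self.client.post("/form/page", json={"answer": "example"})
```

Such a file would be run with `locust -f locustfile.py --host <base-url>` to ramp up simulated respondents against a test environment.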
The team successfully executed a large-scale public test, similar to what would be used in a private Beta, demonstrating that people can successfully use the service at scale.
Criterion 11: Measure performance
During the private Beta, the team gathered feedback on the service using various methods of collection. This included analytics about how the service was being used, as well as qualitative feedback from an end-of-form survey and feedback obtained through other channels supporting the service. Feedback was prioritised via a triaging process and added to the backlog for actioning.
The team demonstrated the metrics they plan to monitor during the Census, which will help them identify how the service is operating and whether users are able to successfully complete the Census. The team will be looking for trends, using natural language processing for sentiment analysis, and examining how the feedback aligns with what they are seeing in the analytics. The team explained that each channel of the service is empowered to resolve issues, with the ultimate goal of ensuring that users are not blocked.
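As a sketch of what the sentiment-analysis step could look like, the snippet below scores free-text feedback and surfaces the most negative comments first for manual review. It assumes NLTK’s VADER analyser; the team’s actual NLP tooling is not named in the report.

```python
# Minimal sketch of sentiment triage over free-text feedback, assuming NLTK's
# VADER analyser (an assumption; the report does not name the team's tooling).
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download


def triage_feedback(comments: list[str]) -> list[tuple[str, float]]:
    """Score each comment and return them sorted most-negative first."""
    analyser = SentimentIntensityAnalyzer()
    scored = [(c, analyser.polarity_scores(c)["compound"]) for c in comments]
    return sorted(scored, key=lambda item: item[1])


if __name__ == "__main__":
    for comment, score in triage_feedback(
        ["The login page would not load.", "Quick and easy to complete."]
    ):
        print(f"{score:+.2f}  {comment}")
```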
The team spoke to the process being established whereby feedback will be analysed and highlighted themes brought to the daily stand-up. This stand-up will also include other teams supporting the Census service.
The team was also able to speak to their relationship with the contact centre, and how they have been able to update call scripts based on feedback and learnings.
Criterion 12: Don’t forget the non-digital experience
The team has been thorough in exploring end-to-end journeys and the variety of pathways for people to complete the Census, not just via digital channels. They did this by checking each known archetype to determine whether there were any gaps in the service.
The team trialled pop-up hubs during their large-scale testing, which helped support offline channels.
The team has also been iterating and expanding its contact centre scripts based on testing and research results. Braille and large print versions of the form are also available through the contact centre.
The team has also undertaken work to connect records in the backend for people who start their Census online but need to shift offline to complete it.
Criterion 13: Encourage everyone to use the digital service
All communications sent to households will refer to the digital option and encourage people to use it.
The team is preparing videos to build confidence among people who may be new to the digital Census or completing the Census for the first time.
There are several inclusive strategies which nudge a shift to the digital option, where appropriate.
Recommendations
- Add meaningful, explanatory error screens where a user’s access is blocked at the network level (including from Tor or VPNs when implementing ISM Security Control 1627) or where JavaScript advertisement blocking prevents functionality at the application level.
- For non-supported devices, provide a meaningful and clear explanation of why the device cannot be supported.
- Monitor the performance of dependent cloud services, including comprehensively capturing baseline performance levels before the event, so that potential issues outside the team’s direct control are identified early.
- Pre-establish relationships with the Network Operations Centres of the major Australian telecommunications providers to escalate large-scale issues with internet or SMS traffic. This may be via the ABS’ telecommunications vendors or via Government Relations staff at the providers.
- Conduct further security “game day” simulations. Scenarios to consider include monitoring/alerting being impaired while fraudulent activity is increasing, and rapid credential rotation being required after human error causes a compromise (a minimal sketch of the rotation step follows this list). These should help test the Incident Response and Disaster Recovery processes identified in the Security Risk Management Plan/System Security Plan.
- Contribute identified missing components and usability findings back to the Design System, so that it can be improved for others.
- Look for opportunities to open-source code for other government users in the cloud computing space. Include components that have been decommissioned or will be decommissioned soon to reduce security concerns.
- Develop strategies and tools for synthesising large amounts of unstructured feedback collected across a variety of channels.
- Continue doing regular user research and accessibility testing to ensure that a range of user needs are considered.
- Continue to review and refine metrics implemented to monitor the service, focusing on the outcomes of the service and what metrics help the team understand how well those outcomes are being met.
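As a minimal sketch of the rapid credential-rotation scenario flagged in the game-day recommendation above, the snippet below assumes AWS IAM access keys managed via boto3. It is illustrative only, not the team’s documented incident-response procedure.

```python
# Minimal sketch of rotating a compromised AWS IAM access key via boto3; an
# illustrative assumption, not the team's documented procedure.
import boto3


def rotate_access_key(user_name: str, compromised_key_id: str) -> str:
    """Create a replacement key, disable the compromised one, then delete it."""
    iam = boto3.client("iam")
    new_key = iam.create_access_key(UserName=user_name)["AccessKey"]
    # Distribute new_key to the consuming service before disabling the old key.
    iam.update_access_key(
        UserName=user_name, AccessKeyId=compromised_key_id, Status="Inactive"
    )
    iam.delete_access_key(UserName=user_name, AccessKeyId=compromised_key_id)
    return new_key["AccessKeyId"]
```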
Assessment against the Digital Service Standard
| Criterion | Result |
|-----------|--------|
| 1 | Pass |
| 2 | Pass |
| 3 | Pass |
| 4 | Pass |
| 5 | Pass |
| 6 | Pass |
| 7 | Pass |
| 8 | Pass |
| 9 | Pass |
| 10 | Pass |
| 11 | Pass |
| 12 | Pass |
| 13 | Pass |