Improving child care attendance reporting — Alpha assessment

The aim of this service is to simplify attendance recording and data capture for families, childcare providers and government. The service team focused on how they might improve the method of attendance recording for Long Day Care and Family Day Care providers and their users. Attendance data will be captured digitally by government for the first time. This creates opportunities to meet a range of end-user needs and improves compliance for government.

Department / Agency
Department of Education
Date of Assessment
1 September 2016
Assessment type
Digital Transformation Agency-led
Assessment stage
Alpha
Result of Assessment
Pass
Lead Assessor
Jonathan Mao
Service Manager
Peter Glynn


The Childcare Attendance team has met criteria 1 to 3 of the Digital Service Standard and shown sufficient progress towards meeting the remaining criteria. This report recommends the service proceed from Alpha to Beta stage.

The team demonstrated a strong commitment to user-centred service design throughout their Discovery and Alpha stages. User research indicates that the proposed solution could underpin both Long Day Care and Family Day Care attendance recording. The team narrowed their focus for the Beta MVP solution to Long Day Care to align with Departmental strategic objectives and deliver value to that cohort.

Areas of good performance

Criterion 1 - Understand user needs

  • The whole team participated in user research activities, resulting in strong user empathy, which was demonstrated in the in-flight check-ins and evidenced in the team’s workspace.
  • In Discovery the primary user need identified was support for some providers to better manage their attendance records. Initial prototypes that used location data on parent/carer mobile phones to speed up the attendance process did not test well. It was important that the solution also supported carers and parents and created a positive experience for them. The team identified that improvements to the attendance recording process for providers will not negatively affect families.
  • The team talked to a broad range of users, then consolidated their insights. The solution is expected to work across all providers, including Family Day Care; however, the minimum viable product focuses on Long Day Care.
  • User needs identified in the broad research will later be used to inform the service roadmap. The team’s stakeholders participated in user research, which helped to build empathy and visibility of the existing service.

Criterion 2 - Have a multi-disciplinary team

  • The team has done great work overcoming resourcing and technical challenges from Discovery through Alpha stages.
  • They demonstrated good resource planning that will underpin the sustainability of the team going into Beta and beyond their 20-week time-box.
  • The team actively participate in improving each other’s skills. There is evidence of increasing capability amongst the team gained through a process of shadowing and immersive learning. This has a positive effect on what the team is able to achieve.

Criterion 3 - Agile and user-centred process

  • The team followed the Service Design and Delivery process, and developed an end-to-end user journey map informed by their user research.
  • Following agile processes enabled the team to quickly test and iterate their prototypes and define their MVP within a short time.
  • The team’s stakeholders have developed familiarity with the agile way of working. Rapid feedback from stakeholders accelerates continuous delivery for the team.
  • The team presented their draft Service Map, articulated their minimum viable product, and provided a good outline of their plan for testing during each sprint.

Recommendations

The assessment panel makes the following recommendations:

Criterion 1 - Understand user needs

  • The team must continue to address their known user research gaps. For example, the ongoing user research plan should include providers and families in other states, people with disability, and people from diverse cultural and linguistic backgrounds. The team must consider how users with access needs will use the service.
  • The team should continue to write user stories to focus on the user needs and ensure that user stories are evident in the team’s backlog.

Criterion 2 - Have a multi-disciplinary team

  • The team will need to ensure they can onboard new team members, such as a developer, without slowing delivery progress. Sharing user empathy and understanding should be part of that process.
  • The team must continue to onboard additional technical people as they move through the Beta stage. This will be specifically relevant as the team begins to connect its data and systems.
  • A plan should be put in place to ensure the team continues to work in a multidisciplinary, co-located manner, in order to continue iterating the service beyond the 20 weeks.

Criterion 3 - Agile and user-centred process

  • The team should continue to apply their skills in agile methods, and use pairing to share those skills with new team members as they transition back to their agency.

Criterion 4 - Understand tools and systems

  • The team have not yet established an automated testing capability. They should commence research into the technical requirements for a continuous integration and continuous delivery (CI/CD) pipeline.

Criterion 5 - Make it secure

  • As the team progresses through Beta, it is important that they continue to seek specialist advice from subject matter experts in areas that affect the privacy of families. For example, specialist legal advice helped the team to design a preferred prototype using a telephone number and PIN, rather than a family name.
  • Work needs to be done to explore the data privacy and collection constraints within which the service must be delivered. The team should consult the relevant subject matter experts within their Department, and ensure the service is rigorously tested for compliance.

Criterion 6 - Consistent and responsive design

  • The team must work with responsive design methods, especially in the design of the administration section of the service. This will help the team to address one of the more complicated data entry points.
  • The team should explore using more common, open and established responsive frameworks rather than their own bespoke framework, as a bespoke framework may be harder to maintain long term.
  • Time should be invested in researching the various ways people will interact with the service and on which devices, to ensure that the behaviour is consistent with best practice.

Criterion 11 - Measure performance

  • The team must focus on measuring their service and prepare to report their key performance information and other metrics of success on the Dashboard. Baseline data the team has collected will help to rapidly validate the service’s success.

Examples of the prototype

  • Screen capture of the menu selection
  • Screen capture of the confirmation screen
  • Screen capture of the login screen
  • Screen capture of the attendance record screen

Digital Service Standard Scorecard

Assessment result for each criterion

  Criterion   Result
  1           Pass
  2           Pass
  3           Pass
  4           On track
  5           On track
  6           On track
  7           On track
  8           On track
  9           On track
  10          On track
  11          On track
  12          On track
  13          On track