Government service standard assessment bingo
| Service manager can’t explain the service from a user-centred perspective | Key team member has recently left and replacement doesn’t know details | Problems with using the big screen | Panel member arrives more than 30 minutes late | Parts of service wildly off-pattern with no justification why |
| --- | --- | --- | --- | --- |
| Posters on the wall are too small to read from a distance | No one understands the tech part | No performance analyst | Assisted digital model is to use the phone helpline | Content designer not invited to assessment |
| Panel is nice but report is damning | Service team contradict each other | FREE SQUARE | The team is doing agile because they do standups and use JIRA | Little explanation of what happens before and after the service |
| No user research plan | No one read the briefing document | Accessibility testing stops at WCAG AA | No comms plan | Panel member talks too much |
| Designer barely talks | User research part takes up more time than allocated | Panel spend over 15 minutes questioning the policy intent | Assessment runs over by more than an hour or sections have to be picked up after | Problem with service demo |
This isn’t meant as an indictment of service assessments. However, having been on both sides of the table in more than one department, I’ve seen these issues come up again and again. If you don’t get any of these, then congratulations to both your service team and your panel for getting things right!
For an editable version, or to copy it and make your own amendments, see my spreadsheet.