Getting user-centred design work into agile planning increments
At the start of 2023 I wrote an article about various resources for understanding agile, user-centred design and government. This follow-up is more practical.
Over the years I've helped many designers and researchers figure out how to get user-centred design work into JIRA, Trello, or whatever agile planning tool their delivery team uses. I was lucky to get some good opportunities to work this out back when I was at HM Revenue and Customs (HMRC), thanks particularly to Steve Langstaff, then a consultant delivery manager and now doing great things at the Department for Work and Pensions (DWP). I was then able to consolidate some of this thinking through both Scrum Master and Scrum UX training at BPDTS (now part of DWP Digital). And of course, I drew on the work of Jeff Gothelf, which included not only the Scrum UX course but also the book Lean UX and the related Lean UX canvas.
Most kanban boards can accommodate user-centred design work and development work
I've seen some teams try to have separate UCD and delivery kanban (work in progress) boards. Scrum UX advises against this for good reason: it ends up creating two mini teams that don't collaborate. Better is to have one team with dual tracks of delivery and discovery (which means more emergent design and research) work. It then helps to have a means to differentiate and filter those types of work; we often had 'UCD' or similar as a label that could then be used to filter tickets on the board. This is no different from acknowledging that some work is for quality assurance testers or similar!
From what I've seen, the common kanban flow of to-do, doing, review, and done maps well onto user-centred design work. While a team may use slightly different language, it makes sense to let key members (a product manager or similar) check what's been worked on and either decide that it needs more work or let it be moved to done.
If your team has extra stages like to-do, doing, review, merge, done, just agree that some tickets may skip steps. Again, this isn't breaking agile.
While agile is about outcomes, it's also about managing workflow
Something that I see designers and researchers get caught up on is trying to force their hypotheses into ticket statements. While some methods like design sprints double down on testing hypotheses within a fixed cadence, in reality a hypothesis could take a while, or even a few sprints, to be proven or disproven; let it live somewhere else.
What a board is helpful for is giving an entire team visibility on what work is being done, and for a product manager to be able to make decisions on the right work being done at the right time.
Not all user-centred design work is the same
One of the biggest things I remind people about is that not all user-centred design work is the same.
There are a couple of great prioritisation matrices, both framed as 2x2s:
- a hypothesis prioritisation matrix by Jeff Gothelf
- and a research prioritisation framework by Jeannette Fuccella
There are three quadrants that I'd like to mention:
- ship and measure (in both versions)
- design (light and heavy - Fuccella) or test (Gothelf)
- research heavy (Fuccella)
Ship and measure
'Ship and measure' work is the type of work that is deemed low enough risk to users to just be put live. This could be content changes, or amending a flow that is based on a lot of existing patterns.
A rule I'd have with a ticket like this is that the title should be clear about what it is (for example, "change content on privacy page") and that it has clear explanations of the tasks to complete.
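As a sketch of what I mean (the field names and wording here are my own illustration, not tied to JIRA, Trello, or any particular tool), a 'ship and measure' ticket might look like:

```yaml
# Hypothetical 'ship and measure' ticket - fields are illustrative
title: Change content on privacy page
labels: [UCD, ship-and-measure]
tasks:
  - Replace the second paragraph with the agreed plain-English wording
  - Check the change against the style guide
  - After release, watch page analytics for two weeks to confirm no drop-off
```

The point is that anyone picking this up, or reviewing it, can tell what it is and when it is finished without asking the author.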
Design (light and heavy) or test
However, if something needs testing, there are a lot of discrete tasks, often with time lags that could cross sprints, which make it worth splitting up the work. A process I've often followed is to split the tickets as:
- Plan research (led by user researcher but possibly done as a team - sprint 1)
- Recruit users (led by user researcher - sprint 1)
- Do test script (led by user researcher - sprint 2)
- Do prototype (led by designers - sprint 2)
- Do user research (led by user researcher but with others possibly observing - sprint 2)
- Analyse user research (led by user researcher, possibly even as a team activity - sprint 3)
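The split above can be sketched as a set of linked tickets. This is a hedged illustration (the `depends_on` field and the structure are my own shorthand, not a feature of any specific tool):

```yaml
# Hypothetical linked research tickets across three sprints - structure is illustrative
- {title: Plan research,         lead: user-researcher, sprint: 1}
- {title: Recruit users,         lead: user-researcher, sprint: 1, depends_on: [Plan research]}
- {title: Do test script,        lead: user-researcher, sprint: 2, depends_on: [Plan research]}
- {title: Do prototype,          lead: designers,       sprint: 2}
- {title: Do user research,      lead: user-researcher, sprint: 2,
   depends_on: [Recruit users, Do test script, Do prototype]}
- {title: Analyse user research, lead: user-researcher, sprint: 3, depends_on: [Do user research]}
```

Laid out like this, the time lags become visible: recruitment has to start a sprint ahead, and the research session can't happen until script, prototype and participants are all ready.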
Within these tickets there will be definitions of done that include the various outputs (research plan, research script, prototype and so on).
Splitting these up, and even having a go at pointing them, gives the team a chance to understand whether there's enough time for the UCD team to work sustainably. It also gives the rest of the team a chance to input as required, be it supporting design work to meet technical constraints, attending user research, or watching the videos before an analysis session.
As I find these the 'bread and butter' work of UCD teams, I often create template examples of the tickets to copy and then just link them all up (connecting them via dependencies slightly breaks the 'independent' part of the INVEST acronym but does make it easy to track them all).
Research heavy
Some work is just pure user research or exploratory work. That's also fine: teams should be doing a mix of discovery and delivery work. Again, showing this as work in a team makes sure that others are aware of it and can contribute. For example, so-called discovery research could actually benefit from visual or technical concepts.
One rule of thumb I had for research or workshops: if it's more than half a day of one person's time, it should be a ticket. For example, a stakeholder workshop could definitely be one ticket and, depending on the complexity or the importance of getting things right, maybe a few (prepare, do, analyse).
Your tickets are your design briefs
I like to imagine a lot of the tickets as design briefs, particularly the design ones. Writing down what the work is supposed to be about not only makes sure that the designers have enough information to start, but can even give them a head start, since gathering that information happens as part of story refinement.
A tip on an important distinction for teams: definition of done and acceptance criteria are different.
- Definition of done is what it means for a ticket to be done in general (this may have to vary a bit for design and research work but should generally be the same)
- Acceptance criteria are what needs to have been done for this particular ticket to be done. I sometimes use them as a way to track key tasks (e.g. workshop done, journeys created in prototype, journey reviewed)
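Putting the two side by side on a single ticket makes the distinction concrete. Again this is my own illustrative sketch, not a template from any particular tool:

```yaml
# Hypothetical design ticket - field names and wording are illustrative
title: Prototype new eligibility journey
definition_of_done:      # broadly the same for every design ticket
  - Output reviewed by another designer
  - Work linked from the ticket
acceptance_criteria:     # specific to this ticket; doubles as a task tracker
  - Workshop done
  - Journeys created in prototype
  - Journey reviewed
```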
Not everything has to go into the backlog straight away
Most delivery teams I worked on tried to have a three-sprint horizon, which made sense. However, more distant work was often just put in the backlog. Personally, I never found this particularly helpful for UCD work: our work is often emergent and could get broken down in different ways depending on earlier work.
To this end, I found it easier to have a more general ideas document (usually some form of document with a table of contents, though other methods might work) and for the user researcher to have their own higher-level, easy-to-change user research plan. Exactly how you do this with your team might vary but, just as a product manager will usually have different documents for feature roadmaps and so on, it's completely fine to use different things to make sense of 'far away' work.
Working within the team tooling means being able to act as one team
It can feel stifling to have to do all the story refinement, splitting tasks up and so on. I know I definitely inwardly groaned when a business analyst insisted that future UCD work was captured in JIRA. However, aside from being far better than the terror of planning out months of work in a Gantt chart (I lived through those days and they were terrifying), it also allows everyone to act like one team. Having work visible makes it easy to ask for technical input during a stand-up, and to show in backlog refinement that there's an unrealistic amount of work. Even better, once the UCD team get into pointing and refining tickets, they can start to notice when they should be more involved or insist on less work.