## Principles
Here are some things that CDS believes about evaluation:
- Every delivery is subject to evaluation. Evaluation is a constant in the product delivery lifecycle; it takes place in every story, every sprint and every phase.
- We evaluate to make decisions, understand our progress, and demonstrate our return on investment.
- Evaluation covers delivery, product performance, partner capacity and outcomes.
- Every delivery team does evaluation. Not all products are alike; delivery teams decide what’s best based on their needs, the service, the phase, the partner, the solution and the time available. We make hypotheses about evaluation, and iterate as we learn.
- There are people in teams who lead on evaluation, but it’s a multidisciplinary effort.
- Evaluation is a collaborative activity with partners. It’s one of the things we hand over to partners for continuous improvement.
- Plans and results for each product and team are available for CDS and the partners to view, and are linked to from a central place.
## Categories of evaluation metrics
CDS collects qualitative and quantitative metrics in 4 categories:
- Delivery: The wellbeing, functioning and productivity of a team
- Partner capacity: Partners’ ability to work with new principles, technologies and techniques
- Product performance: The performance and efficacy of the product and its features
- Outcomes: The impact of the product or service on people’s lives
## Evaluation in the delivery phases
Evaluation metrics and reporting increase in precision and frequency at each phase:

- Discovery: Exploring what the metrics should be, how to get them based on research, and existing baseline information about user needs
- Alpha: Testing and some tracking of the metrics through prototypes
- Beta: Benchmarking and reporting the metrics by releasing a minimum viable thing
- Live: Reporting the metrics and making continuous improvements to the live product
## Examples of evaluation metrics
These are some examples of measurements that could be collected during each phase for each metric category:
| Discovery | |
| --- | --- |
| Delivery | Information to understand how well we are using our resources in Discovery. |
| Capacity | Baselining information on how a partner works pre-CDS. |
| Performance | Baselining existing information on product or service performance. |
| Outcome | Information about what the public benefits might be. |
| Alpha | |
| --- | --- |
| Delivery | Information to understand how well we are using our resources in Alpha. |
| Capacity | Information on partner participation in building the service. |
| Performance | Information on how well prototypes are meeting user needs. |
| Outcome | Information on how the product or service might deliver public benefit. |
| Beta | |
| --- | --- |
| Delivery | Information to understand how well we are using our resources in Beta. |
| Capacity | Information on partner participation in building the service. |
| Performance | Connecting user needs to ongoing improvements in the functionality and interactions. |
| Outcome | Information on how the product or service might be delivering public benefit. |
| Live | |
| --- | --- |
| Delivery | Information to understand how well we are using our resources in Live. |
| Capacity | Information on partner participation in iterating the service. |
| Performance | Connecting user needs to ongoing improvements in the functionality and interactions. |
| Outcome | Information on how the product or service is delivering public benefit. |
## Responsibilities for evaluation
Here are the typical responsibilities for evaluation on a delivery team:
- Design: Joining in with setting, capturing and analysing metrics; acting on the evaluation to make design choices
- Development: Joining in with setting, capturing and analysing metrics; retrieving data generated by applications and infrastructure; acting on the evaluation to make engineering choices
- Partnerships: Joining in with setting, capturing and analysing metrics; retrieving baseline data, specifically about partner capacity
- Policy: Supporting product managers in coordinating the team’s evaluation activities; joining in with setting, capturing and analysing metrics; ensuring that evaluation is ethical, legal and aligned to other frameworks
- Product Management: Coordinating the team’s evaluation activities; joining in with setting, capturing and analysing metrics; ensuring data is being used to make decisions and engage interested parties
- Research: Generating qualitative and quantitative data through research and analysis exercises; documenting findings