Software Delivery Assessment by Conflux
The success of a modern software delivery effort depends on a broad range of cultural, technical and procedural factors. Efforts to identify these factors began almost as soon as computers were invented. The field has arguably come more than half way towards understanding how to do this well, but the answer is not getting any shorter: additional factors continue to be discovered and refined in remarkable detail.
How can a leader ensure that all the factors needed are present in an organisation?
Techniques such as Squad Healthcheck allow a team to autonomously assess the completeness of its own learning journey, and allow managers to offer support where teams have issues in common. But while Squad Healthcheck has proven popular and scalable, it offers only a broad picture of your teams' situation, with limited depth.
Matthew Skelton, founder and Head of Consulting at Conflux and co-author of Team Topologies, created the multi-team Software Delivery Assessment technique at Conflux to help organisations develop and evolve the capability to assess their own software delivery in considerable depth. The deck covers ten themes and the success factors within each, helping managers support a culture of good software delivery while respecting their teams' autonomy and differences.
Matthew Skelton introduces the Multi-Team Software Delivery Assessment and answers questions
When to use the technique
The technique is easy to apply and its results are simple to analyse, so it can be run as often as needed. Many teams value quarterly sessions covering all the themes.
We believe the technique is particularly useful at times of change: after staff turnover or redundancies, when teams have been drastically reconfigured, or in startups and new departments where teams are newly formed.
In short: at any time you have reason to believe that sound cultural, technical and procedural factors have been eroded.
Themes and Cards
The assessment covers ten themes in total:
1. Team Health
2. Deployment
3. Flow
4. Continuous Delivery
5. Operability
6. Testing and Testability
7. Reliability and SRE
8. On-call
9. Security and Securability
10. Team Topologies team interactions
We have taken these themes and expressed them as a card deck, with each suit representing a software delivery theme. The individual success factors within each theme are distilled onto the deck's 106 cards, each carrying a goal and a description of healthy and unhealthy practices.
We also offer a set of five-point Emoji Cards as part of our single-player set of Virtual Agile Ritual Cards. Individuals or teams can use these voting cards to give a clear answer quickly.
How to run the software delivery assessment on video calls
The assessment involves a conversation followed by a planning-poker-style reveal of an emoji card. For sessions carried out on video calls, we recommend reserving screen space for participants holding up their cards. We believe this helps drive good-quality conversations, which are a vital part of the process, leading to the right discussions and, in turn, the right results.
Setting up
- Find a facilitator and book your meeting.
- Send out a deck of the Software Delivery Assessment cards to each team member. This will enable every team member to get a good understanding of the assessment and the success factors within each theme.
- Keep a stack of index cards handy.
- Plan ahead to ease the rollout: ensure a shadow facilitator is present to witness the facilitation and learn to run sessions on their own.
- Use our assessment worksheets for easy recording and calculation of scores during the session, or print out the sheets to keep your screen focused on the participants and the discussions.
- Obtain the details of each assessment criterion from the Software Delivery Assessment card deck.
Data recording
You can either print out the assessment sheets to record the scores, or use our assessment worksheets for each set of criteria, which make it easy to calculate scores for individual teams and to analyse scores across multiple teams.
How to use the assessment worksheet
- Enter the number of participants in the session.
- As participants hold up their emoji votes, count the number of votes per emoji and enter that number under the respective emoji for the theme being discussed.
- The average score against each factor is calculated automatically, with checks in place to ensure the accuracy of the data captured.
Once you input your teams' scores, the detailed and summary views in separate tabs are auto-populated, instantly giving you average scores for each factor and across multiple teams.
Summary view of average scores across multiple teams
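If you prefer a script to a spreadsheet, the calculation behind the worksheet is simple to reproduce. The sketch below is a minimal illustration rather than the worksheet itself: it assumes each factor's votes are recorded as a five-element tally (one count per score from 1 to 5), checks each tally against the participant count, and prints the average per factor. The factor names and data layout are hypothetical.

```python
# Minimal sketch of a worksheet-style calculation (hypothetical data
# layout, not Conflux's actual worksheet). tally[i] holds the number of
# votes for score i+1, so scores run 1..5.
from typing import Dict, List

def factor_average(tally: List[int], participants: int) -> float:
    """Average score for one factor, validating the vote count first."""
    if sum(tally) != participants:
        raise ValueError(f"expected {participants} votes, got {sum(tally)}")
    total = sum(score * count for score, count in zip(range(1, 6), tally))
    return total / participants

# Example: 7 participants voting on two (hypothetical) factors.
votes: Dict[str, List[int]] = {
    "Team autonomy": [0, 1, 2, 3, 1],  # one 2, two 3s, three 4s, one 5
    "Flow of work":  [1, 2, 3, 1, 0],
}
for factor, tally in votes.items():
    print(f"{factor}: {factor_average(tally, participants=7):.2f}")
```

The vote-count check plays the same role as the worksheet's built-in accuracy checks: a mismatch between votes and participants is flagged before any average is computed.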
Playing the cards
- As you start the session, ensure all players are visible on screen as they share their votes.
- Take the suits one at a time.
- Read the text on the card or encourage players to take turns to read the text.
- Team members rate each criterion by holding up one of the five-point cards: BAD (1 or 2), MEH (3) or GOOD (4 or 5), based on the Tired and Inspired guidelines.
- Write down the team scores and notes on the assessment sheets.
- Take screenshots of participants on the video call holding up their scores against each criterion.
- Keep the team on schedule by asking for some discussions to be held outside the session. Note these down on the index cards.
- Get feedback from the team on the VALUE and EXECUTION of the assessment session; Emoji Cards are sufficient for this.
An advantage of remote sessions
With teams working remotely, one way to capture scores is for team members to hold up their cards on a video call. The facilitator can take a screenshot of the players holding up their cards and paste it into a document; scores can then be calculated after the session.
This approach has a significant advantage: it keeps the session moving and helps it complete within the suggested 2 hours. Counting everyone's scores during a session can be surprisingly time-consuming, but on video conference calls a screenshot becomes a clear record which can be processed later to calculate overall scores.
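As a hedged sketch of that deferred processing, suppose the facilitator transcribes the raw 1-to-5 card values from each screenshot into a simple per-team, per-factor structure (the team names, factor names and layout here are illustrative, not part of the technique). Per-team averages and a cross-team summary then each take one pass:

```python
# Illustrative post-session tallying from screenshot transcriptions.
# raw_votes maps team -> factor -> list of 1-5 card values (hypothetical).
from statistics import mean

raw_votes = {
    "Team A": {"Deployment": [4, 4, 3, 5], "On-call": [2, 3, 3, 2]},
    "Team B": {"Deployment": [3, 3, 4, 4], "On-call": [4, 4, 5, 3]},
}

# Per-team average for each factor.
for team, factors in raw_votes.items():
    for factor, scores in factors.items():
        print(f"{team} / {factor}: {mean(scores):.2f}")

# Cross-team summary: pool every team's votes per factor.
all_factors = sorted({f for factors in raw_votes.values() for f in factors})
for factor in all_factors:
    combined = [s for factors in raw_votes.values() for s in factors.get(factor, [])]
    print(f"All teams / {factor}: {mean(combined):.2f}")
```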
Where it came from
The Software Delivery Assessment distils thousands of pages of knowledge from authors including Henrik Kniberg, Mirco Hering, Nicole Forsgren, Jez Humble, Gene Kim, Don Reinertsen, Dave Farley, Alex Moore, Rob Thatcher, Lisa Crispin, Janet Gregory, Steve Freeman, Nat Pryce, Michael Feathers, Ash Winter and Rob Meaney. Matthew drew on his years of consulting experience and insight to identify the key elements from all this reading and condense them into a concise card deck. The results have been tested successfully at enterprises around the world by Conflux Digital.