
Understanding How Tools are Used in Practice:
reflections on the use of tools

This page does not present a specific tool, but rather seeks to help us think more clearly about how we are using tools, and how we might use them better. You can download a printable PDF of this information here.

This material is based on findings from a doctoral study by Daan Kolkman, in which interviews, observation and archival research were used to understand how models are used in practice. A wide range of tools is used to inform public and private sector policy making. Considering computer models specifically, the UK Treasury revealed that several hundred business-critical models are in use in the UK government alone [1].

 

How we use tools

Analysis of tools used in government in the UK and the Netherlands identified a range of topics under which we may want to consider our use of tools, and question how it might be improved.
Tools as material objects

Is the tool we are using a physical or digital object we can be 'certain' of, or is it simply an idea or framework which is more fluid and open to interpretation? If the latter, how can we ensure everyone understands the tool in the same way? If the former, how can we ensure everyone understands how the tool works? How can we avoid the tool appearing as a 'black box'?

If the tool is based on computer software, are there many versions of the tool? Are we all using the same version? Can we, as users, adapt or change the tool? If so, how can we manage these changes?
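One practical way to answer the question "are we all using the same version?" is to keep a shared register of approved versions and check local copies against it. The sketch below is only an illustration of that idea, not something proposed in the study; the register file, the model file name and the function names are all hypothetical.

```python
# Minimal sketch (illustrative only): compare a local copy of a file-based tool
# against a shared register of approved versions, using a file hash.
import hashlib
import json
from pathlib import Path

REGISTER = Path("tool_register.json")  # hypothetical shared register of approved versions


def file_hash(path: Path) -> str:
    """Return the SHA-256 hash of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def check_version(tool_path: Path) -> bool:
    """Report whether the local copy matches the registered (approved) version."""
    if not REGISTER.exists():
        print("No shared register found; cannot confirm which version is approved.")
        return False
    register = json.loads(REGISTER.read_text())
    expected = register.get(tool_path.name)
    if file_hash(tool_path) == expected:
        return True
    print(f"{tool_path.name}: local copy does not match the registered version.")
    return False


if __name__ == "__main__":
    check_version(Path("budget_model.xlsm"))  # hypothetical spreadsheet-based model
```

A register like this also gives a natural place to record who changed the tool, when, and why, which helps manage user-made adaptations.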

Dynamic tool use

How long have we been using a tool? If it has been used for a while, has the way we use it changed? Is the tool being used in the way its designer envisaged? Is any change in the way we use the tool intentional, or simply the result of our habits or other working practices? Has any change been for the better?

Social tool use

In some cases, a team of people may be making use of a tool or model. This can bring novel challenges to the tool's use. How are tasks distributed across the team? Is there a risk of knowledge or actions falling between team members? How can these risks be mitigated? There is also a plethora of roles attached to tools, for example developer, user and end-user: how do these roles differ in their perception and use of the tool? How might the structure of an organisation, and users' job roles, affect the use of the tool?

How we communicate tools

The analysis of tools used in the UK and the Netherlands also identified a range of non-technical activities that tool users engage in, mostly to help communicate the tools they use. These areas of communication give us a checklist we can use to ensure we are communicating our use of tools effectively.

Scope

We must make clear the scope and 'target' of our tools. How do we do this? How can we improve on current efforts? Can we be clearer about the purpose of the tools we use? Can we be clearer about which factors are included in our tools, and which are not?

Transparency

Explaining the content and workings of a tool is paramount if users are to trust and, most importantly, understand that tool. If it is a computational model, how does it work? If it is a framework, where did it come from? How can we make our tool more transparent? Where can we store information about the tool? How can we teach users about its workings and origins?
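One lightweight way to store this kind of information is to keep a short, structured record alongside the tool itself, covering its purpose, scope and assumptions. The sketch below is an assumption-laden illustration rather than a method from the study; the ToolRecord fields, tool name and example values are all hypothetical.

```python
# Illustrative sketch: a simple record of a tool's purpose, scope and assumptions,
# saved next to the tool so its workings and origins stay visible to users.
from dataclasses import dataclass, field, asdict
import json


@dataclass
class ToolRecord:
    name: str
    purpose: str                 # what the tool is for (its 'target')
    factors_included: list[str]  # what the tool takes into account
    factors_excluded: list[str]  # what it deliberately leaves out
    assumptions: list[str] = field(default_factory=list)
    version: str = "unknown"
    contact: str = ""


record = ToolRecord(
    name="housing_demand_model",  # hypothetical tool name
    purpose="Project regional housing demand for annual planning rounds",
    factors_included=["population growth", "household size"],
    factors_excluded=["migration shocks"],
    assumptions=["input data are refreshed yearly"],
    version="2.1",
    contact="analyst@example.gov.uk",
)

# Write the record next to the tool so anyone opening it can see scope and origins.
with open("housing_demand_model.record.json", "w") as f:
    json.dump(asdict(record), f, indent=2)
```

A record like this also supports the Scope questions above, by making explicit which factors the tool includes and which it leaves out.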

Credibility

Tools will often only be credible if they fit with knowledge and beliefs we already hold. This can be problematic if a tool challenges our assumptions. Does our tool fit with the knowledge, data or beliefs we already hold? Is this a strength or a weakness of the tool? How can we improve the credibility of a tool that challenges our beliefs? Can we point to past successes?

Presentation

Our tools need to be relevant for our colleagues. This includes making tools usable for their intended users and presenting the outcomes of a tool's use in a form that is useful to others. How can we improve the presentation of our tools?

[1] HM Treasury. Appendix D to Review of quality assurance of Government analytical models. 2013. 

This material is based on the doctoral thesis of Daan Kolkman, a PhD student based at ERIE. Contact: d.kolkman[at]surrey.ac.uk

 

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
 

Related Content

 

Our Approach: Adapt

Concepts: Adaptation
