Operational Excellence Framework for Schools and Trusts: Technology Effectiveness


In his continuing blog series, Andrew Blench explores the 10 domains of the 2024 Operational Excellence Framework for Schools and Trusts from ISBL and ASBO. This post focuses on Technology Effectiveness.

Good in this area is defined as:

The technology provided for use by all trust staff and pupils is fit for purpose and effective. Processes for technology support work well and meet the needs of the users. Performance metrics reflect true user experience and drive improvement. Technology users are supported by effective training and materials, especially through change. Users understand their roles in maximising the effectiveness of technology, including engaging in training and following, rather than circumventing, defined processes.

Pace of change

I happen to be old enough to remember the creation of the world wide web in the early 1990s. I recall getting our first desktop computer (yes, one between 40 of us) in my office when working in the civil service. In the early days of the internet, access was gained using a dial-up modem, a box which literally dialled a phone number to establish a connection between your PC and the world wide web. It was painfully slow and made a sound like a strangled budgie! Back then the Cloud was the thing that rain fell from. Artificial Intelligence was what we all thought of senior management!

How things have changed, and they are still changing at an ever-increasing pace. If we want to provide Operational Excellence which enables first-class teaching and learning, then we need to keep pace with, if not be ahead of, the change curve in technology.

Technology Strategy

A key measure of success in this area is stated as follows: ‘The technology provided to central-functions/schools is appropriate for their tasks, functions correctly, and is available when needed.’

The issue with this target is that it is a moving one. What is ‘appropriate for their tasks’ today most certainly will not be in 2-3 years’ time, because what we teach and how we teach it are going to change. The implementation of AI will revolutionise various tasks, processes and research methods.

This is why trusts and local authorities need to develop a 3-5 year strategy for the use of technology. This will produce a written strategy, but the key thing here is that the strategy emerges from joined-up discussions between operational colleagues and those developing and delivering the curriculum to our young people.

Nor is this an either/or choice. In my career I have encountered the view that we can focus on curriculum needs or operational needs, but not both. That’s a mistake in my view, and it leads to inefficiencies and demotivated staff.

Budget for Obsolescence

Whilst we could all moan about how ‘things are not manufactured to last’ it’s not going to change the business models of major global corporations. The fact is that desktops, laptops, projectors, interactive whiteboards and other technology all have a limited shelf life. Many trusts that I have worked with have not modelled this or budgeted for upgrades and replacements over a 3-5 year period. This has then created the ‘perfect storm’ when everything starts to run slower and need replacing at the same time.

Even if modelling the costs over a 3-5 year period produces eye-watering figures, at least it makes it possible for senior leaders to make informed decisions. If we can’t renew as we would like to, what can we do and what are our options? There is also an argument to be had around owning versus leasing technology and equipment. Which option gives us the greatest flexibility and ability to respond to change?
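The modelling described above can be sketched in a few lines. The asset counts, unit costs and lifespans below are purely illustrative assumptions, not real trust data:

```python
# Minimal sketch of a replacement-cost model for school/trust technology.
# All figures (quantities, unit costs, lifespans) are illustrative
# assumptions, not real trust data.

assets = [
    # (name, quantity, unit_cost_gbp, expected_life_years)
    ("Staff laptop", 120, 450, 4),
    ("Classroom projector", 40, 600, 5),
    ("Interactive whiteboard", 40, 1800, 6),
]

def annual_replacement_budget(assets):
    """Straight-line annualised cost: spread each asset's full
    replacement cost evenly over its expected life in years."""
    return sum(qty * cost / life for _, qty, cost, life in assets)

budget = annual_replacement_budget(assets)
print(f"Annual replacement provision: £{budget:,.0f}")
# → Annual replacement provision: £30,300
```

Even a simple straight-line model like this makes the conversation with senior leaders concrete: the figure is the annual provision needed to avoid everything falling due at once.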

Technology Functional Performance

We may have state-of-the-art equipment, but if it isn’t performing well and is subject to lengthy breakdowns we may as well not have it. So how do we measure performance, and what do our technology function’s performance metrics look like? When we operate a helpdesk function for technology, how many faults are logged and how quickly are they resolved? What is the impact of downtime, whether that’s network downtime or the downtime of a piece of kit? In my experience, the more sophisticated a piece of equipment, the more that can go wrong and the harder it is to find a fix.
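The helpdesk questions above reduce to a handful of metrics that any fault log can yield. A minimal sketch, with made-up ticket data standing in for a real helpdesk system:

```python
# Minimal sketch of helpdesk metrics from a fault log.
# Ticket timestamps are illustrative, not from a real system.
from datetime import datetime
from statistics import mean, median

tickets = [
    # (logged, resolved) — None means the fault is still open
    (datetime(2024, 9, 2, 9, 0), datetime(2024, 9, 2, 11, 30)),
    (datetime(2024, 9, 3, 8, 15), datetime(2024, 9, 5, 16, 0)),
    (datetime(2024, 9, 4, 13, 0), None),
]

# Hours taken to resolve each closed ticket
resolved = [(r - l).total_seconds() / 3600 for l, r in tickets if r]

print(f"Faults logged:    {len(tickets)}")
print(f"Faults resolved:  {len(resolved)}")
print(f"Mean time to fix: {mean(resolved):.1f} hours")
print(f"Median:           {median(resolved):.1f} hours")
```

Tracking the mean alongside the median matters: one two-day outage can drag the mean up while most faults are still being fixed within hours.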

I realise that I have only scratched the surface of a big topic here. I hope this generates some creative thinking and a focus on this vital area of operational excellence. If it doesn’t, then there is always ChatGPT!
