Understanding the true user experience can't be done once; it's a continuous journey. BPDTS's Tristan Thorpe is driving change in the IT Service Management arena, championing user experience measurement through XLAs in the Department for Work and Pensions.
As an industry, IT service management has been relatively poor at considering the user experience beyond traditional technical health metrics such as incident resolution time or system uptime. Service level agreements (SLAs) only measure the quality of a service from the IT side. For example, if a system is up and working, and all reported faults are fixed within five working days, the service dashboard will show green status across the board.
However, that dashboard doesn't shed any light on what's happening inside a service where the users engage.
A new way to measure service excellence
ITIL has started to focus on user experience, or XLAs (eXperience Level Agreements), to bridge the gap. An XLA sets out a provider's commitment to users on the minimum experience they should expect to receive as part of a service or product. Using SLAs and XLAs together gives organisations a more holistic way to measure service excellence.
Is your service a melon?
Watermelons are green on the outside and red in the middle. Focusing solely on traditional health metrics or SLAs can indicate the skin of your service is green. But what if, when you cut that service open, user satisfaction is in the red? XLAs help organisations surface and measure that poor user experience.
Here's a hypothetical example of how the gap between service level KPIs and the actual user experience leads to an unsatisfactory outcome.
The DWP has a Bereavement section within Pensions. From an IT Service Management view, it's easy to tell when that system is up and running. The service managers might also be aware of a glitch that slows the system down. If the system is slow to load, or that glitch causes a screen to freeze, the agent could lose the information the citizen has shared. What happens next? The agent must ask the bereaved individual to repeat the information. Does that sound like a good experience? No, it's upsetting for both users: the citizen and the agent.
In the example, the service dashboard showed a green status against the KPIs being measured. Everything looked good on the IT side, while the actual user experience inside the shell told a different story.
XLAs help mind the gap
XLAs reveal the gap between what's being measured and what users actually experience by identifying the areas of a service that aren't working. They enable organisations to look at the parts of a service users don't like even when, according to the SLAs in place, those parts are operating well. They provide the framework to understand the full picture: both the shell and the inside. There are also backend metrics and measuring methods that put more meat on the bone, for example how to calculate an experience score while accounting for subjectivity.
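As a minimal sketch of what 'accounting for subjectivity' could look like (the 1-10 scale, the trim fraction and the function name are illustrative assumptions, not a method DWP or BPDTS has adopted), a trimmed mean of survey ratings stops one extreme respondent from skewing the overall score:

```python
def experience_score(ratings, trim_fraction=0.2):
    """Aggregate 1-10 survey ratings into a single experience score.

    A trimmed mean dampens the effect of extreme outliers (one furious or
    one delighted respondent), which is one simple way of allowing for the
    subjectivity in small feedback samples.
    """
    if not ratings:
        return None  # no feedback yet; better than reporting a misleading number
    ordered = sorted(ratings)
    k = int(len(ordered) * trim_fraction)
    trimmed = ordered[k:len(ordered) - k] or ordered
    return round(sum(trimmed) / len(trimmed), 1)

# Mostly unhappy users plus one very generous outlier:
print(experience_score([3, 4, 4, 5, 3, 4, 10]))  # 4.0 rather than the raw mean of ~4.7
```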
XLAs and SLAs – better together
XLAs don't replace traditional SLAs. Instead, XLAs take those valid SLA health indicators, such as 'fixed in 5 days' or 'achieved 99% uptime', and add user context to give a more comprehensive view of the service being provided. XLAs bring the service health indicators together with the user experience and sentiment. As a result, service management professionals and the business can understand what's happening between system performance and the experience the user expects.
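To make the 'better together' idea concrete, here's a small hypothetical sketch; the class, field names and the target score of 8 are assumptions for illustration only. It simply pairs an SLA pass/fail flag with an aggregated experience score and flags a watermelon when the two disagree:

```python
from dataclasses import dataclass

@dataclass
class ServiceView:
    name: str
    sla_met: bool       # e.g. 'fixed in 5 days' and '99% uptime' achieved
    xla_score: float    # aggregated experience score on a 0-10 scale
    xla_target: float = 8.0

    def status(self) -> str:
        if self.sla_met and self.xla_score < self.xla_target:
            return "watermelon: SLAs green, experience in the red"
        if self.sla_met:
            return "healthy: SLAs met and experience on target"
        return "SLA breach: investigate alongside the experience data"

print(ServiceView("Bereavement service", sla_met=True, xla_score=4.0).status())
# -> watermelon: SLAs green, experience in the red
```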
New certifications and training
The introduction of XLAs is driving demand to create an eXperience Management Office (XMO). Training provider Bright Horse and ITSMF are collaborating to reinforce the industry's move to a more user-centric approach. Bright Horse has pulled together a set of industry courses with certificates around XLAs, while ITSMF provides member seminars to drive ITIL 4 XLA awareness.
In August 2020, I was asked to attend Bright Horse's new training courses, including the XMO course, for which I sat the exam and received the XMO Master certification. At the time, I was the first person in the UK, outside of the training organisation, to sit and pass the exam.
Bright Horse's first certificate, the Essence of Experience, provided an overview of what XLAs are; the certificate demonstrates that a participant has a proper understanding of what you're trying to achieve through an XLA. The second, the XMO Master, is a new certification that demonstrates a person has the relevant prerequisite knowledge to set up an Experience Management Office on behalf of an organisation.
Creating an XMO
The XMO is the part of the organisation that manages how experience is measured and quantified, gathers feedback from users, tracks how the organisation is trending, and converts that data and insight into something the business can use.
The XMO Master feeds the user experience information back into the business, saying 'This is how your consumers feel about the products and services you're offering, and this is what you might want to do about it'. Where the XLA and XMO come together is in translating what a score means.
Championing the user in IT Service Management
The key thing I've taken away from becoming an accredited XMO Master is that it's no longer good enough to measure only the KPIs around the shell. IT Service Management needs to be better positioned to identify the gap between what the SLAs say is good and the actual experience from the end user's point of view. Only then will an organisation have an accurate picture of what's happening inside its services.
At BPDTS, I'm laying the groundwork to improve service excellence for the Department for Work and Pensions. What I'm doing isn't new; Agile and DevOps are already engaging in iterative feedback in other technology areas, ploughing the user experience back into product development. However, it's a new approach to traditional service management.
Making a case for XLAs
To start, I initiated an awareness session with BPDTS's Senior Leadership Team (SLT) to focus on service excellence. In parallel, I set up the same sort of awareness session with one of the DWP Digital product groups. The aim is to mobilise the framework internally so that, if either BPDTS or DWP agrees XLAs are a missing piece of the puzzle, we can move on it by engaging the operations teams that work outside of DWP Digital.
One of the things I've done to get it off the ground is to identify a simple platform we're already using and create a set of pilot questions to test with a small user base. It's a simplistic approach, but the pilot has already generated data we can use to create a framework that aligns our traditional SLA health indicators with the experience users expect, grounded in the experience they actually report.
Measuring the impact of change
To make change happen, an organisation needs useful data. But if that data is pointing at the wrong thing, it won't be able to change in a meaningful direction. A critical aspect is understanding the impact of a change: how will you know that what you've done has moved the needle on the quality of the user experience? If you learn which 10 things made people unhappy, and you put changes in place to correct or improve those 10 areas, you can then frame your questions in the following period to understand the impact.
Asking your users for feedback after those changes measures their impact on the user experience. Your narrative can then be reshaped into a 'You said, we did' format: 'You said X; here's what we've done. Has it changed your view of things?' The responses provide the scale to measure whether what you've done has improved the service: is it better, the same or still terrible? Is it getting better but needs more work? Or is it great and you've fixed the problem?
Depending on the answers, the organisation can measure where it is relative to where it was, based on the user experience. The process continues iteratively and can evolve at a pace commensurate with where the user experience goes. It's what keeps a service relevant over time.
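As a rough illustration of that 'you said, we did' loop (the themes, scores and wording are all invented for the example), comparing average per-theme scores from the previous survey period with the current one shows whether each change has moved the needle:

```python
def theme_averages(responses):
    """responses: list of (theme, score) pairs, scores on a 1-10 scale."""
    totals = {}
    for theme, score in responses:
        totals.setdefault(theme, []).append(score)
    return {theme: sum(scores) / len(scores) for theme, scores in totals.items()}

# Scores before and after the changes (figures invented for the sketch)
before = theme_averages([("speed", 3), ("speed", 4), ("clarity", 6)])
after = theme_averages([("speed", 6), ("speed", 7), ("clarity", 6)])

for theme, new_score in after.items():
    delta = new_score - before.get(theme, new_score)
    trend = "improving" if delta > 0 else ("unchanged" if delta == 0 else "getting worse")
    print(f"You said {theme} needed work; we made changes: {trend} ({delta:+.1f})")
```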
Finding the gaps
The XLAs and the XMO help organisations improve how they measure service excellence. The XMO facilitates collecting and collating that information, including the developments and the questions to be asked; it's a bit like the user research process. It's the part of an organisation set up to interpret the data, working with the business leaders to address the shortcomings or gaps. Here's how that works in practice:
The XLA is the framework commitment that says, for example, an organisation wants all users to give the service a rating score of 8. The XMO translates what that 8 means: is it a breakdown of core requirements the user wants met? Do users want to feel engaged as part of the service and feel informed? Do they want it to be easy to use and available when they want to use it, not when you want them to use it?
The XMO pulls it all together, goes to the user and says, 'Give us your information; tell us what you think' based on the criteria. The XMO collates the data, does the number crunching, presents meaningful information from that data, and then works with the business stakeholders to act on the insight based on what the users think.
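Here's a hypothetical sketch of that number crunching, with made-up criteria and figures rather than anything DWP actually measures: the XMO rolls criterion-level feedback up to an overall score, compares it against the XLA commitment and shows the business where the shortfalls are:

```python
XLA_TARGET = 8.0  # the experience score the XLA commits to

criteria_scores = {            # averaged from user feedback, 0-10 scale
    "ease of use": 8.4,
    "feeling informed": 7.9,
    "available when I need it": 5.6,
}

overall = sum(criteria_scores.values()) / len(criteria_scores)
shortfalls = {c: s for c, s in criteria_scores.items() if s < XLA_TARGET}

print(f"Overall experience {overall:.1f} against an XLA target of {XLA_TARGET}")
for criterion, score in sorted(shortfalls.items(), key=lambda item: item[1]):
    print(f" - work with the business on '{criterion}' (currently {score})")
```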
Creating better services users want
Value generation forms the cornerstone of ITIL 4; the whole point of service management is to co-create value. We can only co-create by engaging with our users; it's not about releasing functionality and capabilities we expect users or consumers to like. Organisations are working with users to create better services that those users actually want. The XLA concept embraces that same way of thinking.
Adopting an XLA approach highlights how we're trying to stay relevant to the market, plugging into new developments and new ways of thinking. XLAs embrace the principles of ITIL 4 that steer organisations towards value-driven ways of engaging with users or consumers; they represent the next iteration of how organisations can measure service quality and delivery. It's not a reinvention of the ITIL framework; it's an evolution that enables service management to adopt a more iterative approach to metrics and measurement. The approach allows for continuous improvement within the IT service management disciplines.
Don't be a watermelon
Put the user at the centre of every service. Organisations need to look at the user experience as the benchmark rather than the system's uptime. By combining the focus of SLAs and XLAs, and adding KPIs that measure what's happening inside the user experience, an organisation gains a more comprehensive view of a service's performance from the user's point of view.
1 comment
Comment by Craig Shannon
Unless you are a reverse watermelon, red on the outside (failed SLAs) but green on the inside (happy service consumers/positive experience journey/sentiment surveys), thus indicating the SLAs are not really supporting or representing the service consumers' requirements and desired outcomes.
Or you think your SLAs alone are giving them what they want, but when working in tandem with XLAs you are giving them what they need, in a way they want.