UK Government services are undergoing massive digital transformation to better meet the needs of individuals and businesses.
HMRC provides an array of customer-facing and internal digital services, facilitating revenues exceeding half a trillion pounds annually.
To comply with non-disclosure agreements, some information may be obfuscated. Views written are my own.
Background
In 2007 two encrypted discs holding millions of child benefit records were reported missing by the Chancellor of the Exchequer.
An immediate freeze was placed on data movements into and out of HMRC whilst a technical solution was rapidly developed to manage the auditing, tracking and security of data transfers.
A decade on, with few updates to the original codebase, the system had become unviable.
The challenge
First, enable secure file transfer between HMRC and businesses or other government departments.
Second, decommission the internal systems managing the data request and approval process, transitioning to a new, robust and accessible digital platform.
The Goals
Improve service robustness and scalability
Deliver better UX for both internal and external users
Create a strategic data exchange service for HMRC
My Role
Service Design
Working end-to-end to design simple and accessible services aligned to user needs and meeting the government's service standard. Engaging with HMRC stakeholders to define and satisfy business requirements and service ambitions. Often operating within complex and legacy environments; dovetailing new design and technology with HMRC's digital estate.
Manual and automated file transfer processes
Interaction & content design
Understanding the bigger picture and refining the detail:
Engaging with our service users to understand context and user needs
Sketching preliminary designs and drafting content for interfaces and user guidance
Building responsive HTML / CSS prototypes working to WCAG
Testing and iterating the interface to optimise user experience
Working to GOV.UK and HMRC design patterns
Coding prototypes with the GOV.UK toolkit
Research & analysis
Reaching out to build relationships with service users to discover their needs and experiences and identify opportunities.
Leading workshops and focus groups, conducting analysis and gaining quantitative and qualitative insights to drive design decisions.
Researching user needs
Standards & Assurance
Collaborating with the wider Standards and Assurance community, providing peer reviews to develop new services across HMRC Digital.
Meeting the digital service standard and promoting Agile practice
Design thinking
Defining and championing the design process, ensuring user needs were captured, ideas prototyped, tested and validated.
Design process; defining ideas and refinement before production
Discovery
A full discovery period is essential at the start of any project. This period allowed our team to understand the problem and engage stakeholders and the broader business community.
Workshops were arranged to explore needs, understand experiences, document business requirements and gain awareness of technical constraints.
As-is service mapping
Comprehensive mapping of the incumbent service helped inform conversations across the team and provided a clear picture of the current service landscape.
One-to-one research sessions with service users highlighted some common problems:
Pain-points
The service is too expensive and complex to set up
It took too long to onboard
Files are deleted before I have a chance to download them
Notifications are confusing
Experience mapping
Opportunities
Design an accessible system for internal and external users
Eliminate the up-front cost and reduce the burden on businesses to set up
Simplify the technology stack to improve system robustness
Streamline the registration process to reduce error paths and support calls
Increase file availability window to improve download success rate and reduce support calls
Start with users
During discovery we identified groups of users involved with the exchange of data to and from HMRC.
These included:
Organisations
Other Government Departments (OGDs)
Data Requestors
Data Guardians
System Administrators
Developing personas helped bring users closer to the development process
Through research sessions across HMRC we started to develop an understanding of our user groups. Developing personas helped keep the team and wider stakeholders in tune with our users and their needs. Revisiting and updating personas every few months kept them representative and helped deliver a more user centred service.
User needs
As a business user:
I need to send documents to HMRC so that I can supply information when I’m asked for it
I need to receive documents from HMRC so that I can keep updated with important information
I need to send and receive documents securely so that I can keep my information safe
I need to keep a record of document movements so that I can prove what has been transferred
I need to send and receive batches automatically so that I don’t have to do it by hand
Phase I
Customer portal
Transitioning to the new digital service had to be split into two phases with different business objectives. Each phase related to the staged decommissioning of the old platform.
The first aimed to create a new customer-facing portal to maintain business continuity for 200 existing customers, who would be migrated in batches; three core features were needed:
Document uploads
Document download
Document history
Paper prototypes were initially created for the first round of testing. This method provided enough realism in a test situation to give us fast and meaningful feedback.
After testing, results were analysed, themes identified and actions quickly folded back into the design.
At this point we were in a better position to use the prototype toolkit to test interactive features and content.
Documents being chosen for upload
Documents confirmed for upload
Documents uploading
Error states for documents which failed to upload
Testing, insights and iteration
There is no substitute for putting your designs in front of real users; it can be reassuring and humbling in equal measure.
We visited businesses throughout the country to put our prototype journeys through their paces. Test scripts were written to consistently measure usability and test the effectiveness of the designs.
Document transfer history showing filtering and results
Iterating features: Developing status notifications into the transfer history results
Insights gained through testing allowed us to further refine features and finally validate our hypotheses.
Generally, at least two rounds of testing were conducted before we were confident enough to move stories into the backlog for development.
Accessibility = empowerment
We worked to inclusive design principles to ensure the developing service was:
Perceivable - information needs to be visible to at least some of a user's senses
Operable - everyone must be able to operate the interface
Understandable - keep content and interactions simple
Robust - content must be technology agnostic
We reached out early in the design process to test with users with a range of digital confidence levels and accessibility needs. This included testing with a range of assistive technology.
In addition, we leveraged research labs and facilities at the Government Digital Service (GDS) to put the prototypes through their paces.
The GDS empathy lab, helping us understand how our service might be experienced by someone with a range of sensory impairments.
Insights gained through testing allowed us to improve the content and tease out bugs which would otherwise have never been found.
This could be something as simple as reframing sentences to avoid negative language or contractions, refining punctuation, or experimenting with more effective ARIA markup.
Markup was validated to WCAG 2.1 and the service was formally assessed at the Digital Accessibility Centre in Cardiff.
Continual Improvement
Analytics
Throughout our private beta phase we used A/B testing tools such as Optimizely. This allowed us to test design variants and analyse how they performed.
Tracking user behaviour highlighted which designs were more successful and helped guide and evolve the service.
A/B testing dashboard variants
Performance monitoring
Monitoring site traffic gave useful insights into user and technology segmentation. Quantitative data helped us understand and identify potential issues such as unexpectedly high drop-off rates.
Using Google Analytics to investigate user behaviour
Phase II
Internal systems
The second part of the project was to research, prototype and build a replacement for the incumbent back-end system which dealt with the request, approval and management of data movements.
Service hypothesis
As a direct result of the data mislaid in 2007, a process had been introduced which assessed the security arrangements around each data transfer before it was permitted to move into or out of HMRC. This process consisted of:
A requestor asking permission to move data
A Data Guardian reviewing and approving or rejecting the request
If approved, the data movement being tracked and transferred securely
The request consisted of a lengthy application covering what data was to be moved, why, and how. This could be anything from emailing a spreadsheet to decommissioning a data centre, with hard drives needing to be couriered to a secure disposal location.
One prominent theme that emerged from research was unnecessary duplication of information when completing a request to move data.
Identifying which data sets did, and didn't, change often enabled us to propose a simplified process. This principle was key in the solution we outlined for approval with the CTO.
Reducing moving parts: theory looked simple but the landscape was more complex
Data requests were analysed over a 12-month period and common features identified. This began to build a picture of common types of data movement, forming the basis of a set of data policies. When a request was created it was based on one of these policies and tailored to a particular circumstance by a handful of input fields.
By standardising around 80% of the data input into the data policy, we dramatically reduced the time and effort needed to complete day-to-day requests.
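The policy-plus-fields idea described above can be sketched as a simple data model. This is an illustrative sketch only; the class and field names (`DataPolicy`, `DataRequest`, `approved_recipients`, and so on) are hypothetical, not HMRC's actual schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPolicy:
    """Captures the ~80% of request detail that rarely changes.
    All names here are illustrative assumptions, not the real system."""
    name: str
    transfer_method: str          # e.g. "SFTP", "secure courier"
    approved_recipients: tuple    # organisations pre-approved under this policy

@dataclass
class DataRequest:
    """A day-to-day request: a policy tailored by a handful of inputs."""
    policy: DataPolicy
    recipient: str
    description: str
    expected_date: str

    def validate(self) -> list:
        """Return a list of problems; an empty list means the request is well-formed."""
        problems = []
        if self.recipient not in self.policy.approved_recipients:
            problems.append(
                f"{self.recipient} is not covered by policy '{self.policy.name}'"
            )
        if not self.description:
            problems.append("A description of the data movement is required")
        return problems

# A request based on a pre-approved policy needs only a few extra fields:
policy = DataPolicy("Monthly payroll extract", "SFTP", ("DWP",))
request = DataRequest(policy, "DWP", "March extract", "2019-03-01")
print(request.validate())  # → []
```

Because the policy carries the standing detail (method, security arrangements, recipients), each new request only asks for what actually varies, which is the duplication users complained about.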
Contextual Research
Getting out of the office
To test the theory we visited internal teams and external businesses across the country to observe and understand how data was moved in and out of HMRC.
We discovered a wide variation in how users moved and processed data — the picture was looking more complex than expected.
On-site research with the Physical Media team in Newcastle
Meeting users and observing them in their own environments provided valuable insights into how people worked and the challenges they faced with the current process.
Workshops
Understanding the issues
Stakeholder workshops were arranged to deep dive into the subject matter and understand the needs of data Requestors and Approvers before moving forward with design work.
Workshop with Approver teams to help inform designs
Lean UX
Fail fast & often
Mapping out ideas on the wall was a daily activity which helped the team collaborate and quickly iterate journeys and features.
Initial journey mapping of data request and approval
Visualising ideas and regular discussion created common understanding of problems and enabled us to trouble-shoot early on.
A clear picture of the data request and approval process
Once any obvious issues had been ironed out and the journey streamlined, a simple prototype was built using Google Slides.
This method enabled us to easily share ideas with remote teams and made it fast to experiment with new content and design elements.
After receiving feedback from internal teams, interactive HTML prototypes were built, more closely aligned to a production environment, which in turn enabled us to gain detailed insights from users.
Labs
Usability testing
Contextual usability testing was not always possible. When it wasn't, we arranged sessions at user research labs within HMRC or at the Government Digital Service (GDS) in London.
Setting up the lab prior to a testing session
We also used the labs to host accessibility testing in advance of our DAC assessment.
Iteration & Refinement
Gaining insights from regular testing, we refined the request and approval process to determine our minimum viable product.
This process created healthy debate across the team, where arguments for user needs, business continuity and operational risks were balanced against what our minimum viable product might look like.
Keeping things simple; support dashboard area for internal users
Left: Testing an accessible auto-suggest component. Right: Simple inline help allowed users to complete a task successfully first time
Testing with people of different cognitive and physical abilities and varying levels of digital confidence allowed us to develop a service accessible to more people.
End-to-end
Service design
We explored the full end-to-end journey for internal and external users, solving the whole problem across digital and physical channels.
From the point a user signed into the corporate network, through raising and approving a data request to the secure delivery of the exchanged data.
Creation of the new service launch icon, with contrast ratio checks to comply with the Web Content Accessibility Guidelines.
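The contrast checks mentioned above follow the WCAG 2.1 definition of relative luminance and contrast ratio. Here is a minimal sketch of that calculation; the hex colours are illustrative, not the service's actual palette.

```python
def _channel(c: int) -> float:
    """Linearise one sRGB channel (0-255) per the WCAG relative-luminance formula."""
    s = c / 255
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a '#rrggbb' colour, 0.0 (black) to 1.0 (white)."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white gives the maximum ratio of 21:1; WCAG AA requires
# at least 4.5:1 for normal-size text and 3:1 for large text.
print(round(contrast_ratio("#000000", "#ffffff"), 1))  # → 21.0
```

A quick script like this lets candidate icon and text colours be screened against the 4.5:1 and 3:1 thresholds before any formal accessibility assessment.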
Hard decisions had to be made on what functionality was delivered as part of our first release to support the needs of our internal users. To enable us to meet our release date we had to employ more tactical thinking.
Existing tech was leveraged where possible to simplify the service and deliver our MVP on schedule.
Designing a scalable service registration process for our MVP
I drafted the guidance for our service, which needed to tackle some difficult concepts during the phased transition from legacy systems. The strategy for the guidance was the same as for any content: test, learn and iterate.
We took our guidance out and put it in front of user groups across a mix of abilities and business areas. This was essential to ensure any gaps were identified and the key messages were discussed and understood.
Drafting, testing and improving the service guidance