```diff
diff --git a/_config.yml b/_config.yml
index 2b22f9d62a..c0ecdc0f88 100644
--- a/_config.yml
+++ b/_config.yml
@@ -521,6 +521,35 @@
       ]
     }
   },
+  {
+    "scope": {
+      "path": "research-summaries/*.md"
+    },
+    "values": {
+      "section-title": "Research summary",
+      "breadcrumbs": [{
+          "title": "About Canada.ca",
+          "link": "https://design.canada.ca/about/"
+        },
+        {
+          "title": "Research summaries",
+          "link": "https://design.canada.ca/research-summaries/"
+        }
+      ]
+    }
+  },
+  {
+    "scope": {
+      "path": "research-summaries/index.md"
+    },
+    "values": {
+      "section-title": "",
+      "breadcrumbs": [{
+        "title": "About Canada.ca",
+        "link": "https://design.canada.ca/about/"
+      }]
+    }
+  },
   {
     "scope": {
       "path": "pattern-library.md"
diff --git a/research-summaries/business-account-research-summary.md b/research-summaries/business-account-research-summary.md
new file mode 100644
index 0000000000..285779b597
--- /dev/null
+++ b/research-summaries/business-account-research-summary.md
@@ -0,0 +1,106 @@
+---
+altLangPage: "https://conception.canada.ca/resumes-recherche/comptes-entreprises-resume-recherche.html"
+date: 2019-02-04
+dateModified: 2019-02-05
+description: "Reviewed the most common reasons why self-employed Canadians and small Canadian businesses call the Canada Revenue Agency."
+language: en
+title: "CRA business registration and account maintenance"
+---
```
This optimization project with the Canada Revenue Agency started in late March 2018. We reviewed the most common reasons why self-employed Canadians and small Canadian businesses call the CRA. We were looking for opportunities to make it much easier for these callers to find answers and resolve issues directly on Canada.ca. The long-term goal was to reduce the volume of telephone interactions in areas that could be better served online, which would let agents focus on answering the tougher questions.
CRA Call Centre’s Strategic Planning and Operations Division did a lot of analysis to help identify the most common reasons for calls. They looked at data they gather regularly about call topics and frequency. They had also recently completed an in-depth study of business enquiries with front-line agents.

Together, we defined a set of potential task scenarios for testing. We then asked call centre agents and program area experts to validate, improve and narrow down these tasks using an online survey. This process left us with 8 task scenarios. We recruited 18 participants. They were all business students who did not have their own business number, and had never assisted a business with tax issues.

The baseline moderated task performance testing showed that, on average, the 18 participants found the right page 44% of the time. Out of a total of 142 task trials across the 8 task scenarios, participants succeeded at the tasks only 34% of the time.

To develop an optimized prototype, designers and researchers from the TBS Digital Transformation Office formed a multi-disciplinary team with the Canada Revenue Agency. The team included representatives from CRA’s user experience, web publishing, Business Information Systems, My Business Account, and GST/HST programs.

Our interaction designers created the prototype in GitHub. We replicated the parts of the CRA website that we had tested. Through an intensive series of workshops, small-team discussions and individual analysis of video evidence from baseline testing, we made many updates and improvements to the prototype over a period of four weeks. At that point, the prototype was approved as ready for testing. If test results were positive, CRA would continue with the process to get it posted on Canada.ca.
Success! Our target was to improve both findability and task success by a minimum of 20 percentage points, or to exceed 80% for both measures. In the validation round, 20 participants found the correct destination 84% of the time. That was an improvement of 40 percentage points, meeting both targets. Successful task completion increased by 38 percentage points, to a total of 72%.
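The target rule used in these projects (gain at least 20 percentage points over baseline, or exceed 80%) can be expressed as a small check. This sketch is our own illustration of the rule, not DTO tooling:

```python
def meets_target(baseline, validation, min_gain=20.0, threshold=80.0):
    """True if the redesign gained at least `min_gain` percentage points
    over the baseline, or exceeded the absolute `threshold`."""
    return (validation - baseline) >= min_gain or validation > threshold

# Figures reported for this project:
print(meets_target(44.0, 84.0))  # findability: +40 points and above 80%
print(meets_target(34.0, 72.0))  # task success: +38 points
```

Both reported measures clear the bar under this rule, even though task success (72%) remained below the 80% absolute threshold.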
Baseline measurement at start of project, validation on prototype redesigned by project team.

| Task | Baseline | Validation |
| --- | --- | --- |
| 1. Do you need a BN | 28% | 44% |
| 2. Federal and Provincial BNs | 44% | 84% |
| 3. Information to provide for GST registration | 22% | 74% |
| 4. GST Registration | 28% | 83% |
| 5. BN for export | 47% | 63% |
| 6. Tax Centre address for documents | 18% | 61% |
| 7. Address change, via My Business Account | 61% | 95% |
| 8. Keeping a GST account active | 22% | 74% |
We used keywords like “registration,” “tax,” “update,” and “changes” in labels and link descriptions on Business and Taxes theme pages, menus and topic pages.

We brought related items together, and put answers under the steps and headings where people expected to find them. We reorganized, relabeled and restructured. We streamlined a three-page list into a single page.

We “front-loaded” labels by putting keywords at the beginning of titles, links and doormats. We also streamlined text, so that the information most relevant to the majority of people was presented first.

To find a correct address, we built a simple wizard. It uses a set of questions and a postal code look-up to provide the precise answer for where to mail your business tax documents.
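A minimal sketch of the wizard’s final look-up step, assuming a routing table keyed on the first letter of the postal code’s forward sortation area. The mapping and centre names below are hypothetical placeholders, not CRA’s actual mailing destinations:

```python
# Hypothetical routing table: region letter -> tax centre.
# Names and routing are placeholders, not CRA's real table.
TAX_CENTRE_BY_REGION = {
    "K": "Ottawa Technology Centre",
    "V": "Surrey Tax Centre",
    "C": "Prince Edward Island Tax Centre",
}

def find_mailing_centre(postal_code):
    """Return the tax centre for a Canadian postal code, or a fallback
    prompt when this sketch's table has no entry."""
    region = postal_code.strip().upper()[0]
    return TAX_CENTRE_BY_REGION.get(region, "See the full list of tax centres")

print(find_mailing_centre("V6B 1A1"))  # → Surrey Tax Centre
```

The point of the real wizard is the same: ask a couple of questions, take a postal code, and return exactly one address instead of a long list.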
Two web pages are shown side by side. The page on the left is labelled “Baseline” and shows the “Business number” webpage, which was missing the content that people expected to be on this page.

The page on the right is labelled “Validation” and shows the “Business number registration” page with the new topics. Arrows point to the new doormat links with the annotation “Leading with keywords.”
If you’d like to see the research findings from this project, let us know. Email us at dto.btn@tbs-sct.gc.ca.

Tweet using the hashtag #Canadadotca.
The Canada Child Benefit (CCB) contributes to reducing child poverty in Canada. In early 2019, the Digital Transformation Office (DTO) partnered with the Canada Revenue Agency (CRA) to redesign the Canada Child Benefit pages on Canada.ca. The team’s goal was to reduce calls to the call centre for key questions. After the redesigned pages went live in December 2019, the CRA call centre confirmed that call rates had indeed dropped.

To meet the needs of parents on mobile phones, the team co-designed a new pattern we called the subway navigation. The existing design was broken into steps, but wasn’t effective on phone screens. The new design’s visual representation of the links resembles the stops on a subway map.

This research summary explains the context of the research project in which the subway navigation pattern was created, and the considerations that shaped it. It also highlights other innovations from this project.

Subway navigation has proven to be an effective pattern to lead users through a service that involves several steps. Analytics and usability testing have shown that it keeps users ‘on track’ with improved findability and success rates. The CRA later adopted and extended the pattern for the successful Canada Emergency Response Benefit (CERB) and Canada Recovery Benefits pages.

By understanding when and how to use subway navigation, you can use this powerful pattern correctly.

In 2019, the Canada Revenue Agency (CRA) sought to reduce calls from Canadians who couldn’t find or understand the web content that explains how to receive or maintain their child benefits. The Digital Transformation Office (DTO) partnered with CRA to work on this top task.

The first step was to conduct workshops with stakeholders, including call centre staff. The intent of the workshops was to understand the top call drivers.

The team used call centre data to characterize common issues. They then generated an extensive set of real-world scenarios that reflected the problems and contexts that drove the highest volume of calls. They used this data to create testing scenarios for baseline and comparison tests.
The team selected a set of these top call-driver scenarios to establish a baseline measurement. 20 Canadian parents on phones attempted the scenarios in a moderated usability performance study. The overall success rate was 28% across 7 task scenarios. An additional 5 scenarios were tested, but fewer than 16 participants completed them, so they weren’t included in the overall success score.
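The inclusion rule described above, where a scenario counts toward the overall score only if enough participants completed it, can be sketched as follows. The data here is invented for illustration; this is not the team’s actual analysis code:

```python
def overall_success(trials_by_task, min_completions=16):
    """Pool trials only from tasks that at least `min_completions`
    participants completed; each trial is True (success) or False.
    Returns the pooled success rate as a percentage."""
    pooled = [outcome
              for trials in trials_by_task.values()
              if len(trials) >= min_completions
              for outcome in trials]
    return 100 * sum(pooled) / len(pooled)

# Invented example: the second task is excluded (only 10 completions),
# mirroring the rule that dropped 5 of the 12 tested scenarios.
data = {
    "Payments stopped": [True] * 5 + [False] * 15,  # 20 trials, 25% success
    "Under-sampled task": [True] * 10,              # 10 trials: excluded
}
print(overall_success(data))  # → 25.0
```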
| Task | Scenario |
| --- | --- |
| Payments stopped | You didn’t get your usual child benefit payment in May/July. Which of the reasons below would cause payments to stop? |
| Calculate payment | Mart’s second child was just born. How much Canada Child Benefit will Mart get every month? Mart lives in Quebec, single with sole custody; 1st child is 2 years old; made $60,000 last year; will be on leave for the next 12 months so will only make $30,000. |
| Payment date | Baseline: Exactly which day in July will your Canada Child Benefit payment be deposited? Optimization: Exactly which day in December will your Canada Child Benefit payment be deposited? |
| Shared custody: apply | Peter’s kids are living with his ex. They will start coming to stay with Peter for 2 weekends per month. Should he apply for the Canada Child Benefit? |
| Shared custody: percentage | If you were separated and sharing custody, could the two of you choose what percentage of the Canada Child Benefit each of you will get? |
| Payment less in July | Petra’s July Child Benefit payment arrived and it is much less than she received in June. What is the most likely reason for this change? |
| Direct deposit change within a month | Is it safe to close your old bank account before your June 20th Child Benefit payment? You changed your direct deposit to a new bank account number on June 5th. |
The most common problems were not finding the page that held the answer (32% of participants had that problem) or not being able to determine the correct answer from the information on the page (for example, mixing eligibility with entitlement).

The team focused on prototyping a design that would solve the problems observed in the baseline test. The existing pages used the service initiation template with the ordered multi-page navigation pattern. Parents on phones couldn’t use the multi-page navigation effectively.

It took up too much of the screen, and didn’t convey the grouped nature of the pages. Some parents thought the page sets were “On this page” options. They also tended to avoid the Overview page.

The team replaced the multi-page navigation pattern with the new subway navigation pattern. The subway pattern differentiates sections in a tighter space and conveys the connections between them.
Subway navigation includes a visual representation, resembling a subway map, of the links between steps; it appears on both the index page and the step pages.
In the subway navigation pattern, an index page replaces the Overview page. It has a title, brief description, and a step graphic that matches the subway steps menu. It provides an easy-to-scan landing page for the service that outlines each step. Landing pages help with search engine optimization and support the information architecture. Users can navigate back to the index page through the breadcrumb.

In the original service initiation template with the multi-page navigation pattern, people had problems navigating between pages, often selecting the button for the page they were already on. The subway pattern makes the steps clearer without repeating the title. It also positions the “On this page” pattern after the subway links. On desktop, the subway links appear on the right of the screen.

In the service initiation template with the multi-page navigation, page titles appear as buttons with no space for a description. Word wrapping throws off the alignment, so content designers often limit the number of words to control the size of the buttons. In the baseline study, this made it harder for people to choose between the buttons because the labels weren’t clear enough.

In the subway pattern, the link text can be longer. The index page has space for a description below each link.
The answers users are looking for should be integrated into the grouped page content. In the baseline test, some answers about custody were in a PDF booklet that some participants refused to open on their phones. In the comparison study, we incorporated the booklet answers directly into the step pages used in the subway navigation. This integration ensured that both custody tasks reached 100% success, compared to baseline rates below 50%.

We redesigned page titles and headings to answer top user questions. For instance, baseline participants looked under “Apply” to know “who can apply” as well as “how to keep being eligible”. The subway design had separate page titles for each of these answers.

The original pages had few headings and buried answers in long blocks of text that covered every situation. People struggled to find answers on their phones.

In the comparison study, the team used the expand/collapse pattern to collapse answers that applied only to specific groups of people. This approach was very successful at shortening the pages and helping people choose the right answer for their situation.

On the live site, a long disclaimer blocked access to an important benefit calculator tool, with an ‘I accept’ button below to launch it. Participants failed to scroll past the disclaimer to the button, and some who did find the button hesitated to choose “accept”. The team moved the disclaimers to the answer phase in the tool, where they were relevant, and removed the ‘I accept’ step.
The overall task success went up from 28% to 83% when DTO retested the task scenarios on the prototype. This prompted CRA to move quickly to change the live pages to reflect the new design.

Call volumes dropped: the redesigned pages went live in December 2019. In Q1 of 2020, calls for CCB asking “how much will my payment be” dropped by more than 50%. In contrast, calls for the same question for the GST credit - without redesigned pages - didn’t drop during the same period.

Calculator use increased: use of the Benefits calculator doubled after the page name changed to “How much can I get” and the calculator was highlighted with a green button. From April to June 2020, visits to the English benefits calculator page were almost double what they were in 2019 (609,703 in 2020 compared to 341,492 in 2019). Visits to the French page increased by more than a third (118,956 in 2020 compared with 76,124 in 2019).

Involve the call centre in optimization projects. To achieve the outcome of reducing calls, collaborating with the call centre team was essential. Know your top call drivers, and frame the content to answer those questions, rather than simply providing all the information for people to find the answers themselves.

Design for mobile users. For high-mobile services, the subway pattern is a substantial improvement over the multi-page tab navigation. It breaks unique steps into pages that clearly convey the steps and journey on a phone.

The subway pattern has improved navigation for many services at CRA. It started with the Canada Child Benefit and was also widely used to communicate COVID benefits during the pandemic. CRA has continued doing usability testing and has continued to improve the design on new benefits pages.

Other departments have started to use the subway pattern for its usefulness and flexibility. Before using the pattern, it’s important to know when to use it and how to prepare the content to work well with the pattern.
Employment Insurance (EI) is a service that touches almost 2 million Canadians every year (footnote 1). Certain EI benefits are designed specifically for Canadians caring for a critically ill or injured family member, or for someone needing end-of-life care.

The Digital Transformation Office, Service Canada, and Employment and Social Development Canada collaborated to make it easier to find, understand and access those EI benefits.

Improving client service delivery is a top priority on Canada.ca. The team was motivated to help Canadians understand which benefits are available to them and how to apply, and to improve the information provided on Canada.ca. In just over three months, we researched, designed and tested a new prototype that succeeded in improving both findability and task success.

We used a set of eight tasks related to caregiving benefits to test the existing pages. We performed a total of 148 moderated task performance tests. After making changes, we ran a further 136 tests on the revised prototype. Overall success rose by 28 percentage points on the new design.
In the baseline round, 19 participants tested the 8 tasks that our team had designed, using the live Canada.ca website to measure findability and task success.

In baseline testing, the 8 tasks had an average success rate of 51%, and an average findability rate of 59%.

A few of the tasks were particularly challenging for the Canadians invited to participate in the baseline round, including tasks that asked them to find and understand specific details of the caregiving benefits.
The project evaluated the effectiveness of labels in the Canada.ca menu. Analytics data showed that Canadians use the word “leave” to search for EI special benefits, not “EI” or “benefits.” Introducing the word “caregiving” also helped Canadians identify the right links to follow.

The team decided to bring the three different caregiving benefits together into a single service template. We used a table to highlight the distinctions between each benefit.

In baseline testing, participants found differentiating between caring for a child or an adult relatively straightforward. However, they had trouble when they tried to identify whether they needed “compassionate care” or “family caregiver” benefits.

Another innovation in the optimized prototype was to give an overview of the different steps in the application process. The process includes gathering documentation from employers and from medical professionals. Then applicants must submit these documents to Service Canada. On the live site, this information appeared under “After you’ve applied”. Some participants failed to identify the required documents because they didn’t think to look there.

In the prototype, we put the list of documents at the beginning of the process. Participants were more successful finding what documents they would need to provide.
Two webpages are shown side by side. The page on the left is labelled “Baseline” and shows that the “After you’ve applied” webpage concerning caregiving benefits was missing content people expected to find on this page. An arrow points to the webpage with the annotation “Supporting documents were not found here.”

The page on the right is labelled “Redesign” and shows the “Apply” page with a set of icons outlining the various application steps. An arrow points to the steps with the annotation “Visual overview of the complete process.”

Below there is a header titled “Begin to gather supporting documents”, which has a list of the needed documents. An arrow points to the list with the annotation “All required supporting documents visible.”
Once the revised prototype was ready, we recruited 17 new participants to complete the same 8 tasks. Our target was either 80% success, or an improvement of at least 20 points over the baseline score, for both findability and task success.

These targets were exceeded.

This table shows the task success rates across the baseline and redesigned validation test on the prototype:
Baseline measurement at start of project, validation on prototype redesigned by project team.

| Task | Baseline | Validation |
| --- | --- | --- |
| 1. Eligibility: 600 hours | 61% | 76% |
| 2. Apply: documents | 41% | 76% |
| 3. Wait period/delay | 58% | 100% |
| 4. Apply | 79% | 88% |
| 5. Compassionate care | 0% | 24% |
| 6. Eligibility: sick person | 67% | 88% |
| 7. Benefit expiry | 72% | 94% |
| 8. Eligibility: friend | 39% | 88% |
The team identified four factors that had the greatest impact on success rates.

If you’d like to see the research findings from this project, let us know. Email us at dto.btn@tbs-sct.gc.ca.

Let us know what you think about this project. Tweet using the hashtag #Canadadotca.
The Digital Transformation Office (DTO) and Immigration, Refugees and Citizenship Canada (IRCC) collaborated on the citizenship test optimization project in the summer of 2018. In this project, the team sought to make it easier for people to prepare for the test that most new citizens must take to complete their application for Canadian citizenship.

A team of content designers, design researchers and interaction designers worked with IRCC to define a set of 6 task scenarios related to the citizenship test. Using these scenarios, we completed a total of 214 tests with 36 participants. We tested the existing pages and then re-tested our prototyped improvements. Overall, task success rose by 17 percentage points, to an average of 86%, on the revised prototype.

We selected 19 participants to test the existing pages using the 6 tasks that our team had designed. Our team was measuring findability and task success.
We generated real-world scenarios that reflected what people are actually trying to do.
Using the results from the baseline testing, the team designed a prototype with task flow in mind. We wanted to simplify the information about the steps after applying and make it easier for applicants to use the study guide. We clarified topic labels and reduced the number of options on the page to eliminate distracting clutter. By using verbs and clearer language in the links and descriptions, the team made it easier for users to discover the right path to the information they needed.

On the live site, the steps to becoming a Canadian citizen were not numbered or listed sequentially, which caused confusion for users. In the prototype, we clarified, numbered and grouped the steps so users would understand the sequence in which they should be done. This made the process easier to understand and follow.

We experimented with a new long-document pattern. We added a “search within” feature and a left-hand navigation that shows all other chapters. Users were able to easily search the specific chapter and see at a glance other relevant chapters.

We removed inline links to help users stay focused on the task, so they were not distracted by links to additional information. We answered questions they might have directly in the content instead of linking away from the page.

Here is one example of how we reorganized information:

All the information for this chapter (chapter not numbered) was on one page. The only way to search was to use Ctrl + F. Users would commonly get lost when they began to scroll.

Adding a “search within” feature and a left-hand navigation made users more successful at navigating within the document and finding what they needed.
Success! Our target was to improve both findability and task success by a minimum of 20 percentage points, or to exceed 80% for both measures. We re-tested the same 6 tasks on our prototype with 17 new participants. Successful task completion increased by 17 percentage points, to a total of 86%.

This table shows the success rates of all tasks in the baseline and validation phases:
Baseline measurement at start of project, validation on prototype redesigned by project team.

| Task | Baseline | Validation |
| --- | --- | --- |
| 1. Steps to citizenship | 68% | 63% |
| 2. Study methods | 90% | 100% |
| 3. Language skills | 48% | 82% |
| 4. Canadian flag | 48% | 82% |
| 5. Inuit meaning | 74% | 88% |
| 6. Take which documents to the test | 89% | 100% |
Users succeed better when we number steps clearly, make long documents searchable, and answer questions directly in the content instead of linking away from the page.

If you’d like to see the research findings from this project, let us know. Email us at dto.btn@tbs-sct.gc.ca.

Tweet using the hashtag #Canadadotca.
Millions of calls come in to the Canada Revenue Agency (CRA) Call Centre every year from Canadians asking about their taxes, benefits and businesses. At least 5 million of the callers look up the phone number on Canada.ca, making CRA’s Contact page one of Canada.ca’s top tasks.

It’s possible to complete many common tasks online, like changing an address or finding the correct mailing address for a form. If Canadians knew how to complete these tasks online, many calls could be avoided. This would free up agents so they could resolve more complex problems by phone. Using call centre evidence and web analytics data, the Treasury Board Secretariat (TBS) Digital Transformation Office (DTO) partnered with CRA to work towards improving outcomes.

CRA call centre agents and specialists from the Cross Channel Optimization Section in the Call Centre Strategic Planning and Operations Division joined the team. They generated an extensive set of real-world job story scenarios. These scenarios reflected the problems and contexts that drive the highest volume of calls. The research team turned these job stories into tasks for participants to perform in test sessions.
The Government of Canada’s Digital Standards start with ‘Design with users’. This standard focuses on research and testing with users, something the DTO has been doing from day one. At the core of our process is a baseline test to measure how well users are doing at top task scenarios on the current design. We want to identify the issues preventing them from succeeding. Since we wanted to test both personal and business task scenarios, we recruited Canadian small business owners as participants for the research sessions. Business owners are familiar with business challenges, and they also have to do their own personal taxes.

Our participants reflected the range of digital skills among Canadian business owners. A few who work mostly hands-on with their clients weren’t familiar with how to copy and paste text from one web page to another. Others were expert online users who ran their own websites.

As well as measuring task success rates, the team analyzed user behaviour by watching videos of the tests. This let us see where people struggled. One of the big challenges we identified was that people sometimes failed to get to the Telephone numbers page. They first had to click on the Telephone enquiries link and then go through an overlay where they had to select the type of enquiry. If they did get to the Telephone numbers page, they often chose the wrong phone number from the long list.
Using the results from the baseline testing, the team designed a prototype to address the problems. We focused on the following key design challenges.
The team created custom contact pages for each of the top call drivers. This way, people could find the right contact phone number for their task, with a sign-in button above it.

We built a ‘Find a phone number’ wizard and a chatbot. These helped people quickly find the number they needed without having to look through all of the available options.
Whenever there was a choice to make that would force a unique answer, we used the collapsible design pattern to hide answers people didn’t need. For example, the phone numbers are different for changing your home address versus changing your business address. Rather than showing both numbers, we first asked people to choose home or business. Choosing revealed only the number they needed.
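The choice-first logic behind this collapsible pattern can be sketched as a simple look-up: the visitor’s choice selects exactly one answer to reveal. The task names and phone numbers below are placeholders, not real CRA lines:

```python
# Placeholder numbers, not real CRA phone lines.
PHONE_NUMBERS = {
    ("change address", "home"): "1-800-000-0001",
    ("change address", "business"): "1-800-000-0002",
}

def number_to_reveal(task, account_type):
    """Return only the number matching the visitor's choice; the other
    number stays collapsed out of view."""
    return PHONE_NUMBERS[(task, account_type)]

print(number_to_reveal("change address", "business"))
```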
We made every page with a phone number show a primary action button with the online option first. People usually don’t want to call. If there is a way to complete the task online, it’s important to make it as easy as possible to sign in and do it.

In the baseline study, we noticed that people ended up on the Telephone numbers, Hours of service, or Contact us pages. Those pages had links to other pages with key additional details, but people rarely followed the links. This meant they didn’t realize, for example, that a particular phone line wasn’t open in the evenings, or that they had to get some documents ready before they called. The new design presented the phone number along with a checklist to prepare for the call, including details like the hours and the service offered at that number.

Overall task performance went up from 40% success to 85% when we retested the task scenarios on the prototype.

This table shows a comparison of task success rates between the baseline and validation tests for all participants.
| Task | Baseline | Validation |
| --- | --- | --- |
| Proof of Income Sundays - Tax Information Phone Service (TIPS) line | 21% | 56% |
| Direct deposit - Individual tax enquiries (ITE) line | 56% | 82% |
| Notice of Assessment (NoA) - ITE line | 36% | 82% |
| NoA via My Account | 73% | 93% |
| Child Benefits - Benefits line | 50% | 83% |
| Monday - Telerefund line | 65% | 88% |
| Misallocated payment - ITE line | 12% | 83% |
| Corporate - Payment arrangements | 28% | 83% |
| Report a scam - Anti-fraud | 67% | 100% |
| Address for T2062 form - Surrey King George | 22% | 89% |
| Address for RC1 form - PEI | 0% | 100% |
| Security code - ITE line | 53% | 76% |
People are more likely to find the right phone number or address when we put the online option first, reveal only the number relevant to their choice, and present each number with its hours and a checklist to prepare for the call.

If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.

Tweet using the hashtag #Canadadotca.
The diseases and immunization optimization project kicked off in the fall of 2017. The goal was to make it easier for Canadians to make informed choices about vaccination. Designers and researchers from the TBS Digital Transformation Office joined forces with Health Canada and the Public Health Agency of Canada. Together they pulled together a multi-disciplinary team from web communications, immunization and disease programs, and strategic marketing. It was important that this project align with the National Immunization Strategy’s goal of ensuring Canadians have the information and tools needed to make evidence-based decisions on immunization.

The discovery phase aimed to understand the current situation. Was the content on Canada.ca meeting people’s needs? What problems might they be having? We researched vaccine hesitancy and how it has been addressed around the world. We reviewed previous usability studies and analyzed website traffic patterns and search behaviours. We reviewed phone and support requests. This helped us determine target audiences, which we then used to talk to people affected by the current content.

We also found that most people were accessing the content using mobile devices. On their smartphones, people were checking whether they had a cold or the flu, when they needed vaccinations, or which vaccinations they would need for a specific country.

From these discovery insights, the team generated a set of job stories. Job stories are real-world scenarios that reflect what people are actually trying to do. They follow a structure like: When I … (the situation), I want to … (the motivation), so I can … (the expected outcome). For example:
[Task being voted on, which is on a printed paper sheet]

It is flu season and a family member is pregnant. Find out whether or not she should get a flu shot while pregnant.

Choose your answer below:

[Sticky notes are placed on top of the printed-out task]

[Sticky note 1: Job story] When I get pregnant I want to know what vaccines I can get so I can protect/not harm my child [There are 10 voting dots on the note]

[Sticky note 2] Difficult in imm, not flu

[Sticky note 3] Flu

[Sticky note 4] Immunize
+From the long list of job stories, the team voted to narrow down to the top 11 stories. We crafted these stories into task scenarios that could be used to test the website with people in our target audiences.
+Before making changes, we set up moderated usability tests. We wanted to understand how well the existing content on Canada.ca was performing with the public. We recruited 16 employed parents of children under 18. These participants attended research sessions in Toronto and Ottawa with their smartphones. Participants were asked to complete the 11 task scenarios on the Canada.ca website. If they had time, they repeated one task on the Ontario.ca site for comparison.
+The task scenarios were presented in random order, except for the first and last tasks. Some tasks began on the home page of Canada.ca while others began on the Zika, flu and immunization pages. The scenarios were presented on a tablet beside the participant. The participant could refer to the tablet while they attempted the task, then return to enter their answer and proceed to the next question.
+The 3 important measures for optimization studies are:
+For the baseline test, the overall findability rate was 66%, and the overall success rate was 53% across the 11 tasks and 176 task trials.
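The overall rates quoted above are simple proportions across all task trials (16 participants × 11 tasks = 176 trials). A minimal sketch of that calculation, using illustrative trial records rather than the actual study data:

```python
# Sketch: computing overall findability and success rates from task trials.
# Each trial records whether the participant found the right page and
# whether they completed the task successfully. The records below are
# illustrative only, not the actual study data.

def overall_rate(trials, key):
    """Percentage of trials where the given boolean field is True."""
    if not trials:
        return 0.0
    return 100.0 * sum(t[key] for t in trials) / len(trials)

trials = [
    {"task": 1, "found": True,  "succeeded": True},
    {"task": 1, "found": True,  "succeeded": False},
    {"task": 2, "found": False, "succeeded": False},
    {"task": 2, "found": True,  "succeeded": True},
]

print(round(overall_rate(trials, "found")))      # 75
print(round(overall_rate(trials, "succeeded")))  # 50
```

In the real study, the same arithmetic is applied over all 176 trials to get the 66% findability and 53% success figures.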
+We watched videos of the testing sessions together. This helped the entire team understand the problems people had in trying to complete the tasks. We could observe and quantify behaviours and usability issues that caused task failures. We captured these observations in click paths to understand how people moved throughout the site. We then described the problems in detailed issues so we could address them during the design phase.
+To address the list of issues we captured, the team created a working prototype for a new design on GitHub. We held several workshops with smaller teams to work intensively on issues that participants experienced. For example, the Zika team realized that test participants hadn't absorbed the serious risks of pregnancy after exposure to Zika. The project team worked hard to integrate content to counter vaccine hesitancy. New home page designs for Canada.ca were also created. These expose the links to Health and Travel that people missed in the baseline tests. (Before, the links were hidden in a menu that remained closed unless the menu icon was clicked.)
+Throughout the design process, we tested content with people through small, informal “guerilla” sessions. These tests revealed problems with our designs that we were able to address before full-scale moderated usability testing.
+Two smartphones are shown with 2 different web pages. One is labelled "Baseline", the other "Redesign". In the "Baseline" page, the title is "Countries with recent or ongoing risk of Zika virus infection", followed by 2 long paragraphs of text that are too small to read. Following that is a list of countries beginning with "A", starting with Angola, Anguilla, Antigua and Barbuda, Argentina, and Aruba. An arrow points to the country list with an annotation "Too small to touch. Click to see answer.".
+In the "Redesign" page, the title is "Zika virus: Destinations with risk of Zika". Below is a list of 6 links that are too small to read, one of which is highlighted. An arrow points to the highlighted link with the annotation "Tasks grouped together. Simplified title".
+Below that is a short paragraph of text too small to read, followed by a search box and a table showing countries with "No risk of Zika", "Low risk of Zika" or "High risk of Zika" next to them. An arrow points to the countries with an annotation "All countries listed. Answers in view."
+Once the revised design was ready, 16 new participants were recruited to complete the same 11 tasks. Our target was either 80% success, or an improvement of at least 20 points over the baseline score. The revised content and design improved findability rates from 66% up to 90%. Overall task completion success rose from 53% to 84%.
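The pass criterion described above (80% success, or at least a 20-point improvement over baseline) can be expressed as a small predicate. The numbers in the example are the success rates reported in this section:

```python
# Sketch: the validation-round target used in these optimization
# projects -- either 80% success overall, or an improvement of at
# least 20 percentage points over the baseline score.

def meets_target(baseline_pct, validation_pct):
    return validation_pct >= 80 or (validation_pct - baseline_pct) >= 20

# Success rates from this study: 53% baseline, 84% validation.
print(meets_target(53, 84))  # True
```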
+This chart shows the task completion success rates across the baseline and redesigned validation test on the prototype for all 32 participants:
+Baseline measurement at start of project, validation on prototype redesigned by the project team.

| Task | Baseline | Validation |
| --- | --- | --- |
| 1. Advice on avoiding flu: shot | 69% | 87% |
| 2. Autism and immunization myth | 13% | 80% |
| 3. Zika test after trip | 75% | 100% |
| 4. How to avoid Zika: countries | 69% | 100% |
| 5. Newborn 1st vaccination schedule | 50% | 75% |
| 6. Flu shot when pregnant | 75% | 94% |
| 7. Flu symptoms: headache, fever | 50% | 94% |
| 8. FluWatch: case comparison | 94% | 88% |
| 9. Italy: Measles health alert | 50% | 69% |
| 10. Vaccines for trip to Mexico | 13% | 69% |
| 11. H3N2 flu in this year's vaccine | 31% | 73% |
32 total participants
+The team derived this set of 6 design principles that appeared to have the most impact on improving success rates:
+Email us at dto.btn@tbs-sct.gc.ca if you have questions, or would like to know more about this research.
+The Government of Canada purchases upwards of $23 billion in goods and services annually. The current procurement process is labour-intensive for all participants, and is still largely paper-based.
+Public Services and Procurement Canada (PSPC) is implementing an electronic procurement solution (EPS). This will transform the purchasing process for both government buyers and for the businesses from Canada and around the world who supply goods and services.
+The project is a multi-year initiative. It will see existing processes modernized through the implementation of a single platform based on two commercial solutions (SAP’s Ariba and Fieldglass applications).
+As part of the transition to the new EPS, PSPC is planning to implement a new entry point to replace the existing Buy and Sell website.
+The role of this new site (CanadaBuys) will be to support tasks like learning about government procurement, finding out how to become a supplier to the government, and searching for opportunities open for competitive bidding (also called tenders).
+Between April and July 2019, the Treasury Board's Digital Transformation Office (DTO) partnered with the PSPC team responsible for the design of the new CanadaBuys site.
+We focused on the needs of businesses new to government procurement. We tested with small and medium enterprises not currently registered as suppliers to the Government of Canada.
+We did our baseline task performance testing on the existing Buy and Sell site.
+In the baseline study, 18 small and medium enterprises performed 158 tests. The most common problems participants encountered were:
+The CanadaBuys prototype was developed in Drupal. This was the first time the DTO team used Drupal instead of GitHub for prototyping. The advantage of Drupal is its WYSIWYG (what you see is what you get) interface. It allows more non-coders to contribute to content design and iteration.
+We developed the prototype design through a series of intensive collaborative workshops with PSPC procurement experts. This allowed us to have a prototype with several innovations including:
+The prototyped search experience used a single search box on its opening page.
+Heading text reads: CanadaBuys. There is a search field entitled "Search active Government of Canada opportunities". Below the search field is a link to "Changes to procurement - what current suppliers need to know".
+It then presented a simplified set of facet options for narrowing results.
+Facets listed in this order: Category and code, Location, Type of notice, Who can bid, Buyer.
+The options focused on considerations that would help suppliers to quickly eliminate irrelevant options.
+In the prototype, tenders were formatted to provide essential details in a single, easy-to-scan format. They used the “on this page” pattern to aid navigation. Again, the focus was on making it easy for suppliers to determine:
+Currently, most RFPs are published in PDF only. Businesses have to download the PDF before they can review or search within the document. The prototype experimented with a navigable HTML format for long documents. It used an interactive table of contents to allow people to easily access key sections like the statement of work.
+The prototype used a simple keyword search to help businesses identify the UNSPSC or GSIN code for a specific product or service. Results showed the full hierarchy for context, and let users navigate through the tree.
+Heading text reads: Internet or intranet client application development services. Below, the UNSPSC code for this type of service is listed in bold text: 81111509.
+In a coloured box, the hierarchy for the code is shown:
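The code look-up described above can be sketched as a keyword search over a small index, where each match returns the commodity code plus its parent hierarchy for context. The code 81111509 and its title come from the prototype screen; the parent-level labels below are illustrative placeholders, not the real UNSPSC names:

```python
# Sketch: keyword search that returns a commodity code with its full
# hierarchy for context. The code 81111509 and its title come from the
# prototype; the parent-level names here are illustrative placeholders.

CODES = {
    "81111509": {
        "title": "Internet or intranet client application development services",
        "hierarchy": ["Segment 81", "Family 8111", "Class 811115"],  # placeholder labels
    },
}

def search(keyword):
    """Return (code, title, hierarchy) tuples whose title matches the keyword."""
    keyword = keyword.lower()
    return [
        (code, entry["title"], entry["hierarchy"])
        for code, entry in CODES.items()
        if keyword in entry["title"].lower()
    ]

print(search("intranet"))
```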
+Knowing whether there is a supply arrangement that will allow your company to pre-qualify for government opportunities is one of the more complex aspects of learning how procurement works. The prototyped wizard made this easy. It asked simple questions and presented the best options for your good or service.
+The heading text for the wizard reads: Find the best way to compete for government contracts. It explains: We may have a preferred or mandatory method of supply for procuring specific types of goods or services. This is followed by a simple question: What do you sell or provide? The listed options are: Raw materials, Industrial equipment, components and supplies, end-use products, and services. Below is a button labelled Continue.
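The wizard's behaviour amounts to a simple question tree: each answer either leads to another question or to a recommendation. The option labels below come from the prototype screen; the recommendations attached to them are hypothetical, since the source does not specify the wizard's outcomes:

```python
# Sketch of the wizard's branching logic. The option labels come from the
# prototype; the recommendations here are hypothetical outcomes.

WIZARD = {
    "question": "What do you sell or provide?",
    "options": {
        "Raw materials": "Check for a standing offer (hypothetical outcome)",
        "Industrial equipment, components and supplies": "Check for a supply arrangement (hypothetical outcome)",
        "End-use products": "Search open tenders (hypothetical outcome)",
        "Services": "Check for a supply arrangement (hypothetical outcome)",
    },
}

def answer(tree, choice):
    """Return the next step for a chosen option."""
    return tree["options"].get(choice, "Option not recognized")

print(answer(WIZARD, "Services"))
```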
+This chart shows a comparison of task success rates between the baseline and validation tests for all 37 participants.
| Task | Baseline | Validation |
| --- | --- | --- |
| Tender search (landscaping) | 11% | 58% |
| Join the list of interested suppliers | 39% | 79% |
| Statement of work | 6% | 79% |
| Tender search (proofreading) | 41% | 89% |
| Remote work | 0% | 95% |
| Dates + standing offer | 11% | 53% |
| GSIN look-up | 29% | 47% |
| Supply arrangement | 22% | 47% |
The features of the prototype that had the biggest impact on success rates were:
+If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.
+Tweet using the hashtag #Canadadotca.
+Summaries of the research the Digital Transformation Office (DTO) and federal departments undertook to make it easier for people to find and understand Government of Canada information and services on Canada.ca.
+This optimization project was undertaken by the Digital Transformation Office (DTO) in close collaboration with Service Canada’s Employment Insurance (EI) delivery and policy teams, and the Employment and Social Development Canada (ESDC) web team. Our goal was to make employment insurance maternity and parental benefits easier to find and understand for Canadians planning to expand their family.
+During our discovery research, we looked at a lot of external websites. We wanted to know what Canadian parents were sharing about maternity leave, parental leave, and navigating the EI system. This helped us understand which aspects of accessing these benefits people find most complex.
+We found that people struggled with:
+Search analytics told us that searches using the keyword “leave” far exceed searches for “benefits.” This suggested that these concepts, although legally distinct, are not clearly differentiated. Leave is a right defined in the Canada Labour Code and in provincial employment standards acts. While on leave from work, EI benefits provide a sum of money to Canadians who have paid EI premiums. This sum represents a proportion of previous earnings.
+When we spoke with EI service agents, we realized that many parents don't understand that they can't switch between benefit options. For example, parents may choose the extended option, planning to have more time at home. This provides 33% of earnings. Some parents later change their minds and want to switch to the standard option, which provides 55% of earnings over a shorter period. They don't understand that once they've chosen, the choice is final.
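The trade-off agents described can be shown with simple arithmetic: the standard option pays 55% of average weekly earnings, the extended option 33%. The rates come from the text; the weekly-earnings figure below is illustrative, and actual benefit amounts are subject to program maximums:

```python
# Sketch of the arithmetic behind the standard vs extended choice.
# Rates (55% and 33%) come from the text; the earnings figure is
# illustrative, and real benefits are capped by program maximums.

def weekly_benefit(avg_weekly_earnings, option):
    rate = {"standard": 0.55, "extended": 0.33}[option]
    return round(avg_weekly_earnings * rate, 2)

earnings = 900.00  # illustrative average weekly earnings
print(weekly_benefit(earnings, "standard"))  # 495.0
print(weekly_benefit(earnings, "extended"))  # 297.0
```

Seeing both numbers side by side makes clear why a parent might regret the extended option, and why the finality of the choice needs to be prominent.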
+An extra challenge for this optimization effort was that we had to explain a new parental sharing benefit. This new “use it or lose it” benefit was set for launch in March of 2019. It is available only to partners who share EI parental benefits. It's tricky to understand the options available to each parent, and how one parent’s choices impact the other’s benefits.
+From previous research, we knew that tasks that require people to do math are particularly challenging. Concepts and terms such as “hours of insurable employment” and “variable best weeks” are not self-explanatory. They make the task of figuring out a benefit amount more difficult.
+We worked with the EI team to define scenarios for testing. The 8 scenarios covered a range of circumstances (from high-risk pregnancy to adoption, etc.). We asked common questions about the benefits and the process for getting them. We tested with 22 participants between the ages of 25 and 39. All were employed full-time, and none had applied for any form of EI within the last 5 years.
+Out of a total of 155 task trials across the 8 task scenarios, participants found the right page 46% of the time. Participants succeeded at the tasks only 28% of the time.
+After analyzing the video evidence from baseline testing, we developed a prototype in GitHub. We used the same service initiation template that we used in the EI caregiving benefits optimization project.
+Through an intensive series of workshops with EI subject matter experts, we iterated the prototype. Key design features included:
+We continued to iterate the prototype throughout testing. We observed each session closely to identify where people struggled, and we implemented fixes. This approach significantly improved the design and the testing results.
+Our target was to improve both findability and task success by a minimum of 20 percentage points, or to exceed 80% for both measures. In the validation round, we tested with 20 participants for a total of 167 task performance tests.
+This chart shows the baseline measurement at the start of the project compared with the validation measurement on the prototype redesigned by the project team.
| Task | Baseline | Validation |
| --- | --- | --- |
| When to apply | 33% | 75% |
| Maternity/sickness | 22% | 80% |
| Insurable earnings | 62% | 89% |
| Max leave | 5% | 79% |
| Variable best weeks | 32% | 53% |
| Change parental benefit | 24% | 85% |
| Bonus included in earnings | 0% | 69% |
| When benefit payments end | 44% | 94% |
We reorganized and regrouped content so that the content of each page was clearly related to the page heading.
+To improve scannability, we added headings, and removed extraneous information. We hid technical details using the expand-collapse pattern. This kept pages from appearing overly complex. We guided people to EI-specific concepts and terms (such as “best weeks”), explained them clearly, and used them sparingly.
+We designed a simple estimator, allowing people to see the impact of their choices. The estimator requests minimal inputs to provide a detailed, useful answer that can help people make better-informed choices.
+Both eligibility requirements and benefit entitlements are based on numbers of weeks. We took great care in presenting numbers to help people successfully differentiate between these.
+The image first shows a section of the original page content with red highlighting around a statement about a maximum of 15 weeks of EI maternity benefits. There is red highlighting around a second statement about a maximum of 61 weeks.
+Below this is a second image from the prototype version of the same content. There is green highlighting around a simple math equation. The equation shows 15 weeks maternity plus 61 weeks of extended parental equals 76 weeks total for Janelle.
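The week math shown in the prototype is deliberately trivial. For the scenario in the image (15 weeks of maternity benefits plus 61 weeks of extended parental benefits):

```python
# Sketch of the prototype's simple week math, for the scenario shown
# in the image: 15 weeks maternity + 61 weeks extended parental.

MATERNITY_WEEKS = 15
EXTENDED_PARENTAL_WEEKS = 61

total = MATERNITY_WEEKS + EXTENDED_PARENTAL_WEEKS
print(total)  # 76
```

Presenting the sum as an explicit equation, rather than leaving the addition to the reader, is what the green highlighting in the prototype calls out.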
+If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.
+Tweet using the hashtag #Canadadotca.
+The Recalls and safety alerts optimization project kicked off in the spring of 2018. In this project the team was looking at food and product recalls and safety alerts on Canada.ca. The goal was to make this content easier for Canadians to find and understand. Designers and researchers from the TBS Digital Transformation Office (DTO) partnered with Health Canada (HC), the Canadian Food Inspection Agency (CFIA), Transport Canada (TC), and Environment Canada (EC). Together we formed a multi-disciplinary team with members from web communications, strategic communications, program areas and IT.
+At the outset of the project, CFIA reported that 89% of visitors to their homepage were using mobile. They also reported that 83% of the traffic to their All Recalls page came from Facebook. Given this reality, the project focused on mobile users first.
+The design of the search filters is what made this project unique. The team took inspiration from industry leaders like Amazon to build a search interface that was tight, intuitive, and easy to use. We made the filters more obvious so that people could easily see and tap them on a mobile device. Like Amazon, we also used auto-complete in the search field to show possible filters that could help narrow the search.
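The auto-complete behaviour described above can be sketched as a prefix match against the available filters. The filter labels below are illustrative, not the actual filter set from the prototype:

```python
# Sketch of the auto-complete behaviour: as the user types, suggest
# filters whose labels start with the typed prefix. The filter labels
# are illustrative, not the prototype's actual filter set.

FILTERS = [
    "Food recalls",
    "Vehicle recalls",
    "Consumer product recalls",
    "Health product recalls",
]

def suggest(prefix):
    """Return filter labels matching the typed prefix, case-insensitively."""
    prefix = prefix.lower()
    return [f for f in FILTERS if f.lower().startswith(prefix)]

print(suggest("fo"))  # ['Food recalls']
```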
+Together the team developed a set of user stories. These reflected the needs of real Canadian parents and what they try to do on Canada.ca. The stories followed a set structure:
+The team conducted baseline usability testing to understand how the live site was performing. We wanted to see how Canadians currently navigate through the site and what issues they face when trying to complete basic tasks.
+We started by designing typical tasks. We consulted our user stories to make the tasks realistic. Next, we recruited 19 Canadian parents from across Canada with a child 12 years old or younger. This is a typical profile of Canadians who check recalls online the most.
+The baseline testing consisted of 17 English-language sessions on mobile phones and 2 French sessions where participants completed their tasks on their computers. In each test session, participants completed 12 tasks.
+Participants used the live Canada.ca website to complete their tasks. The testing measured:
+At the end of baseline testing, the overall findability rate across the 12 tasks was 51%. The overall success rate was 52%.
+The team recorded the baseline usability testing sessions so we could make observations using clickpaths, screenshots, and compelling quotations taken from the videos. We watched select usability testing videos as a group to understand key issues and user behaviours.
+Issues identified in baseline testing:
+The team identified the main issues. We held workshops to rearrange and rewrite content and look at possible design solutions.
+We developed a working prototype in GitHub. Before validation testing, we did some guerilla testing to make sure our solutions worked. This testing helped identify areas in the prototype that needed further improvement.
+The prototype offered the following solutions:
+Image of two phones, labelled "Before" and "After".
+The first phone shows how the original Recalls content displayed on mobile.
+The second phone shows how the redesigned prototype text is much shorter and has more white space. You can see that 3 bullets immediately tell you the product, the issue and what to do. An arrow points to the text with the annotation "Answers not information - Highlight what to do".
+After ironing out any issues from the guerilla testing, the team began the validation round testing. Validation testing is how we confirm if our changes solved the problems uncovered in the baseline testing. 17 new participants completed the same 12 tasks on the redesigned prototype.
+The goal for the validation round of testing in all optimization projects is either 80% success, or an improvement of at least 20 points over the baseline score.
+These targets were exceeded:
+This chart shows the task completion success rates across the baseline and redesigned validation test on the prototype for all 34 participants:
+Baseline measurement at start of project, validation on prototype redesigned by the project team.

| Task | Baseline | Validation |
| --- | --- | --- |
| 1. Google food recalls | 88% | 100% |
| 2. Eggs allergen foods | 63% | 94% |
| 3. Search roast beef | 29% | 82% |
| 4. Related Buckley's recall | 44% | 100% |
| 5. Car seat recall Britax | 50% | 67% |
| 6. EpiPen shortage | 71% | 88% |
| 7. Vehicle search 2003 Honda Pilot SUV | 76% | 94% |
| 8. Stella Artois | 65% | 75% |
| 9. Recalled children's toys | 88% | 100% |
| 10. Lettuce closed notice | 31% | 63% |
| 11. Peanut allergen food | 31% | 94% |
| 12. Skip Tuo convertible high chair | 25% | 94% |
The team derived this set of 4 design principles that appeared to have the most impact on improving success rates:
+If you’d like to see the research findings from this project, let us know. Email us at dto.btn@tbs-sct.gc.ca.
+Let us know what you think about this project. Tweet using the hashtag #Canadadotca.
+In October of 2019, the Treasury Board’s Digital Transformation Office (DTO) identified an opportunity to follow up on a previous optimization project from 2018: Health Canada’s recalls and safety alerts management system (RSAMS).
+In the first project, we had created a prototype search using the Canada.ca search engine. Results from that project validated many of the hypotheses that had informed the prototype’s design, but the search functionality was limited. This meant that the prototype couldn’t be deployed to the RSAMS environment that was in place at the time.
+For this second project, the team was able to experiment with an open source search solution. We also added a search expert to the project team. The intent was to help Health Canada prepare for implementing search functionality in a new Drupal-based RSAMS publishing infrastructure, to be launched in the 2020 to 2021 fiscal year.
+For the DTO, our goal was to learn more about delivering effective specialized search, so we could provide advice and guidance to other GC institutions offering similar search functionality to their users.
+To help us experiment with features and options, we created a search testbed. We got the data for the testbed’s index using a combination of a database extraction (from RSAMS) and open data sets for vehicle recalls, published by Transport Canada on open.canada.ca. The testbed allowed the project team to experiment with a range of options for configuring the search solution and interface. This included features such as facets and filters, refinement mechanisms and feedback, auto-suggestion (typeahead), query correction, highlighting, and more. Applying these features to real data was extremely helpful.
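Faceted filtering, one of the features the testbed was used to explore, reduces to counting values per facet and intersecting filters. A minimal sketch over illustrative records (not the RSAMS schema):

```python
# Sketch of faceted filtering over a recalls index, the kind of feature
# the testbed was used to experiment with. Records and facet names are
# illustrative, not the RSAMS schema.

RECORDS = [
    {"title": "Romaine lettuce recall", "category": "Food", "risk": "High"},
    {"title": "Child car seat recall", "category": "Consumer product", "risk": "Medium"},
    {"title": "2003 SUV airbag recall", "category": "Vehicle", "risk": "High"},
]

def filter_by_facets(records, **facets):
    """Keep only records matching every selected facet value."""
    return [r for r in records
            if all(r.get(k) == v for k, v in facets.items())]

def facet_counts(records, facet):
    """Count how many records carry each value of a facet."""
    counts = {}
    for r in records:
        counts[r[facet]] = counts.get(r[facet], 0) + 1
    return counts

print(facet_counts(RECORDS, "risk"))                 # {'High': 2, 'Medium': 1}
print(len(filter_by_facets(RECORDS, risk="High")))   # 2
```

In the testbed, the same counts drive the facet UI (showing how many results each filter would leave), which is what helps people refine rather than restart their searches.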
+When we started the project, we planned for one round of testing. We planned to reuse tasks and baseline data from the first project, and compare task performance to complete the study. Our optimization studies require a minimum of 16 performances of each task for reliable comparison.
+Implementing a full search solution within a constrained time frame is a complex task. We found ourselves implementing new features, updating the index, and making significant changes to the product as we tested. This was fantastic for learning and experimenting. It is exactly what should be done in a product development process. The drawback was that it meant we didn’t have our minimum of 16 task performances, so we couldn’t use the test data for comparison.
+Specialized search is different from web search. It focuses on queries of a specific structured database, or a collection of related content, which may include both structured and unstructured content sources.
+Unlike web search, all components of the search system should be within the control — or at least the influence — of the product manager. To help institutions deliver and maintain an effective user experience, we wanted to define these various components and the skills and activities required to support them.
+A full search solution combines multiple components, each of which involves design decisions.
+At the end of the project, we gave Health Canada recommendations for the new RSAMS search implementation.
+Our experience with the RSAMS project reinforced our understanding of the complexity of delivering effective specialized search. The key recommendations, summarized here, apply to any specialized search implementation.
+It is important to look at the evidence of how people are using specialized search and design the solution to support this.
+For example, earlier research had shown that RSAMS search behaviour follows distinct patterns:
+A shared understanding of these patterns can help determine how to structure your content. For example, the RSAMS data did not capture brand names or product codes as a separate field. That undermined precision in search results.
+For RSAMS, a significant area of challenge has been that the content is sourced from multiple groups and institutions. Each has its own processes, templates and formats for publishing recalls and other notices related to the specific products it is responsible for monitoring.
+For this project, we had a static dataset. That meant we could make manual additions and deletions to prepare a single, search-specific data source for indexing. Changes made to the data included:
+This “post-processing” of search data prior to indexing was labour-intensive. It was also difficult because of inconsistencies and issues with data quality.
+Ideally, these types of issues should be fixed at the source, rather than patched afterwards. Getting this right requires bringing together the people most knowledgeable about the data or content.
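The post-processing step amounts to mapping each source's records onto one common, search-specific schema before indexing. A minimal sketch, with hypothetical field names (the actual RSAMS and open-data schemas are not described in the source):

```python
# Sketch of pre-index normalization: merging records from differently
# formatted sources into one consistent schema. All field names here
# are hypothetical, not the actual RSAMS or open-data schemas.

def normalize(record, source):
    """Map a source-specific record to a common index schema."""
    if source == "rsams":
        return {"title": record["Title"].strip(),
                "date": record["PublishedDate"],
                "type": record.get("NoticeType", "recall")}
    if source == "vehicles":  # open data set format
        return {"title": record["MODEL_NAME"].strip().title() + " recall",
                "date": record["RECALL_DATE"],
                "type": "vehicle recall"}
    raise ValueError(f"unknown source: {source}")

doc = normalize({"Title": "  Lettuce recall ", "PublishedDate": "2018-11-20"},
                "rsams")
print(doc["title"])  # Lettuce recall
```

Doing this once at the source, inside the publishing system, avoids repeating the labour-intensive clean-up every time the index is rebuilt.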
+Is your search effective? To answer this question, you need to define metrics for search performance and perform routine analytics.
+You need to continually monitor these metrics. They should drive improvements to the relevant search solution component.
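Two commonly tracked search-health metrics are the share of queries returning no results and the share where the user clicked a result. The source does not specify which metrics were recommended; this is an illustrative sketch with a hypothetical query-log format:

```python
# Sketch: two routine search-health metrics -- zero-result rate and
# click-through rate. The query log format is hypothetical.

def zero_result_rate(log):
    return 100.0 * sum(1 for q in log if q["results"] == 0) / len(log)

def click_through_rate(log):
    return 100.0 * sum(1 for q in log if q["clicked"]) / len(log)

log = [
    {"query": "romaine lettuce", "results": 12, "clicked": True},
    {"query": "britax car seat", "results": 0,  "clicked": False},
    {"query": "epipen",          "results": 3,  "clicked": True},
    {"query": "flu shot",        "results": 7,  "clicked": False},
]

print(zero_result_rate(log))    # 25.0
print(click_through_rate(log))  # 50.0
```

A rising zero-result rate, for example, can point at gaps in the index or vocabulary mismatches between queries and content.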
+If your search solution aggregates data and content from multiple sources, data curation is essential. The publishing infrastructure (content management system) should enforce structural consistency, and impose constraints (as in the use of controlled vocabularies for tagging content). Content needs to be monitored for quality. This monitoring should, in turn, inform improvements to structure, guidance and shared assets (such as those controlled vocabularies).
+An effective search solution requires dedicated resources. It needs to involve multiple skill sets in its design and maintenance. This should include subject matter experts, information architects, content designers, developers, data and network specialists, search analysts, and user researchers.
+To quote our search expert:
+“Search is hard. Good search is harder.”
If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.
+Tweet using the hashtag #Canadadotca.
+In 2016, the Government of Canada supported 490,000 full-time students with a total of $2.7 billion in loans. Many students begin or manage their student loan journey through the Student loans and grants pages on Canada.ca. In the fall of 2018, we began a collaborative project to improve the success of students using those pages. The project team included members of the Canada Student Grants and Loans Program team, the Employment and Social Development Canada (ESDC) web team, and the Digital Transformation Office from Treasury Board.
+We limited the project to the Canada.ca pages related to student aid. The National Student Loans Service Centre (NSLSC) is a separate website to which students must sign in.
+While the Government of Canada funds student loans, it is the provinces that manage them. Students use the provincial sites to apply and re-apply. They may never have any interaction with Canada.ca at all. It is not usually until they need to repay, or need help repaying, that they interact with any federal pages.
+For the task of repaying a student loan, the “Maintain your student loan” page on Canada.ca was one of the most used pages. 87% of the visitors arrived via Google. From there, 59% proceeded to the login page of the NSLSC site.
+In the spring of 2018, ESDC ran a usability test of some of the top student task scenarios. We analyzed that data and used a group of its task scenarios as our baseline task performance measure. That study showed that navigating to the content was a real challenge. There were so many layers of topic pages that it took many clicks to reach the actual student loans pages from the home page.
+Another challenge was the way content for grants and loans was divided. Because different program teams manage this content, they divided it into separate pages. This didn’t make sense for students though, as they apply for both loans and grants in a single application form.
+Testing over several years had shown that students could find this single application form about 60% of the time. This project gave us a chance to tackle integrating grants and loans content. This let us provide a clear path to answers by removing layers of confusing program-centric information.
+The project team focused on getting rid of the many pages of program information that didn’t help students with the answers they were looking for. In design workshops we physically cut up topic pages and discarded or moved topics around. This left us with a new, smaller set of pages. Then, we took the stripped-down content and focused on directing students to the provinces to apply, and to the NSLSC to manage their loans.
+We looked at the analytics and previous studies. These showed that students used the Jobs, Benefits, and Money themes to get to content about student loans and grants. We left those routes in place, but we decluttered and shortened them.
+We grouped loans and grants together in the new design, instead of leaving them in separate program sections. We moved program descriptions to the NSLSC site.
+The baseline study showed that it was hard for users to find the NSLSC site and to understand what they could do there. People often seemed to forget the complex name of the National Student Loans Service Centre program. There were many searches on Google and Canada.ca for variations of ‘NSLSC’. To address this problem, the team used the term in the title of the new page: Manage your student loan at the NSLSC. We also added a single primary button to direct students to log in to their NSLSC account.
+The team did research to understand how students were searching for information. They looked at Reddit posts, search data from within Canada.ca and Google search queries and trends. They used what they found to update content so students could find and understand it better.
+Before | After
+---|---
+Student grants and student loans | Student grants and loans
+Student financial assistance | Student aid
+Grace period for six months | 6 month non-repayment period
+One-time payment | Lump-sum payment
+Overall task performance rose from 61% success to 88% when we retested the task scenarios on the prototype.
+This table shows the baseline measurement at the start of the project compared with the validation measurement on the prototype redesigned by the project team.
+Task | Baseline | Validation
+---|---|---
+Canada Apprentice Loan | 80% | 88%
+Part-time | 72% | 100%
+Loan-grant | 61% | 88%
+Update banking information | 15% | 82%
+Paying back student loan | 57% | 100%
+Repayment assistance | 86% | 100%
To better serve Canadians, teams have to remove their program information and focus on providing answers and service. For student loans, that meant guiding users to the provinces and the NSLSC. The Canada.ca pages are only there to fill the answer-gaps between those service points.
+If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.
+Tweet using the hashtag #Canadadotca.
+The Tax filing optimization project kicked off in November 2017 with the goal of making it easier for Canadian businesses to fulfill their GST/HST and payroll obligations. Designers and researchers from the TBS Digital Transformation Office formed a multi-disciplinary team with the Canada Revenue Agency. The team included representatives from CRA’s user experience, web publishing, GST/HST and Payroll programs, as well as call centre agents who have provided direct assistance to Canadians.
+The team began by researching tasks on Canada.ca, using existing data sources.
+The CRA conducts a monthly top task exit survey to find out more about how citizens use their digital services. It asks people who come to CRA pages what they were trying to do and whether they were successful. “File a GST/HST return” and “Calculate payroll deductions” are consistent top tasks.
+Web traffic data showed that the pages for calculating payroll deductions and reporting payroll got over 4.2 million visits in 2017. The topic of GST/HST returns had over 1.3 million visits. Over 88% of people do these tasks on desktop or laptop computers, not mobile devices. This was helpful input for our test planning.
+We analyzed the readability levels of the GST/HST and Payroll pages. Analysis confirmed quantitatively what people know intuitively: web pages about tax obligations are quite hard to read. About 24% of the text on key GST/HST and payroll pages used long sentences, and 17% was written in passive language. The Canada.ca Content Style Guide and plain language experts recommend using short sentences and writing in the active voice.
+We were starting to understand the issues Canadians were facing when trying to fulfill their business tax obligations online. The next step was to confirm our hypotheses with remote moderated usability testing.
+To test our hypotheses, the team developed 16 possible tasks related to filing business taxes. We created a simple online survey so that CRA’s call centre agents and subject matter experts could comment on the tasks. With this input, we narrowed down the list to 10 task scenarios, confident that they were accurate and represented real issues for citizens.
+We used remote moderated testing on desktop and laptop computers to establish the baseline. We tested 17 participants from across Canada (14 English and 3 French). We recruited owners of small businesses, screening for those who were self-employed or sole proprietors, without prior GST/HST or payroll experience.
+Participants were asked to complete all 10 tasks in one hour. If they took longer than 5 minutes to complete a task, we counted that as too long to be considered successful and had them move on to the next question.
+The order of the scenarios was carefully chosen to avoid learning effects, and half the participants were given the tasks in the reverse order. Some tasks had people start on the Canada.ca home page, while others started on the CRA institution page, or on the GST/HST or Payroll topic pages on Canada.ca. An expert moderator conducted the testing.
+During this first round of testing, we convened observation sessions to watch videos of the usability testing. These sessions provided an opportunity for people to better understand how usability testing works. The CRA experts who wrote the web pages saw real people using them, and typically found it eye-opening.
+The results of the baseline test showed that on average, participants could find the right information 62% of the time. However, they were only able to successfully complete the tasks 48% of the time.
+For the prototyping phase, we divided into two working teams, one for GST/HST tasks and the other for payroll tasks. Each team combed through the baseline test report, watched more videos of participants trying to complete their tasks, and looked at the click paths — the sequences of pages that participants went to. This helped show which pages were causing people to go off the “ideal” path. Sometimes this was because the correct link wasn’t clear enough. Other times a different link was more appealing.
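Comparing a participant's click path against the "ideal" path can be sketched as finding the first point of divergence. The page names here are hypothetical:

```python
# Sketch: locate where a participant's click path first leaves the "ideal" path.
# Page names are hypothetical examples, not the actual click-path data.

def first_divergence(ideal, actual):
    """Return (index, page) where the paths first differ, or None if they match."""
    for i, (expected, visited) in enumerate(zip(ideal, actual)):
        if expected != visited:
            return i, visited
    return None

ideal = ["Home", "Taxes", "Payroll", "How and when to remit"]
actual = ["Home", "Taxes", "Payroll", "Payroll accounts"]
print(first_divergence(ideal, actual))  # (3, 'Payroll accounts')
```

Aggregating these divergence points across participants shows which pages pull people off track, whether because the correct link isn't clear enough or a different link is more appealing.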
+One of the primary techniques for improving citizen success was to take long pages of text and separate sub-tasks or topics into steps. This pattern worked because it kept related information together, while allowing people to see the sequence of activities and jump to the section that best suited their needs.
+Two webpages are shown side by side. The page on the left is labelled "Baseline" and shows that the “How and when to remit source deductions” webpage on Canada.ca was extremely long when the project started. An arrow points to the webpage with the annotation "Page was too long for people to find their answer."
+The page on the right is labelled “Redesign” and shows the “How and when to pay (remit) source deductions” page with 5 steps: 1. Overview, 2. Due dates, 3. Make a payment (remittance), 4. Confirm payment (remittance) received, and More information. An arrow points to the steps with the annotation "Tasks grouped into steps made them easy to scan."
+The image shows the much simpler and shorter content of tab 3. Make a payment (remittance). At the bottom of the page is a button with the text “Make your payment”. An arrow points to the button with the annotation "Buttons draw attention to the main action on the page."
+We also found that people who were not familiar with tax rules did not always grasp the term “remit”. They often skipped over links to pages with that content. We could not completely replace the term remit, since it has legal connotations. Instead, we chose to bridge citizen and technical terminology by using both “pay” and “remit” in links and page titles.
+Once the revised design was ready, 25 new participants were recruited to complete the same 10 tasks. We used the same methods as the baseline test to ensure the tests could be directly compared. Our target was either 80% success, or an improvement of at least 20 percentage points over the baseline score.
+Findability rates for the revised content rose from 62% to 90% (+28 percentage points). Task success rates rose from 48% to 76% (+28 percentage points).
+Baseline measurement at start of project, validation on prototype redesigned by project team.
+Task | Baseline | Validation
+---|---|---
+1. Payroll: Pay deductions by Interac | 59% | 73%
+2. GST: Invoice date | 41% | 78%
+3. Payroll: Remittance error – fine | 14% | 42%
+4. GST: Place of supply | 27% | 67%
+5. Payroll: Confirm cheque received | 29% | 83%
+6. GST: Instalments | 24% | 60%
+7. Payroll: When to remit | 76% | 96%
+8. Payroll: Max CPP contributions | 80% | 83%
+9. GST: Remit date (first time) | 82% | 74%
+10. Payroll: Nil remittances | 47% | 100%
+42 total participants
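As a quick sanity check, the overall task success figures reported above line up with the simple average of the per-task rates in the table (equal weighting per task is an assumption):

```python
# Sanity check: overall success rates vs. the average of the per-task rates.
baseline = [59, 41, 14, 27, 29, 24, 76, 80, 82, 47]      # per-task baseline %
validation = [73, 78, 42, 67, 83, 60, 96, 83, 74, 100]   # per-task validation %

avg_base = sum(baseline) / len(baseline)
avg_valid = sum(validation) / len(validation)
print(round(avg_base), round(avg_valid), round(avg_valid - avg_base))  # 48 76 28
```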
+The team summarized the 6 core changes that had the most impact on improving success rates:
+If you’d like to see the research findings from this project, let us know. Email us at dto.btn@tbs-sct.gc.ca.
+Let us know what you think about this project. Tweet using the hashtag #Canadadotca.
+The travel advice and advisories optimization project took place from January to March 2019. The goal was to make it easier for Canadians to find and understand the travel advice and guidance available on Canada.ca. The project team was co-led by the Treasury Board Secretariat Digital Transformation Office (DTO), and Global Affairs Canada (GAC).
+It also included GAC’s partners in delivering travel-related content:
+Analytics told us that Canadians increasingly use their mobile devices to access travel information on Canada.ca. For this reason, both baseline and prototype testing focused on mobile use. Test participants either used their own device, or resized their desktop browser to simulate a mobile interface.
+Travelling has a degree of risk. Most travellers take routine precautions against familiar risks, like theft or injury. The Government of Canada can’t prevent Canadians from travelling, but has an obligation to protect them from harm. We inform Canadians when they should change their routine behaviour or avoid travel altogether to stay safe. It’s critical that we design web pages that clearly and succinctly convey both the nature and the level of risk. To do this, the project team designed a new page template for destinations. The design prioritizes "must know" risks using a new alert pattern and a carefully curated short list of important points.
+There are many pages in the travel section of Canada.ca that provide general advice. This includes guidance on travel documents, travelling with children or pets, travel insurance and vaccinations. Often destination-specific pages duplicated this content. Testing uncovered a persistent challenge where people found general advice and assumed they had all the information they needed to make decisions about travelling. They were missing critical destination-specific advice. To address this we removed duplicate "generic" advice from destination-specific pages. We linked to it instead. We also grouped content differently on the navigation. We increased the visibility of topics such as "Planning your trip" and added a "Before you go" checklist. These changes helped people navigate more effectively between both types of advice.
+In the baseline study, 19 Canadian travellers performed 147 tests on the live Canada.ca travel pages. The overall success rate was 23%.
+The most common problems participants encountered were:
+Intensive workshops with the extended project team helped us to deliver a prototype with significant improvements.
+These included:
+The team tested the redesigned prototype with a new set of 19 Canadian travellers. They performed 146 tests of the same task scenarios as in the baseline testing. Our target was either 80% success or an improvement of at least 20 percentage points over the baseline score. The revised content and design improved findability rates from 49% to 88% (+39 percentage points). Overall task success rose from 23% to 72% (+49 percentage points).
+This chart shows a comparison of task success rates between the baseline and validation tests for all 38 participants.
+Task | Baseline | Validation
+---|---|---
+Indonesia + codeine | 17% | 53%
+Malawi + advice | 21% | 84%
+Spain + lost passport | 37% | 53%
+Cuba + health insurance | 16% | 63%
+Cayman Islands + hurricanes | 32% | 78%
+Travel checklist | 11% | 100%
+London appointment | 12% | 58%
+Costa Rica + yellow fever | 41% | 71%
+Cambodia + advisory | 21% | 37%
+The features of the prototype that had the biggest impact on success rates were:
+If you’d like to see the detailed research findings from this project, email us at dto.btn@tbs-sct.gc.ca.
+Tweet using the hashtag #Canadadotca.
+In recent years, Canada has seen an unprecedented number of fraud cases (footnote 1). Many of these cases target their victims through websites that attempt to look like official sites. This makes it more important than ever to ensure that users can recognize official Government of Canada (GC) websites.
+Consistency in design is a crucial element to help foster trust among users. Our Wayfinding research project suggested that consistent design elements across pages increased trust that users were on official Government of Canada sites.
+The Digital Transformation Office (DTO) conducted a Trust study to better understand which Government of Canada design elements are more strongly associated with trust and confidence in the brand. The study measured and compared respondents' trust in the following:
+The findings demonstrated that the Government of Canada signature, which includes the flag symbol, was essential to the brand, and that the use of the red flag provides the highest level of trust. The Canada.ca domain, as well as a consistent Sign in pattern, were also identified as important to user trust. In contrast, the theme menu button was not essential to user trust.
+We conducted an interactive trust exercise from June 9 to 22, 2022. The exercise had 2,726 respondents, recruited via an invitation on Canada.ca. There were 4 versions of the study, covering both official languages and mobile versus computer layouts. 48% of participants answered from within Canada. 78% responded in English, and 73% completed all the questions.
+For each question, people were shown 2 images with slight variations of brand elements. They were asked to identify which image they found more trustworthy. If their level of trust was the same for both images, they could respond with "same".
+Before fully launching the study, we piloted it with 35 people. These respondents explained their answers via screen-recording sessions. This qualitative stage served both to refine the study design and to get a detailed understanding of rationales and drivers for particular choices. We were particularly interested in people's ability to differentiate between trust and preference.
+These early answers showed that while there is some overlap between trust and preference, most understood the difference. Respondents described their trust choices most often with terms like "safe", "official", "professional". In the closing comments section, they often mentioned the red flag symbol in the upper left corner.
+The study showed that the Government of Canada signature with a red flag symbol and black type, on a white background, has the highest level of trust of the Canada.ca brand. The vast majority of respondents trust this version more than any other version of the GC signature.
+Only 8% of respondents chose the page without the flag as the more trustworthy option. The flag was often mentioned, with comments such as:
+When presented with the official FIP flag symbol in red and in black and white, participants trusted the coloured version of the flag more. According to the study:
+These results were consistent across age and type of device.
+Specifically, the study found that the background of the image should be white, ideally even in dark mode, to make the flag stand out. The black and white version seemed to hit a nerve with respondents, as it was often mentioned in the closing comments. We received comments such as:
+The study asked respondents to choose between URLs using the Canada.ca domain and the gc.ca domain. 60% of participants trusted the Canada.ca URL more than a similar .gc.ca URL. The Canada.ca URL is particularly important to the many people who've learned they should trust URLs more than the design.
+We found that the menu button was not essential to user trust. When shown designs with and without the menu button, 41% of respondents chose "same". This demonstrated that they didn't notice the difference, or were equally trusting of both designs. This was the only question in the study where more people chose "same" than either of the images.
+Signing in to an account is sometimes associated with phishing and spoofing, so we wanted to determine which sign in design is more trusted. We included a study question which displayed images of an account sign in page. One version had a familiar Canada.ca green "supertask" button. The other version had unique blue and red buttons. 43% of the respondents trusted the page with the green buttons. 37% trusted the page with unique buttons.
+Respondents mentioned the importance of consistency and familiarity for sign in pages. They expect a consistent design for the Canada.ca brand to help them avoid fraud.
+In respondents' final open-ended comments, there were many mentions of desiring a better user experience. Users want Canada.ca to be simpler and more functional. We encourage departments to continually improve their web presence using research and evidence. Tools such as the GC Task Success Survey and the page feedback tool can help you gather evidence and decide what to improve next.
+GC Task Success Survey: The GC Task Success Survey is a continuously running web intercept survey that asks questions related to three key elements: task success, ease, and satisfaction.
+Page feedback tool: This tool allows users to provide feedback for the page they are on. It can help departments uncover common issues in their content.
+As a result of this research, we updated the global header guidance in the Canada.ca design system to reflect the importance of the red FIP flag symbol and to provide a consistent spot for the Sign in button. We also continue to work with departments and agencies to ensure full adoption of the Canada.ca design, including the Canada.ca domain. In addition, we're collaborating with departments and agencies to develop a consistent sign in pattern.
+Date: December 21, 2022
+Government departments and agencies are adopting the Canada.ca design to enhance trust and ease of use through a consistent experience. The design includes core brand elements and navigation patterns. Users need to be able to quickly recognize official government information and services to avoid fraudulent sites. They also need intuitive patterns to navigate and get their tasks done.
+One of the core brand elements is the global header, which includes a theme and topic menu. This menu has several pain points. The Digital Transformation Office (DTO) conducted a set of studies to understand the role and impact of the menu on in-site navigation.
+Wayfinding is how users assess where they are in a website and plan a route to follow. On Canada.ca, the menu is a way for users to navigate through the site. It displays top-level pages (themes) to help users find answers and services.
+While many visits start from a search engine result, the website should support navigation no matter where a user enters the site. This is especially important when they start their journey on the wrong page.
+Some departments have expressed concerns about adopting the Canada.ca design as the menu would compete with their existing local menus or not provide any navigational benefit. For example, Statistics Canada has a statistics menu for the same themes that are present in the Canada.ca menu. The benefits that Veterans Affairs Canada offers have similar titles to those found through the theme menu, but they're different benefits.
+We worked with these and other departments to explore navigation and design options to facilitate the adoption of the Canada.ca design.
+The goals of this extensive research study were:
+We conducted 2 large rounds of task performance usability testing and used web analytics to inform our design sprints. Just as in the studies, the analytics showed that breadcrumbs are used more than the menu:
+This suggested that it would be possible to replace the menu without affecting task success. We also noticed that people often used the menu to sign into accounts. This suggested that creating a different sign-in route could further reduce the use of the menu.
+The initial study in 2019 established a baseline measurement of the top tasks for each participating department. The research compared sites with the Canada.ca design against sites from departments with different designs. At that time, Immigration, Refugees and Citizenship Canada (IRCC) had adopted the Canada.ca design, whereas Agriculture and Agri-Food Canada (AAFC), Veterans Affairs Canada (VAC) and Statistics Canada (StatCan) had not.
+20 participants tested 12 tasks from those departments to understand pain points with the menu and give us ideas for potential solutions. The participants included:
+The study involved typical tasks users complete on each departmental site, including a scenario that started on a Google search results page.
+Task | Scenario
+---|---
+Start on a Google results page with links to Canada.ca URLs and to other sites (trust task) | Your friend from France won't need a visa to come visit but will need an Electronic Travel Authorization. How much will it cost to apply?
+Start on a page with the Canada.ca design: Immigration, Refugees and Citizenship (IRCC) study permit page (department navigation task) | Your Egyptian friend wants to come to a Canadian university this year. How much will it cost to apply for a study permit?
+Go to IRCC to check status (Canada.ca design, Sign in task) | Yen applied for a permanent resident card and needs to find out if it has been mailed yet. Find a page where she can enter her user ID and password to get into her Immigration account.
+Agriculture topic for temporary foreign workers (no Canada.ca design navigation between themes) | You finished researching some ideas for the next farming season. Now you need to find out if there is a special way to hire temporary seasonal farm workers from Mexico.
+Agriculture to agriculture topic (no Canada.ca theme navigation) | Your friend is going to start growing wheat this spring in Saskatchewan. Is there crop insurance available there?
+Agriculture to agriculture topic (no Canada.ca theme navigation) | That friend wants to understand the market for organic wheat outside Canada. Are there any 2019 reports with a trade overview and consumption trends that might help them?
+VAC (interdepartmental task) | Nour is an injured veteran working out her family budget. How much can they expect to receive if her spouse Pat is recognized as her daily caregiver?
+VAC to Child Benefit (CRA) (inter-theme navigation) | Like all Canadian parents, Pat and Nour also get monthly Canada Child Benefit payments for their 15 year old son. Will they still get payments next month when he turns 16?
+VAC caregiver to account (no Canada.ca sign in experience) | Nour just switched to a new bank. Is it possible for her to change her bank information online somehow for her veterans payments, or would she have to call?
+StatCan interdepartmental navigation (removed in Comparison) | You searched and found these soybean reports but now you need recent numbers. Find estimated soybean production for 2019 in Canada.
+StatCan interdepartmental navigation (removed in Comparison) | You have a voice message from someone claiming to be a Statistics Canada interviewer. They say your household was selected to complete the Labour Force Survey. Find the special number for survey participants to call so you can check if it's a real survey.
+Compare with UK to evaluate design options: Gov.UK student visa cost (removed in Comparison) | Zak has been accepted into a two-year diploma program in the UK. How much will it cost to apply for a student visa? Start on a Gov.UK page to compare the experience there to the similar task on Canada.ca (for research only, not a measurement task).
+The baseline study gave us insights for solutions to the problems participants experienced, as well as findability and task success rates so we could evaluate proposed designs. As expected, participants had trouble switching between themes on sites not using the Canada.ca design. Another pain point this study highlighted was the lack of a conventional Sign in button.
+To test ways to replace the menu, we built:
+Analysis of footer usage patterns on the home page versus all other pages revealed an opportunity to reduce the set of GC-wide footer links to 3 (“All contacts,” “Departments and agencies,” “About government”). This freed up space to move the theme page links from the menu into the main band of the footer.
+The new footer design also added an optional contextual band for “Contact us” and other contextual rescue links.
+We also created a new layered theme page template that could be accessed through both the breadcrumbs and the footer.
+The layered theme template has a navigation bar on the left and a Most requested band on top. Topics for the theme are below the Most requested band.
+Users commonly expect to find a Sign in button at the top right of the page. This is standard across many websites. It’s become a web convention. We added one to support this top task.
+Once various pieces of the new design came together, we launched another study. This one tested the effectiveness of the prototype and let us finesse the implementation strategy.
+In contrast to other optimization projects where large improvements were expected, in this project we were looking to maintain or, hopefully, increase task success despite removing the menu and changing to the Canada.ca design. We achieved our goal.
+18 participants tested 8 tasks from IRCC, AAFC and VAC. We compared their results with the results from the baseline study. We used prototypes with the new designs we had created. We didn’t include tasks from StatCan in this round because we needed more time to experiment with ways to address their need for a local navigation solution. We wanted to focus first on improving global navigation, since that was less complex. We also added 4 new tasks to evaluate and enhance the new designs.
+Test results | Baseline 2019 | Comparison 2021
+---|---|---
+Users that found the correct answer (overall task success) | 54% | 61%
+Users that found the correct page (findability) | 61% | 66%
+Applying the Canada.ca design to departmental sites with other designs maintained or improved success. In particular, for the task that required navigating from Veterans to Child Benefits, there was a big improvement. For the other tasks that required navigation inside the same theme, the comparison results were similar to the baseline.
+The study showed that it’s possible to remove the menu if it’s replaced by the new footer, the Sign in button, and a layered theme page design. This is a significant change that requires a transition strategy.
+Read our blog post to know how we are rolling this change out.
+Wayfinding research project improves our approach to navigation on Canada.ca
+We maintained task success and helped enable more departments to adopt the Canada.ca design.
+Connect with us if you have questions or would like to see the detailed research findings from this project or the trust study.
+The Weather optimization project kicked off in July 2017. Environment and Climate Change Canada’s Weather Office pulled together a collaborative team from E-Communications and the Meteorological Service of Canada. Together with the designers and researchers from the Digital Transformation Office, we set out to make it easier for Canadians to understand notifications about dangerous weather conditions.
+We started with a discovery phase to understand the current situation, people's needs related to the current content, and problems they may be having. Sources included reviewing previous usability studies, analyzing website traffic patterns and search behaviours, and reviewing phone and support requests. This provided enough insight to determine target audiences, which we then used to talk to people affected by the current content. We also researched weather applications and how they are used.
+We also found that most people were accessing the content using mobile devices. People were checking daily weather forecasts, long-term forecasts, or extreme weather warnings on their smartphones.
+From these discovery insights, the team generated 7 task scenarios:
+We used these task scenarios to test the website with people in our target audiences.
+A printed sheet of proposed solutions being voted on. The sheet is titled "HMW (how might we) better match expectations about icons (people touched them a lot!)" and covers Task 2 and Task 3 (24 hour) and Task 6 (Humidex night-day). Sticky notes are placed on top of the printout, with similar ideas grouped together: one grouping received 14 voting dots, another received 2.
+Before making changes, we established a baseline score through moderated usability testing. 16 participants attended research sessions with their smartphones in Toronto, Gatineau, Montreal and Ottawa. A further 4 participants were tested remotely on their desktop computers. Recruitment focused on individuals who were either employed or self-employed full-time outside the home. Participants were asked to complete the 7 task scenarios on the Canada.ca website.
+We watched videos of the testing sessions together, which helped the entire team understand the problems people had in trying to complete the tasks. We could observe and quantify the behaviours and usability issues that caused task failures. We captured these observations as click paths (to understand how people moved through the site) and documented the problems in detailed issue reports so we could address them during the design phase.
+For the baseline test, the overall findability rate was 35%, and the overall success rate was 33% across the 7 tasks and 140 task trials.
+To solve the long list of issues we captured, the team created a working prototype of a new design on GitHub. We held several workshops to work intensively on the issues that participants experienced.
+Throughout the design process, we tested content with people through small, informal "guerrilla" sessions. These tests revealed problems with our designs that we were able to address before beginning a second round of full-scale moderated usability testing.
+Two smartphones are shown with 2 different versions of a web page. One is labelled "Baseline", the other "Redesign".
+In the "Baseline" page, the title is "Ottawa (Kanata-Orléans) ON” followed by a weather alert box in yellow with the text "Severe thunderstorm watch in effect". An arrow points to the alert box with the annotation "Baseline: Few people realized they could click the warning to see details about the storm”.
+In the "Redesign" page, the weather alert box in yellow has been changed: there is now a small warning icon, the text is now underlined to show it can be clicked, and there is a chevron on the right. This image is animated, and the phone screen slowly scrolls up and down to show the alert is duplicated below the forecast which is lower on the page. An arrow points to alert text with the annotation "Redesign: Everyone noticed and clicked the warning - either at the top or in the forecast".
+Two smartphones are shown with 2 different versions of a web page. One is labelled "Baseline", the other "Redesign".
+The "Baseline" page is animated and scrolls to the bottom, past the Current Conditions, Forecast, Averages and Extremes, etc. The page is very long and the commonly used links are found at the end. An arrow points to the list of links with the annotation "Baseline: Huge page to scroll. Participants didn't find important task links all the way at the bottom of the page".
+The "Redesign" page is also animated, and scrolls to just underneath the Current Conditions. This part of the page has been changed so that the content is in two columns, and the links are now easy to find. An arrow points to the list of links with the annotation "Redesign: Content has been streamlined. Short page. All participants found important links next to forecast".
+Once the revised design was ready, 20 new participants were recruited to complete the same 7 tasks. Our target was either 80% success, or an improvement of at least 20 points over the baseline score. The revised content and design improved findability rates from 35% up to 82%. Overall task completion success rose from 33% to 72%.
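The pass criterion described above (80% success, or at least a 20-point improvement over baseline) can be written as a short check. This is our own sketch; the function name and decimal encoding of the percentages are illustrative, not part of the project:

```python
def meets_target(baseline: float, validation: float) -> bool:
    """Pass criterion from the study: 80% overall success,
    or an improvement of at least 20 points over the baseline."""
    return validation >= 0.80 or (validation - baseline) >= 0.20

# Findability rose from 35% to 82%; task success from 33% to 72%.
print(meets_target(0.35, 0.82))  # True: 82% clears the 80% bar
print(meets_target(0.33, 0.72))  # True: +39 points clears the 20-point bar
```

Both measures met the target: findability cleared the 80% bar outright, and task success improved by well over 20 points.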
+This chart shows the task completion success rates for the baseline test and the validation test of the redesigned prototype, across all 40 participants:
+Baseline measurement at the start of the project; validation on the prototype redesigned by the project team.
+| Task | Baseline | Validation |
+| --- | --- | --- |
+| 1. Navigate to local forecast | 87% | 100% |
+| 2. Understand & find precipitation likelihood | 19% | 94% |
+| 3. Snow statement: Driving | 25% | 75% |
+| 4. Alert: Open thunderstorm watch for hail | 6% | 63% |
+| 5. Radar: Decision based on current precipitation | 0% | 31% |
+| 6. Humidex: Cancel soccer game | 88% | 81% |
+| 7. Lightning map: Clear golf course | 6% | 63% |
+The team derived a set of 6 design principles that appeared to have the most impact on improving success rates:
+Email us at dto-btn@tbs-sct.gc.ca if you have questions, or would like to know more about this research.
+Let us know what you think about this project. Tweet using the hashtag #Canadadotca.
+