CUTGroup #28 – Chicago Open Data Portal

CUTGroup Test in Progress on mobile device

For our twenty-eighth Civic User Testing Group (CUTGroup) session, we tested the newly redesigned homepage for the City of Chicago’s Open Data Portal. The Open Data Portal allows users to find datasets about the City of Chicago. The City of Chicago Department of Innovation and Technology (DoIT) is working with Socrata to redesign the Open Data Portal. The redesign, currently focused on the homepage, aims to make finding datasets easier and to showcase the data and technology initiatives and applications created with open data.

The main goal of this test was to understand how testers with some familiarity with the data portal (even minimal) respond to the changes made to the homepage. We wanted to capture how residents with different levels of digital and data skills search, and what homepage structure makes searching easiest. Lastly, we wanted to see how testers responded to the other content about the City of Chicago DoIT’s programs and tech initiatives.

Segmenting

On September 22, 2016, we sent an email to 1,172 CUTGroup testers who live in Chicago, and a text message notification to our SMS-preferred testers. We wanted to know if they would be available for an in-person test on September 28. When segmenting our testers, we were interested in testing on a variety of devices. We wanted to include testers who had used the Open Data Portal in the past and those who never had. We also wanted to include testers with all levels of data experience to see how user-friendly the search functionality is.

Screening Questions

During our initial call-out for testers, we heard from 60 CUTGroup members. We asked how familiar CUTGroup members are with the City of Chicago’s open data portal and learned:

5 – Very familiar 8% (5)
4 – Familiar 22% (13)
3 – Neutral 20% (12)
2 – Not very familiar 30% (18)
1 – Not at all familiar 20% (12)

23 out of 60 respondents (38%) had used the Chicago Open Data Portal before. 6 of those respondents specifically used the open data portal to search for crime data.

Test Format

For this in-person test, each tester was paired with a proctor who was either a City of Chicago DoIT employee involved in the project or a proctor from the CUTGroup proctor program. Proctors asked testers to complete tasks on the Open Data Portal beta, observed the results, and took notes on the testers’ experiences and feedback. Testers used either laptops that we provided or their own mobile devices. We tested with 6 testers on laptops, 6 testers on Android mobile devices, and 5 testers on iOS mobile devices.

Results

Contributions to this test were made by our CUTGroup proctors. Erik Hernandez, Peter McDaniel, Christopher Gumienny, Steven Page, and April Lawson helped facilitate this test. CUTGroup proctor Christopher Gumienny also helped write much of the analysis in this report. The CUTGroup Proctor Program trains highly active CUTGroup testers in usability (UX) testing and CUTGroup test processes.

On September 28, a very rainy day, 17 CUTGroup testers tested the Chicago Open Data Portal beta at the Woodson Regional Library in the Washington Heights neighborhood.

We shared this presentation, highlighting top results from the test, with the City of Chicago DoIT team.

Testers believed that the redesigned Chicago Open Data Portal homepage is designed for the general public and residents, but many expected access to city services.

At the end of this CUTGroup test, we asked testers if they felt they were the target audience for the new, redesigned Open Data Portal, and 13 testers (76%) said “yes.” Our testers included individuals who had no, little, or some familiarity with the current data portal, and 14 testers mentioned that this site appeared to be designed for the general public and residents.

3 testers specifically said the Open Data Portal is designed for people with moderate to high technical savviness. 1 tester mentioned it was for people interested in data analysis. Another tester said it was built for business owners, while another said it was built for developers.

One concern is that testers, even after reviewing the website, still expected more access to city services and resources that could be found on the official City of Chicago website. There are some possible solutions that would either provide this general service information to residents or better define the purpose of the website.

Identify access points to city services or action steps for residents.

These access points to city services, which residents expected when first viewing the Open Data Portal, could be created at several levels. A very simple option is to add a link to the City of Chicago website in the navigation bar or footer of the homepage.

A second, more complex level is the category level. For instance, if a user clicks on the “Events” category, there could be a link to the City of Chicago events page before users even begin to explore datasets. This level of access would require reviewing all of the categories and understanding whether users associate city services with those category types.

The most complex level is at the dataset level. Since we conducted usability testing primarily on the homepage and the action of searching for datasets and reviewing information, this would require additional user feedback. The suggestion is to connect relevant datasets to the appropriate city services. Therefore, if a user is reviewing data about potholes, as one example, there could be a link to make a pothole service request.

Utilize the header image to clarify the purpose of the site.

Data Portal Beta Homepage Header

“What’s this sales thing? It makes it look like it’s advertising stuff,” GPBlight (#4) said in response to the header images. When the homepage first loaded, a bug was immediately noticeable: the header images would stack and cause the page to jump on load. While that is a straightforward fix, we identified a larger opportunity to share information about what the website does. Very few testers reacted to the initial “data sale” image, and those who did had negative responses. The first header image, before rotating to either the crime or Divvy data, should define the purpose of the site as clearly and directly as possible. The current “data sale” language and images confuse some users.
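
For implementers, the stacking bug is worth a note. A common way to avoid this kind of load-time jump is to reserve the banner’s height up front and only start the rotation once the images have actually loaded. Below is a minimal sketch of that pattern; the selectors, banner height, and rotation interval are assumptions for illustration, not the portal’s actual code.

```typescript
// Sketch: wait for all banner images to load before starting the rotation,
// so the header never stacks or shifts the page on load.

function preloadImages(urls: string[]): Promise<void[]> {
  return Promise.all(
    urls.map(
      (url) =>
        new Promise<void>((resolve, reject) => {
          const img = new Image();
          img.onload = () => resolve();
          img.onerror = () => reject(new Error(`Failed to load ${url}`));
          img.src = url;
        })
    )
  );
}

async function initHeaderRotator(container: HTMLElement, urls: string[]) {
  // Reserve the banner's final height up front so content below never jumps.
  container.style.minHeight = "320px"; // assumed banner height

  await preloadImages(urls);

  let current = 0;
  container.style.backgroundImage = `url(${urls[current]})`;
  setInterval(() => {
    current = (current + 1) % urls.length;
    container.style.backgroundImage = `url(${urls[current]})`;
  }, 8000); // rotate every 8 seconds (assumed interval)
}
```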

Searching for datasets was not always intuitive when using search or categories and testers often had to try multiple times to complete a task.

We asked testers to search for six different datasets; one task was open-ended, while for the rest, proctors asked testers to find specific datasets: building permits, a map of crime data, recently fixed potholes, active business licenses, and Chicago’s community boundaries map. We were interested to see how testers would search and find the datasets and what issues they came across during the process. While 12 testers (70%) indicated that finding these datasets was “easy” or “very easy,” we observed that it often took testers multiple tries to find the correct dataset, and some testers could not find the proper result or chose a relevant, but not requested, dataset.

Categories were often chosen over the search bar, but technical bugs and design influenced those choices.

Only 1 tester out of 17 (6%) used the search bar consistently for all of the dataset search tasks, whereas 7 testers (41%) used categories for all of them. The remaining testers used both search and categories, depending on the type of dataset we asked them to find.

On mobile devices, the search bar’s magnifying glass button was not working properly: when testers attempted a search, nothing would appear, so testers thought the search was broken or that there were no search results.
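
For readers implementing a similar fix, the usual culprit is a search icon that only responds to mouse events or silently ignores input. Here is a hedged sketch of the kind of wiring involved; the element IDs and results URL are invented for illustration and are not Socrata’s actual markup.

```typescript
// Sketch: make the magnifying-glass button respond to both mouse and touch
// input, and let the Enter key submit too.

const searchInput = document.querySelector<HTMLInputElement>("#search-input")!;
const searchButton = document.querySelector<HTMLButtonElement>("#search-button")!;

function submitSearch(): void {
  const query = searchInput.value.trim();
  if (query.length === 0) {
    return; // nothing to search for
  }
  // Navigate to the results page; this URL pattern is an assumption.
  window.location.href = `/browse?q=${encodeURIComponent(query)}`;
}

// Some custom icon elements swallow the synthetic "click" on mobile;
// binding "touchend" as well covers that case.
searchButton.addEventListener("click", submitSearch);
searchButton.addEventListener("touchend", (event) => {
  event.preventDefault(); // avoid a duplicate synthetic click
  submitSearch();
});

// Let the keyboard's "go"/Enter key work too.
searchInput.addEventListener("keydown", (event) => {
  if (event.key === "Enter") submitSearch();
});
```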

When doing the initial review of the page, only 1 tester mentioned the search bar, which indicates that it is not a prominent part of the page. When asked about what actions they knew they could do from reviewing their page, only 2 testers mentioned the search bar. When asked to take their first action, 3 testers (18%) searched whereas 10 (59%) clicked on a data catalog category.

Data Portal Homepage search/categories

We understand that many of these testers did not have much experience with the data portal, and the data catalog categories are useful for exploration. Nonetheless, we realized that the search bar was not very prominent for testers. Suggested improvements included a larger font, higher contrast, and a more prominent location on the page.

Categorization needs to be more intuitive and filtering a higher priority in the user experience, or search needs to be more flexible.

While testers knew where to find building permit and crime data, finding recently patched potholes, active business licenses, and community boundaries was more challenging. In one example, 10 out of 11 testers who found the active business licenses dataset had explored other categories first.

Testers expected to find the potholes data in these categories: Public Safety, Environment, and Sanitation. Testers expected to find active business licenses in Administration and Finance. Lastly, testers looking for community boundaries looked in the Community category.

Some testers did use the help text on the categories to decide which category to choose, but after choosing one it was difficult to decipher the order of results and whether the category contained the relevant dataset. Testers did not use the filters on the search results page; after reviewing a few datasets without identifying a relevant one, they would often fall back on search or try another category. Giving users an understanding of the order of results, and then the ability to filter easily to the most appropriate ones, would improve this experience. Improving the filter functionality would also make it easier to place datasets in multiple relevant categories and give users ways to find data more quickly.

We also witnessed that testers did not always distinguish between the types of data they were choosing: dataset, data lens, filtered view, or map. This was evident when we asked testers to find a map of crime data and they chose the dataset or filtered view results that appeared higher on the results list.

Data Portal Search Results Page

A primary search method should be defined for the user. Currently, users can choose a category or search, but both have their challenges. Outlined above are results and improvements for the categories, since that was the prominent way testers searched: for each task, 11 or more testers used the categories to complete their search. If the search box were more prominent on the page and functioned better on mobile, we could determine whether search is a better way for testers to complete these tasks. We did learn that testers had difficulty choosing search terms, and spelling errors caused stopping points in the experience. The bug on mobile also caused a stopping point that forced testers to use categories. If search becomes the primary method of finding datasets, it should be flexible enough to account for spelling errors and still find relevant results.
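
To picture that last suggestion, spelling-tolerant search can be as simple as ranking dataset titles by edit distance to the query, so a misspelled term still surfaces the intended dataset. This is only an illustrative sketch, not Socrata’s search implementation; the dataset titles below are examples.

```typescript
// Sketch: rank dataset titles by Levenshtein edit distance to the query,
// so "bussiness licences" still surfaces "Business Licenses".

function editDistance(a: string, b: string): number {
  // dp[i][j] = edits to turn the first i chars of a into the first j chars of b.
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,       // deletion
        dp[i][j - 1] + 1,       // insertion
        dp[i - 1][j - 1] + cost // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

function fuzzySearch(query: string, titles: string[], maxResults = 5): string[] {
  const q = query.toLowerCase();
  return titles
    .map((title) => ({ title, score: editDistance(q, title.toLowerCase()) }))
    .sort((x, y) => x.score - y.score)
    .slice(0, maxResults)
    .map((r) => r.title);
}

// Example: a misspelled query still ranks the intended dataset first.
const titles = ["Business Licenses", "Building Permits", "Crimes - Map", "Pothole Patching"];
console.log(fuzzySearch("bussiness licences", titles));
```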

Testers liked the “Digital Chicago” and “How to Videos” resources on the homepage, but the “Chicago apps” section was seen as the most relevant.

Testers reviewed the other sections of the homepage, including “Digital Chicago,” which showcases recent articles by the City of Chicago’s Department of Innovation and Technology; “How to Videos,” which show how to use the data portal and other tools; and “Chicago apps,” tools built with open data that help people visualize data (OpenGrid) or connect to city services (311 Service Tracker).

48% of testers thought the “Digital Chicago” section was “relevant” or “very relevant,” but testers felt that the name “Digital Chicago” did not mean much to them (calling this section “news” could be an improvement), they had questions about what the articles meant, and the section lacked descriptive content for residents.

66% of testers thought the “How to Videos” section was “valuable” or “very valuable,” and many testers liked receiving information via video, although there was a consensus that the videos should be shorter and that users should know they will be directed to YouTube.

76.5% of testers were likely to use the apps under the “Chicago apps” section because they saw them as useful for residents. 15 of our 17 testers (88%) were not familiar with these apps before visiting the website. The “Chicago apps” section met the expectations of testers who had initially searched for or were interested in city services. The positive reaction to these tools increased the likelihood of testers using the Open Data Portal in the future.

Data Portal Homepage Chicago apps section

For the resident user, we would suggest placing the “Chicago apps” section higher on the page because it relates to their needs. To avoid confusion between the Open Data Portal and the City of Chicago website, adding a descriptive sentence about what “Chicago apps” is and why it’s on this page would be valuable. Sharing that these tools are built with open data would convey the importance of the open data initiative while giving residents tools they could use in their daily lives.

Screenshot of misaligned layout on mobile devices

Next steps

Based on this CUTGroup test, work is already in progress to make changes that respond directly to our CUTGroup testers’ feedback. DoIT and Socrata are considering changing the Open Data Portal’s layout to include:

  • Featuring the “Chicago apps” section higher on the homepage
  • Changing the design of the search box to be more prominent
  • Distinguishing between sections to make each section more apparent and separate from the next
  • Improving the layout to work better on mobile devices

DoIT is also thinking about creating shorter, more digestible tutorial videos for the “How to Videos” section. Finally, the banner will be reviewed and redesigned to be more user-friendly on all devices.

We look forward to future iterations of the Open Data Portal, and to seeing how resident feedback is incorporated in that process.

Final Report

Here is a final report of the results:

 

Here is the raw test data:

CUTGroup #28 Open Data Portal

CUTGroup #24 – OpenGrid

Our CUTGroup proctor, Nathalie Rayter, wrote the final analysis report for this test. The CUTGroup Proctor Program trains highly active CUTGroup testers in usability (UX) testing and CUTGroup test processes.

CUTGroup #24 - OpenGrid

For our twenty-fourth Civic User Testing Group (CUTGroup) session, we tested OpenGrid, an open-source interface developed by the City of Chicago that allows residents to search for, interact with, and visualize the City of Chicago’s datasets.

“OpenGrid has an intuitive user interface that’s highly responsive and efficient. The application is accessible from any devices (e.g., mobile, laptop, tablet or desktop). It’s unique, functional, and an easy to use tool to find city data. It extracts city data and displays it in an illustrative form which makes it easier for a user to search and understand the information.”
– City of Chicago Department of Innovation & Technology (DoIT)

The City of Chicago DoIT conducted some user research and testing of OpenGrid before this CUTGroup test, but it tended to be with high-capacity users or City of Chicago staff. The DoIT department wanted to better understand how residents with different levels of data familiarity and digital skills might use OpenGrid.

“Programmers, however, make up a very small portion of the overall population of a city – as a platform grounded in open data, OpenGrid’s mission is to be accessible by anyone who wishes to learn more about their city…

CUTGroup’s process provides a vital link between developer and user that the civic tech world sometimes lacks.  It’s one of the most human crowd-sourced enhancement mechanisms that a civic app developer could have in her toolkit.”
Sean Thornton, Program Advisor for the Ash Center’s Civic Analytics Network and writer for Data-Smart City Solutions

Segmenting

On April 12, we sent out an email to 1,132 CUTGroup testers who live in Chicago. We wanted to know if they would be available for an in-person test on April 20, 2016. When segmenting our testers, we were interested in including testers who described themselves as not being very familiar with using datasets in their personal or professional lives. We also wanted to include testers who had varying degrees of familiarity with what was happening in their neighborhoods. Lastly, we wanted to include testers with different types of devices.

Screening Questions

During our initial call-out for testers, we heard from 98 CUTGroup members. We received a lot of good information just from the screening questions.

  • 38% of respondents used the Chicago Data Portal before
  • 38% of respondents use or work with datasets in their professional lives, but only half of these respondents said they used the data portal in the past

We were looking to include 20-25 testers for this test. We wanted to test with half of them on laptops and the rest on a variety of mobile devices.

Test Format

For this in-person test, each tester was paired with a proctor who was either a City of Chicago DoIT employee or someone involved with the project (this test happened before we formalized a proctor program). Proctors guided testers through multiple tasks on OpenGrid, observed the results, and took notes on the testers’ experiences and feedback. This test also utilized A/B testing: 13 testers (“A” testers) began on http://opengrid.io/ (OpenGrid’s homepage), and the other 10 testers (“B” testers) began on http://chicago.opengrid.io/opengrid/ (the app interface). Testers used either laptops that we provided or their own mobile devices.

"A" testers reviewed this informational homepage

“A” testers reviewed this homepage.

"B" testers began directly within the OpenGrid app

“B” testers began directly within the OpenGrid app

Results

On April 20, we tested OpenGrid with 23 CUTGroup testers at the Chicago Public Library Legler Branch located in the West Garfield Park neighborhood.

We shared this presentation, highlighting top results from the test, with the City of Chicago DoIT team.

Homepage

Through the A/B format of this test, we learned how important the context from the OpenGrid homepage is. Testers who started directly in the app were typically more confused about next steps or misunderstood what OpenGrid does. We did hear, though, that there is a lot of information on the OpenGrid homepage, and parts of the homepage spoke to different audiences (especially technical audiences), which made testers feel that this website was not necessarily targeted to them.

“I wouldn’t understand this unless I was in technology. ‘Open source’ is definitely for the ‘tech folks’ and does not matter to me.” -Yoonie, #A18

7 out of the 13 “A” testers who started on the OpenGrid homepage said the website was targeted to Chicago residents, whereas there was much less consensus about the target audience among the “B” testers.

Testers who started on the OpenGrid homepage typically clicked “Launch” (7 out of 13 testers clicked on this button) as their first step. “B” testers did not have a clear sense of what to do first, and therefore, did more exploratory actions to learn more about what OpenGrid does.

Ease of use

As the test designer, I was not prepared for how difficult some of the tasks would be to complete on both mobile and laptop devices. Test sessions typically lasted over an hour, and the majority of testers said that the tasks were “difficult” or “very difficult” to complete. We were lucky to get great, actionable feedback from our CUTGroup testers, which I translated into open GitHub issues on the City of Chicago’s repository.

Overall, how easy do you think it is to use the OpenGrid website?

Group A
5 – Very easy 0%
4 – Easy 23% (3)
3 – Neutral 23% (3)
2 – Difficult 31% (4)
1 – Very difficult 23% (3)

Group B
5 – Very easy 0%
4 – Easy 20% (2)
3 – Neutral 40% (4)
2 – Difficult 10% (1)
1 – Very difficult 30% (3)

We heard a number of improvements that the OpenGrid team could make to improve searching for, finding, and filtering datasets. 13 testers recommended that changes should be made to the OpenGrid search tools to make it easier for users to find the information they are looking for. 7 of these testers thought that the search bar should respond to addresses/zip codes or tags like “311.” 3 testers commented that the existing search filters are too complicated and that they should be simplified. Paloma (#B8) says, “Commonly used queries was easy; all other filters made it difficult.”
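
To make the first of those suggestions concrete, a search bar can inspect the raw query and route ZIP codes, addresses, and shorthand tags differently from plain keywords. The sketch below is illustrative only; the tag list, patterns, and return shape are assumptions, not OpenGrid’s actual behavior.

```typescript
// Sketch: classify a raw query so ZIP codes, addresses, and tags like "311"
// can be routed to the right kind of search instead of failing as keywords.

type QueryKind = "zip" | "address" | "tag" | "keyword";

const KNOWN_TAGS = new Set(["311", "911", "cta", "divvy"]); // assumed examples

function classifyQuery(raw: string): { kind: QueryKind; value: string } {
  const q = raw.trim();
  if (/^\d{5}$/.test(q)) {
    return { kind: "zip", value: q }; // exactly five digits: treat as a ZIP code
  }
  if (/^\d+\s+\S+/.test(q)) {
    return { kind: "address", value: q }; // leading house number: likely an address
  }
  if (KNOWN_TAGS.has(q.toLowerCase())) {
    return { kind: "tag", value: q.toLowerCase() };
  }
  return { kind: "keyword", value: q }; // fall back to plain keyword search
}

// Examples:
console.log(classifyQuery("60637"));          // zip
console.log(classifyQuery("311"));            // tag
console.log(classifyQuery("123 N State St")); // address
console.log(classifyQuery("potholes"));       // keyword
```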

7 testers mentioned that they would improve the descriptions and instructions that orient the user to the OpenGrid interface. 2 of these testers recommended adding instructions that suggest how a user would interact with the page, such as suggesting a search or having sample questions. For example, Cleveland54 (#B3) says, “The advanced search panel should be pushed down with a note at top saying ‘welcome, you can search blah blah here.’ The landing page should have a description of the website and then launch to the site.”

We also saw that there needed to be a stronger connection and responsiveness between the filters and the map. Testers made changes to their search criteria in the advanced search panel and expected to see those changes reflected on the map. We also saw that testers would move the map (rather than create a map boundary using the tool) and then use advanced search, expecting the advanced search panel to be connected to their map view and return results for that location.
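
The behavior testers expected could be summarized as: the advanced search should treat the visible map as the default search boundary. Here is a minimal sketch of that coupling, using simplified stand-ins for OpenGrid’s map and query layers rather than its actual API.

```typescript
// Sketch: when no boundary has been drawn, fall back to the map's current
// viewport as the search area, and re-run the search when the map moves.

interface Bounds {
  south: number;
  west: number;
  north: number;
  east: number;
}

interface MapView {
  getBounds(): Bounds;                  // the viewport currently on screen
  onMoveEnd(handler: () => void): void; // fires after the user pans or zooms
}

let drawnBoundary: Bounds | null = null; // set when the user uses the draw tool

function runQuery(dataset: string, bounds: Bounds): void {
  // Stand-in for the real query layer: constrain results to `bounds`.
  console.log(`Querying "${dataset}" within`, bounds);
}

function search(map: MapView, dataset: string): void {
  // Prefer an explicitly drawn boundary; otherwise use what the user can see.
  const bounds = drawnBoundary ?? map.getBounds();
  runQuery(dataset, bounds);
}

function wireUp(map: MapView, dataset: string): void {
  // Re-run the active search whenever the map moves, so results follow the view.
  map.onMoveEnd(() => search(map, dataset));
  search(map, dataset);
}
```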

Language

The biggest theme we saw throughout this test was that testers faced numerous challenges with the language on OpenGrid. From the homepage review, where testers thought that the language was targeted to the tech community, to the search and filtering functionality, the biggest improvement that could be made to OpenGrid is including more accessible language.

“It needs to be user-friendly for all Chicago citizens – terms like ‘geo-spatial filters’ don’t mean anything to most users. It sounds deep, but gets people nowhere. This isn’t Star Trek over here.” -Renee54, #A4

Testers also faced challenges viewing the datasets and could not always easily distinguish what the field types meant. Searching required testers to know the exact field name when filtering results down to more relevant information.

Our biggest recommendation to the OpenGrid team was to take time to review the site and incorporate plain language into content, datasets, and functions.

Next Steps

Once the CUTGroup test was completed, we updated the City of Chicago’s GitHub repository with all pertinent issues representing the top challenges our CUTGroup testers faced. The City of Chicago DoIT continues to work in the open: inviting developers to participate in open calls, sharing notes on status updates, and documenting their current and future work.

The recent OpenGrid v.1.2.0 release addressed issues that directly came from our CUTGroup testers:

The latest release makes OpenGrid easier to use: it contains friendlier language, an improved user interface that highlights the more important features while deemphasizing the more technical options, and fewer mouse clicks to see data.

We are excited about opportunities through the CUTGroup to do usability testing on more data-focused websites and applications. While many testers found OpenGrid difficult to use, the 11 testers who said “yes,” they liked OpenGrid, appreciated being able to access new information and data that they were not aware of before this test. This OpenGrid test is an example of how we can continue to learn how residents interact with data, what their potential needs are in using open data, and how to create better user experiences around understanding and using data.

Final Report

Here is a final report of the results:

Here is the raw test data:

CUTGroup #24 - OpenGrid

CUTGroup #21 – Digital Skills

CUTGroup #21 Focus Group Session

For our twenty-first Civic User Testing Group (CUTGroup) session, we conducted focus groups to have conversations with residents about their access to digital skills trainings and resources. We wanted to see if residents know about the resources in their own neighborhoods and how they prioritize gaining new skills that center around technology.

This was a different topic for a CUTGroup test, but as we build more technology, we saw incredible value in talking to people about their digital skills. From this test, we wanted to understand:

  1. How people talk about digital skills in the context of their lives and goals
  2. How much they prioritize improving their digital skills
  3. If they know of resources available to them or have used them
  4. How easy or challenging it is to access or take advantage of those resources
  5. Challenges that people face in accessing the Internet and technology and reaching their goals

We wanted to use this information to shape the new Connect Chicago website and gather qualitative information on how Chicago residents think about and deal with these issues. Connect Chicago aligns citywide efforts to make Chicago the most skilled, most connected, most dynamic digital city in America. The Connect Chicago network includes more than 250 locations offering training, devices, and Internet access, and helping residents engage with technology. Denise Linn, Smart Chicago’s Program Analyst, runs the Connect Chicago initiative and was key in designing this test, writing questions, and helping take notes during the sessions.

This in-person test took place at Literacenter, a collaborative workspace dedicated to literacy organizations, located at 641 W. Lake Street. We chose to test here because it is a comfortable and flexible environment for testing, and Smart Chicago is a member of Literacenter!

Segmenting

On October 20, we sent out an email to 941 CUTGroup testers who are Chicago residents. We wanted to know if they would be available for an in-person test on October 28 for about 45 minutes. We asked screening questions to gather information about how comfortable people felt using technology, whether or not they participated in digital trainings, and what types of skills they wanted to learn.

We looked through all of the responses and wanted to choose testers who did not have advanced digital skills. This meant not selecting testers who had coding skills, had advanced technology related degrees, or used sophisticated software systems for work or personal use. We wanted to reach people who had lower skill sets and might be interested in additional trainings or resources to improve their skills. We also thought testers would be more comfortable if they were grouped with others who were close to their own skills level. 14 CUTGroup testers participated in our focus group sessions.

Responses to Screening Questions

71 testers responded to our screening questions. Here are a few things we learned:

  • 28% of respondents said it is “Challenging” to use technology or learn new skills
  • 94% of respondents “Agree” or “Strongly agree” with this statement: “I feel comfortable using computers & technology”; the skills mentioned ranged from using email to coding
  • 96% of respondents “Agree” or “Strongly agree” with this statement: “I want to learn new computer & technology skills”
  • Only 42% of respondents “Agree” or “Strongly agree” with this statement: “I am familiar with where computer & technology resources are in my community”
  • 70% of respondents have participated in a computer or technology training class or program

Test Format

When designing this test, we chose to conduct focus groups. We were worried that in one-on-one interviews we (as interviewers) would influence the responses, and we were interested in participants talking with one another about their experiences. We thought there would be a lot to gain from those interactions, which was worth the risk of participants influencing each other. For example, group interactions could capture a sense of community expectations about technology resources, as well as the language and framing testers use while conversing about digital skill-building in Chicago. As the moderator, I asked questions, ensured everyone had a chance to talk, and kept the conversations from turning negative.

Before the focus groups, all testers completed a pre-survey questionnaire about the technology tools that they used. This helped us capture individual responses before conducting the focus groups. We opted to ask many of the more personal, targeted questions about skill levels during this individual pre-survey so as to formally capture that data and avoid putting testers on the spot during the focus group. For the focus groups, I used this script to guide the conversation, although we asked additional questions depending on where the conversation went.

Results

Pre-survey

In the pre-survey, we learned what technology testers are comfortable using, what they want to do better, and what skills they are or are not interested in learning. All of our results from this pre-survey can be found here.

We learned that testers felt most comfortable with these tools and skills: email, creating a text or slideshow document, search engines, shopping online, and using Facebook and other social media outlets. We also learned that testers wanted to learn how to (better) create a spreadsheet, use data visualization software, or code.

Focus group #1

Our first focus group had 5 testers, and we began the conversation with how these testers use technology in their own lives, either personally or professionally:

CUTGroup Test Discussion on Digital Skills

Tester #16, “Graphic Artist,” shared that he uses a laptop for online banking. He used to be in the graphic design industry and sometimes freelances, but hasn’t learned the recent versions of graphic design software.

Tester #15, “techgeek,” uses mobile delivery apps like GrubHub and Caviar.

Tester #12, “GF,” works for a Chicago River kayaking company and uses Apple products for work but Windows products for personal use; collaborative tools like Google Docs help them transition between those platforms.

Tester #17, “Nonchalant,” responded to this question with “My job is to go to school!” but mentioned that he checks his email frequently.

Tester #14, “Rogers Park,” told us that he works in retail, so he doesn’t use technology much for work, but appreciates this because he can interact more with people. Outside of work, he uses the Internet to manage his bank account, finances, and retirement funds, and “stares at social media.”

Family networks rely on one another to teach digital skills.

Our conversation with this group focused a lot on how we use technology in a family setting. This started with Tester #14, who does not appreciate that everyone in his family is always connected, and continued with how the other testers see technology being used in their families. While we heard many experiences of technology creating a disconnected feeling because family members were on their own devices, we also learned how teaching technology was a family activity.

Tester #17 helped his dad understand Facebook, and “it was hard.” Now Tester #17’s dad is on Facebook and tags him “20 times a day!” A few testers shared their experiences teaching their parents to use social media, but two testers also had parents with more advanced technology skills who taught them new skills, like Tester #12’s mom, who builds her own computers.

Tester #16 bought his father an IBM computer, then an iMac, and tried to teach him how to use the Internet so the family could contact him in Puerto Rico more easily. Tester #16 found teaching his father extremely challenging and thought it might be “too late” for his father to learn how to use this hardware effectively.

Challenges in learning new technology derive from a feeling that technology is always changing, and maybe changing too quickly to keep up.

In every focus group we conducted this evening, we wanted to talk about the challenges of learning new technology. While not all testers were at the exact same digital skill level, this group saw the challenges as keeping up with new technology, remembering what they already learned, and devoting time to learning. “You have to catch the train,” said Tester #17, describing the speed of changing technology.

A few testers talked about Excel specifically as something that is difficult to learn. Tester #14 shared that “Memorizing formulas is hard. I really want to learn Excel…it seems so simple, but it’s not.” Tester #15 said that learning Excel was very challenging in school, and that if they had “someone right there,” some human interaction, then learning would be easier.

Tester #12 uses online coding courses (e.g., Code Academy) to try to learn how to code, but says it is challenging to complete a lesson, then go back to the series later and remember what they learned beforehand. “I wish there was a classroom experience like that for adults.” Tester #12 expressed a preference for learning technology skills in a classroom setting: “If I had a teacher, I could learn how to code.”

We continued to hear how much testers value human interaction when learning new digital skills; they saw an in-person class or instruction as necessary to be successful.

Focus group #2

Our second focus group also had 5 testers, and we started again with how these testers use technology in their own lives, either personally or professionally:

Tester #24, “Ready to learn,” said that at work she is on her PC using Excel and Outlook. Outside of work, she goes to “fun” websites, researches on Google, and sometimes uses Facebook.

Tester #22, “Like to discover useful tools,” shared that they use the same tools in the same way in their personal and professional lives. “My work doesn’t require a lot of complex calculations.”

Tester #21, “I love learning,” uses eBay and Microsoft products.

Tester #25, “Not Dead Yet,” uses Microsoft, Google, and some communications software at work. At home, they read on their Kindle, pay bills online, manage other finances with technology, and play games on the Internet.

Tester #27, “Involved,” says he does data entry for work and mostly uses Excel. At home, he uses Word and websites like Google, YouTube, and Amazon.

There’s not always a clear technology goal, but keeping up is important, and the format of instruction or resources might depend on what they want to learn.

Unlike the first focus group, this group did not share clear technology or digital skills goals. When asked what they wanted to learn, no software platforms or hardware were mentioned specifically. We did hear that some testers wanted to “keep up,” either with their children and family or with job-related technology skills.

Tester #25 said his goal is to go paperless at home and is in the process of using different tools to scan all of his documents and receipts into one place to manage his finances.

When I asked testers where they would go to learn new skills, Tester #22 shared that “You can learn almost anything on YouTube.” The example given was getting a new phone: you can find videos online showing how to use it, starting from opening the box.

Testers #21 & 27 would rather go to a class because they prefer person-to-person contact and wants to ask questions. Tester #21, somewhat jokingly, added that “I have kids. They are my personal tutors.”

Tester #24 explained that if she needs to know something quick or “one-off” (like fixing something), she would Google it, but if she wanted to learn a whole new skill or system, then a class would be best.

Technology classes could be organized around common problems, not tools.

When we talked a bit about ideal technology classes, we heard that some testers were interested in classes being organized around common problems that people experience which could be solved by technology or computer skills. Tester #22 brought up this idea and thought that this would be practical for class recruitment. Here were some ideas: “Make a will,” “Collect and organize recipes online,” or “How to go paperless at home.”

Tester #21 agreed that this approach is more emotional and personal to prospective students.

Focus group #3

Our last focus group had 4 women testers, who had more advanced skill sets than we saw in the other groups. We started again by asking how these testers use technology in their own lives, either personally or professionally:

Tester #37, “Almost Advanced,” said that at home she uses her telephone, TV, and tablets, and at work they use tablets. Tester #37 took a Microsoft Excel class at Association House.

Tester #32, “Striving for literacy,” said that she uses completely different devices at work and at home because of company policy. Tester #32 works at Motorola, and employees have to separate their personal online activity from their work online activity. She says she is slow to adopt new technology in her personal life, but at work she is eager to learn new tools: “If it’s at work, I want to learn and improve.”

Tester #31, “Recent Upgrade,” said that they are in a similar situation to Tester #32, and observed from the other answers that people often don’t adopt new technology on their own but are pushed into it, especially at work.

Tester #34, “Reluctantly Tech Addicted,” said that they try not to use technology at home at all. They spend over 40 hours a week in front of a screen for work and don’t want to add to that. Tester #34 commented that there is always new technology to learn at work, and work often provides poor training without a reliable person to turn to for questions.

There was uncertainty when we asked testers if they consider themselves “tech-savvy,” and those answers sometimes changed when they heard from others.

In the screening questions, we asked testers, “When you think of the most tech-savvy (or technically advanced) person you know, what can they do that makes them so good at using technology?” We were interested in what makes a person technologically savvy, and whether that is because of the tools they use, the skills they have, or the general comfort level they have in learning new tools. Responses from the screening questions included knowing how to code or create a website, being able to use different hardware, learning new skills quickly and then being able to teach others, technology coming as second nature, or simply experience.

For this focus group, I specifically asked whether testers considered themselves “tech savvy,” especially since we had a group of all women who, based on their screening questions, we did not think were as advanced in their tech skills. I was not sure if that was because they did not rate themselves highly in describing their skill sets or if there was another reason. 3 out of 4 of these testers said they were “tech savvy.” Tester #37 initially thought she was tech savvy, but after listening to other experiences, she changed her mind and said she wasn’t: “I know the basics – Microsoft.” Even before the focus group, her tester profile name was “Almost Advanced,” and she had participated in multiple digital skills trainings. The other women in this focus group immediately jumped in, saying that having skills in Microsoft Office is not basic.

Later in the conversation, when we discussed taking classes to learn new digital skills, Tester #37 said she took a basic computer class at an organization near her home even though she knew Microsoft Word and some of the skills they were teaching. I asked why she chose a class when she already knew the software being taught: why start at the beginning? Tester #37 shared that certifications are important because they impact how much you get paid. She also shared that “I figured there might be something I didn’t know,” and she received a free laptop after taking the course.

Connecting residents to resources

Our final goal for this CUTGroup test was to understand how we can better connect residents to technology resources in their neighborhood.

The majority of these testers were interested in taking in-person courses where they could have personal support. Choosing an in-person course over an online course, however, depended on the subject matter. Testers described many resources that could be found online, but online resources are mainly useful for learning a quick skill (like fixing something), not an entire skill set like a new software platform.

Testers are looking for free classes that are in their neighborhood, and not everyone is aware of the resources that are near them. In the second focus group, we discussed how free classes are generally basic classes and there are not as many intermediate or advanced courses available for free.

Connecting residents to resources depends on the marketing of those resources. In our last focus group, we talked about cross-collaboration between organizations. If you participated in a class at one organization, its staff could and should help determine the next class you should take based on your new skill set. This extra guidance is key to driving learners to continue their trainings.

Testers are not sure how to rate their skill sets and need guidance in determining whether a new class is right for them. As an organizer and designer of this test, I found it is hard to rate the skill levels of others, and rating your own skill level is even harder. Guiding residents to the best class for them is extremely important and can be done in multiple ways:

  • Digital skills certifications provide a structure towards the next step in the learning continuum
  • Instructors or trainers at organizations can provide better information to their students on next steps, even when that class is outside of their organization
  • Showcasing the benefits that come from learning digital skills, whether progression in skills or work-related, can encourage new residents to participate
  • Being transparent in the class documentation and syllabus will allow residents to review and determine if the class is too advanced or too easy based on what they already know

At Smart Chicago, we are excited to incorporate the ideas from this CUTGroup test into our Connect Chicago project, create new ways of talking about digital skills trainings, and find ways to help residents learn technology to improve their own lives.

Other Documentation

Here is a link to the notes from our focus groups that shares all of the topics we discussed during this test.

Here is the raw data of the pre-survey results:

Here is a link to our photo album from the test:

CUTGroup #21: Digital Skills

CUTGroup #17 – Ventra Transit App

Ventra-CUTGroup-Test

For our seventeenth Civic User Testing Group (CUTGroup) session, we tested the Ventra Chicago mobile transit app. The Ventra app allows riders to manage their Ventra accounts, buy mobile tickets for use on Metra, get notifications about their accounts, and more.

This was an exciting opportunity for our CUTGroup testers to be the first to test the new app and provide feedback that could affect many public transit riders. This is a snippet about the test from the Chicago Tribune:

The CTA, Metra and Pace said they are working with a consultant with expertise on testing apps, Smart Chicago Collaborative’s Civic User Testing Group, to iron out undisclosed issues with the Ventra mobile-ticketing app, which is designed to let commuters purchase fares and manage their Ventra accounts from their smartphones.

“We get to make a first impression once, and we want the Ventra app to make a great first impression on CTA, Metra and Pace customers,” said Michael Gwinn, CTA director of fare systems.

This test had two parts:

  • A remote test where CUTGroup testers downloaded the beta version of the app and then used it in their daily lives
  • An in-person test where testers gave direct feedback and performed specific tasks in-app.

These two parts allowed us to collect information about app usage away from the test environment, and also have great conversations with our testers about what was or wasn’t working for them.

Here is what Tony Coppoletta, Manager of External Electronic Communication at the Chicago Transit Authority, said about what the Ventra app team wanted to learn from this CUTGroup test:

The CTA has an incredibly varied set of use cases, and we look forward to how all our riders can benefit from this app’s feature set. In addition to some public outreach to invite people to test our app before public release, we want to learn from the diverse and experienced testers the CUTGroup offers; they would be an excellent addition to our real-world user testing of the app. The focus is to trap bugs, ensure transactions happen as expected, and deliver good usability and a solid UX overall.

Segmenting

On June 28, we sent an email to all of our 880 CUTGroup members inviting them to complete an initial questionnaire about this new test opportunity. We wanted to know if they would be able to participate in a remote test component between July 7 and 15, and then participate in an in-person session either on July 15 or July 16, 2015. We asked a lot of screening questions to gather information.

We wanted to know what devices they had, what software versions they were using, and whether or not they had a Google Play or iTunes account. This was important because they would have to be able to download the test version of the app. We also asked a lot of questions about using Ventra, such as whether they had an account, whether they load passes or value, whether they use public transportation, and what modes of transportation they use.

Screening Questions

During our initial call-out for testers, we heard from 91 CUTGroup members. We received a lot of good information just from the screening questions. This is what we learned about their main modes of public transit:

95% of respondents use CTA trains
90% of respondents use CTA buses
44% of respondents use the Metra
25% of respondents use Pace

We were looking for around 30 testers for this test, and wanted about half of the group to be Android users and the other half iPhone users. We were also interested in getting a mixed group of Ventra users and testers who ride different modes of public transit (CTA buses, trains, Metra, and Pace).

Test Format

This test was the first time we told testers in our initial call-out email what app we would be testing. Normally we do not disclose this until the time of the test to avoid having people visit the website and app before the test. However, we had a lot to tell our testers about the instructions and updates they were going to receive from Ventra about downloading the test version, so it made sense to give them all of the information upfront. We also had to make it very clear that testers were going to use or create a real Ventra account and make real purchases.

We offered testers a total of $40 for helping us, instead of the normal $20 per test, because they were doing two big parts: 1) downloading the beta version and using it remotely, and 2) participating in the in-person session. We respect our testers’ time and feedback, and we understood how worthwhile it was to have them participate in both pieces of the test.

Once testers were selected for a test, we emailed them to check their availability for the in-person session, and gave them instructions about how to get the beta version on their devices. Here is a part of the email we sent:

Thanks for your interest in our CUTGroup test of the new Ventra mobile transit app. You have been chosen to participate in this test. Here are some details about next steps:

  • If you have an iPhone, please download the TestFlight app from the iTunes store. You will need this app before you can download the Ventra app.
  • If you have an Android device, please make sure you have a Google+ account. You will need to join Google+ before you can download the Ventra app.

Early next week you will get an email invitation directly from Ventra to download the app. Even though this is a CUTGroup test, the fares you buy in the app are real: when you load fare or tickets in the app, your chosen payment method will be charged and the fare will go into your Ventra Card’s transit account or toward real mobile tickets for Metra. Also, you will receive emails from Ventra giving you extra info or updates.

We understood that asking testers to download the TestFlight app or get the beta version through their Google+ account would add extra steps that our testers might not be familiar with. We communicated regularly with testers to make sure they were able to download the app and to assist if any trouble arose. Only 1 tester could not download the app remotely, but was still invited to participate in the in-person session.

Remote test

Testers were invited to start using the app in their day-to-day lives as soon as they received the app invitation. We sent testers questions about their experience, and they were also invited to submit to Ventra’s bug report form directly, which some testers did. From the remote test, we wanted to understand what testers did first (create a new account, log in to an existing account, etc.) and whether or not they added value, while also getting feedback about their experience, how easy they thought the app was to use, and ways it could be improved.

In-person test

After the remote test, we held two in-person sessions at The Chicago Community Trust on July 15 and 16, 2015, to have more in-depth conversations with testers about their experiences and to watch how they completed specific tasks such as using “Trip Tools,” registering a new card, and purchasing a Metra ticket. We asked testers to bring their own devices to complete part of the test, but we also had test devices (both Android and iPhone) available for testing purchases or creating a new account. By incorporating the test devices, we were able to test more tasks, but we understood that testers might not feel as comfortable using these unfamiliar devices.

Results

Remote Test

27 testers completed Part 1 of the CUTGroup test, which was a remote test where testers used the Ventra app in their normal day-to-day routine. During the remote test, only 8 testers added CTA transit value or passes to their account, while only 5 testers purchased Metra tickets. Here are some comments from testers who did not make a purchase in the remote test:

ChgoKC says, “I tried to add value to the account, but it asked for my Ventra password, which I entered (it’s 4 digits), and it said the password had to be 8 digits. I tried my password for my bank account, but that was ‘incorrect.’ So not sure what password it’s looking for.”

Gr8fl4CTA said, “I haven’t made a purchase because I have a monthly pass. I do not like that you can only add $5 to the card. There should be an option to either fill out the amount-especially since most rides are $2.25 or $2.50. People shouldn’t be forced to put $5 on their card if they don’t need all of that.”

Here are some responses from testers who made a purchase during the remote test:

CTA/Metra Commuter/Evil Genius says, “Purchasing in general is easy. Multiple times, I’ve purchased a ticket while walking to the train. I love the ability to pay with my Ventra auto-reloading balance.” 

Frequent CTA / Occasional Metra says, “Purchasing the Metra tickets was very easy – I only purchased one-way tickets so far using the app, but it seemed very straightforward in choosing which stops and identifying the ticket type I wanted. The conductor that checked my mobile ticket on UP-N was educated on the system and he had no trouble accepting the ticket; that was my only concern about using the Ventra app to buy a Metra ticket.”

Systems engineer CTA rider says, “I added a 30 day pass to an existing transit account and it went very smoothly. I’m also REALLY, REALLY impressed with how split tenders for payments were implemented. It’s really intuitive and I really like that feature.”

Three testers mentioned that sometimes the app was slow to load. Additionally, 3 testers either expected or wanted Near Field Communication (NFC) capabilities. Otherwise, testers had a wide range of responses when describing their overall experience using the app. Here are some improvements we heard:

  • Show password
  • Be able to transfer value from one card to another
  • Improve the back button so it does not kick the user out of the app
  • Route planning in Trip Tools

When asked about the visual design of the app, the majority of testers (67%) thought it was “appealing” or “very appealing.” Here are some useful responses from testers about visual design and how it might be improved: 

CTA rider AC would change the color of the cursor from white to black or another darker color. CTA Rider AC says “It is really annoying that I can’t see where the cursor is.”

Occasional CTA rider says, “The alerts in the Trip tools are a very light font and not very easy to read.”

When asked about how easy it was to use the Ventra app, 22 testers said it was “easy” or “very easy” to use.

In-person Session

26 testers came to the in-person sessions (14 came on Day 1, while 12 came on Day 2). Two of these testers did not complete the remote questionnaire, but still provided great feedback during the in-person section.

CUTGroup17-Ventra-Test

Trip tools

There was a lot of conversation around Trip Tools and ways to improve this feature. Many testers saw this as a feature that would get them to use the app on a regular basis. Testers shared that the phrase “Trip Tools” led them to expect more route-planning features, similar to Google Maps. For example, one tester specifically called out wanting to see a map during this part of the test. Testers thought this feature would help them choose bus and train routes to get to locations, not only tell them when the next train or bus is coming. Changing the name of this feature might help set people’s expectations of the tool.

9 out of 26 testers found the Trip Planning feature to be similar to or on par with other websites or apps that they use. Here are some responses from testers:

Brown line rider / occasional UP-N Metra rider says that this is “Similar to what you’d get on the CTA website only this app looks more this decade.”

CTA rider (KB) says, “It’s good, has about the same functionality that many other apps have, and the alerts here are easier to find” because they are shown more prominently.

9 out of 26 testers thought the Trip Planning features on Ventra were better than other websites or apps they currently use.

CTA rider (AC) says, “I trusted it more because it was affiliated with Ventra.”

Daily CTA Rider (MW) says, “Actually easier than my other apps! And I think that may be because of simplicity and consistency on the screen and not use of a lot of different colors.”

6 out of 26 testers thought this feature was worse than other tools that they use. Here are their comments:

Frequent CTA / Occasional Metra says, “This is stop driven. I think in terms of routes. The other apps think in terms of route and not stops. Other apps locate you on a map and tell you what is around. Helps you find the bus and route you need to take.”

Riding the 20 every day says, “It’s similar, but the lack of map function is a problem.”

Metra tickets

Testers were asked to purchase Metra tickets on the test devices. 31% of testers said that they travel on the Metra once or twice a month. Out of our group, 5 testers (19%) had never used the Metra before. Here are the top things testers liked about the process of purchasing a Metra ticket:

  • Split payment
  • Using transit value as payment
  • Billing info saved
  • Stops are in order of location on the line

A recommendation for the Ventra app team would be to test the Metra ticket functionality with residents who use the Metra as their primary form of public transit. Since our testers live in Chicago, most of them tend to rely on CTA buses or trains.

Overall

24 out of 26 testers (92%) said they liked the Ventra app during the in-person test due to how easy it is to use (7 testers mentioned this), the Metra functionality (7 testers), the convenience (5 testers), and the visual design (3 testers).

Brown line rider / occasional UP-N Metra rider said that this app is going to replace the Ventra website for him.

Metra rider88 says, “It was really convenient!” Normally Metra rider88 has to go down to the vending machine at the train station to add value via credit card and this can be very inconvenient due to the lines that form.

Frequent CTA / Occasional Metra says, “I will ride Metra more often because of the app. I wouldn’t have usually considered it as an option.”

Blind Transit rider says, “It has the promise to provide me with access on the go to the information my Ventra pass has. [This app] can be more accessible and useful than any other available option as a blind transit rider.”

18 testers (69%) said that they believe the Ventra app fits into their day-to-day routine. Here are some responses from testers who said, “Yes” the Ventra app fits into their day-to-day routine:

ChgoKC says, “It’d be similar to the parking apps, where you’d keep it around and put on the home screen of your phone.”

Mischievous Metra Maverick rides the Metra daily and says this “eliminates the need to look for a soggy ticket that is ruined halfway through the month.”

Geek Sheek says this will probably be “one of his top five apps.”

Frequent CTA Rider JH, Occasional CTA Rider, and CTA Savant do not think that the Ventra app fits into their routine because they are looking for NFC technology so that they can use their phones at the turnstiles to get on trains or buses. In addition, not all testers might use this app because their transit value gets loaded automatically and they might not ride the Metra or add additional value regularly. Gr8fl4CTA is not sure if she will keep the Ventra app because the trip planning does not work well enough, while TA is not sure because she knows her daily value and would not need to use trip tools often. TA says, though, “It is handy for adding a value and getting a pass.”

Updates

When we are onboarding new developers for a test, we emphasize the importance of the CUTGroup motto: “If it doesn’t work for you, it doesn’t work.” Sometimes it can be a challenge for developers to invest the time to actually make changes based on feedback. That was not the case here. Here are some of the updates they’ve told us they’ve made or are working on based on the feedback from testers:

  • “Trip Tools” has become “Transit Tracker” to more accurately represent what this feature does. Adding trip planning is something that the Ventra app team expects to do in the future. Also, as soon as the user clicks on “Transit Tracker,” their “Favorites” and “Nearby stops” are right at the top of this page
  • Accessibility continues to be a priority, and one CUTGroup tester, Blind Transit Rider, helped the Ventra app team extensively with their app’s accessibility functionality
  • There is now a “Show Password” button to help people enter their passwords, and the cursor is now a dark blue that helps users see that their cursor is in the correct field. The Ventra app team also wants to make the password requirements more prominent when creating an account
  • The back button will be improved in order to be more in line with what Android users expect
  • More information is going to be added for Metra in the Transit Tracker feature to give better information about when the next train is arriving

Tony Coppoletta, of the Chicago Transit Authority, said:

Our work with the CUTGroup has proved to be an incredibly valuable experience as part of our test plan for the Ventra app—both through the thoughtful feedback we received via the remote test and in affording us an opportunity to sit down face-to-face with a diverse range of riders of CTA, Metra and Pace and learn about users’ experiences together combining an open dialogue and structured testing.

CUTGroup is now a community of more than 1,000 residents in Chicago and all of Cook County who work together to make lives better through technology. This test has been an example of how these testers can be an integral part of changes to technology. It is exciting to see changes being made based on the direct feedback from testers.

CTA Rider says, “I love being able to help contribute to the development of this product.”

Route 66 Book Stalker liked testing something that is “important and impacts a lot of people.”

Final Report

Here is the final report with a look at all of the responses for each question that we asked, followed by all of the testers’ responses to our questions.

Here is the raw test data:

 

CUTGroup #15 – Chicago Public Schools (CPS.edu) Website

For our fifteenth Civic User Testing Group session, we tested the Chicago Public Schools (CPS) website. This in-person test took place at one of the Connect Chicago locations – Chicago Public Library Roosevelt branch at 1101 W. Taylor Street.

Here is what the CPS website team said about why they wanted to do a CUTGroup test:

“We recently redesigned our website to be parent focused and mobile device friendly. We improved our search results, tagged content by topic and age level and created a district calendar. We are curious to know if the new design is meeting the needs of our parents.”

Ted Canji, Jay Van Patten, and Matt Riel, who redesigned the CPS website, were interested in testing these things:

  • Find School Information: Can parents find a school easily? Do they use the School Locator? Do they use the search feature? Are parents comfortable using search?
  • Website usability: We wanted to discover if there are areas of the website that are challenging to use.
  • Mobile devices: We wanted to learn about how easy it is to visit the CPS website on a mobile device. We also wanted to know if parents have visited the CPS website before and learn more about how they get information about schools.

Segmenting

On March 25, we sent out an email to all of our 868 CUTGroup participants. We wanted to know if they would be available for an in-person test on April 1 for about 30-45 minutes. We asked some screening questions to gather information. When choosing testers, the main priority was to get parents of CPS students who had children in different grades. In addition, we wanted testers who had different types of mobile devices, as well as some who have lived in Chicago for a long time and parents who were newer to Chicago or their neighborhood.

We ended up having 10 testers participate in this test. 8 of these testers were from our CUTGroup community, but one tester brought in two friends who drove her to the test, and since they qualified as CPS parents, we allowed them to test. They also signed up to be part of the CUTGroup.

Screening Questions

We heard from 35 CUTGroup members during our callout for testers. Here is what we learned from our screening questions:

  • 74% of CUTGroup respondents have children who go to Chicago Public Schools
  • 68% of CUTGroup respondents have lived in Chicago for over 20 years

Test Format

For this test, each tester was paired up with a proctor. Testers were asked to bring their mobile devices, and the proctor asked questions about the CPS website and then captured the responses. We not only wanted testers to review the website and do tasks to see how easy the site was to use, but we also wanted to understand how testers normally got information about their child’s school. We were curious whether they had ever used the CPS website before or gone online to search for information about different schools.

Results

Finding School Information

Past experiences

When we asked testers whether or not they ever searched online for information about the schools that their children attend, overwhelmingly (7 out of 10) testers said “No.”

Parent of Three Girls (#9) says, “I have looked before when she was first starting for information about the school. I’ve heard a lot of stuff about the school, but I like her teachers, and I go off of that.”

When asked whether or not they have ever visited the CPS website, 7 out of 10 testers said “Yes.” This is how these testers used the CPS website in the past:

  • 3 testers used it to choose a new school or high school
  • 2 testers used it to check the calendar of events
  • 1 tester wanted to learn more about Local School Councils, and
  • 1 tester wanted to find out about after-school programs

We asked testers how easy it is to find information about schools, whether online or in person. Half of the testers answered right in the middle about how difficult or easy it was to find school information:

5 Very Easy                  20% (2)
4 Easy                          10% (1)
3 Neutral                   50% (5)
2 Difficult                     20% (2)
1 Very Difficult             0%

We learned that a lot of our testers find out what is happening in their children’s schools through communication with the teachers, whether via phone, parent/teacher conferences, notes, or email. Testers also learn about what is happening in their school from other parents and from weekly email newsletters.

During the test

When searching for their child’s school, 6 out of 10 testers used the School Locator, and all testers found the search easy. Testers were then asked to tell us the most important and least important information on the school’s profile pages. The most important information testers called out included: address/contact information, progress reports, number of students, test scores, and admission and graduation rates. When asked about the least important information, 4 testers said that all of the information was important to them.

When all testers were asked to use the School Locator tool and search for an address, we saw that about half of the testers had some degree of difficulty using this feature. 4 testers had trouble viewing the map and filters when holding their phones vertically; 2 of these testers were on iPhones, while the other 2 were on Android devices. When holding the phone vertically, testers only saw the list of school names; clicking on a name opened more information on the map, which was out of view. The tester would have to hold their phone horizontally in order to see both the map and the list.
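
To make the fix concrete, here is a minimal sketch in TypeScript (our illustration, not CPS code) of detecting portrait orientation with the browser’s standard matchMedia API so that a map-plus-list page can stack both panels instead of hiding one of them; the element ids are hypothetical:

```typescript
// Watch for portrait orientation so both panels can stay on screen.
const portraitQuery = window.matchMedia("(orientation: portrait)");

function applyLayout(isPortrait: boolean): void {
  // "school-map" and "school-list" are hypothetical element ids.
  const mapPanel = document.getElementById("school-map");
  const listPanel = document.getElementById("school-list");
  if (!mapPanel || !listPanel) return;

  // In portrait, split the screen height between the panels so
  // neither the map nor the list is pushed out of view.
  mapPanel.style.height = isPortrait ? "50vh" : "100vh";
  listPanel.style.height = isPortrait ? "50vh" : "100vh";
}

// Apply the layout now and again whenever the device is rotated.
applyLayout(portraitQuery.matches);
portraitQuery.addEventListener("change", (event) => applyLayout(event.matches));
```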

Usability of CPS.edu

We were very interested in learning more about how parents searched when finding information from their school or about other topics. The CPS website team wanted to see if parents were comfortable using the search features. Therefore, we asked testers to search for information that they were interested in learning more about. Half of the testers clicked on the appropriate topics to find out more information, while the other half of the testers used the search box. Only three testers had trouble finding the information they were looking for.

As mentioned above, when testers were asked to search for their child’s school (in any way that they wanted), 6 out of 10 of the testers used the School Locator, and all testers found their child’s school easily.

Active in the schools (#1) said, “It didn’t take but a hot second, I typed in Lane and it popped right up.”

A few testers experienced slow load times on subsequent pages. For example, when Meekmeek (#10) clicked on “Calendar,” it showed a white screen and took a very long time to load, so she downloaded a PDF of the calendar instead; the PDF was difficult to read since she had to zoom in and out.

A few testers reached PDFs during their tests, and the files either were not easy to use, as in Meekmeek’s (#10) case, or did not load at all. Parent of Three Girls (#9) chose a PARCC article in the headlines and, when clicking on it, was asked to download a PDF. After downloading it, she clicked to open it but got an error message saying that she could not open the file.

As mentioned above, the biggest improvement that could be made during this test was to the School Locator: making both the map and the list visible in a vertical layout instead of requiring a horizontal one. In general, all testers said they liked the CPS website, and most of them had an easy time using it.

“I just think the website is better than it used to be!” – High School Parent (#5)

“It is very easy to use. I like that they give you the address, phone, type, classification, programs offered… I like that they tell you a lot.” –Parent of k (#8)

Mobile devices

Normally for CUTGroup tests, we encourage testers to bring their own phones and devices for testing. It is important that we not only test mobile-friendly websites, but that we test them on testers’ own devices. We not only learn about usability issues on different types of phones, but we also better understand how testers navigate tasks on their own devices. 5 testers tested on laptops that we provided, and here is a look at the devices the other 5 testers used:

  • iPhone 5
  • iPhone 6
  • Nexus 5
  • Boost ZTE Max
  • Android HTC Desire 10 Boost Mobile

Besides some of the issues we outlined above, most testers using a mobile device had an easy time navigating the new CPS website. We were excited to test a website that reaches so many parents in the city of Chicago, and we are also excited that the CPS website team continues to redesign with mobile devices in mind. This CUTGroup opportunity was a great example of getting feedback directly from residents (and parents), on their own devices, while better understanding their experiences when finding school information.

Changes to CPS.edu

Since this CUTGroup test, the CPS website design team has made enormous changes to the School Locator tool, including instructional pop-overs that encourage parents to start their search, get more information, and see details about the map legend and the tools available.

New CPS.edu School Locator Screenshot

Other updates include a new “Find Schools Around Me” button that lets users find schools quickly, as well as the ability to press and hold down on the map to drop a pin and see nearby neighborhood schools. The CPS website team also added Google Street View functionality to give an interactive view of the school and its surroundings. Lastly, the “Compare Schools” feature allows users to compare different schools, so parents can find the best option for their child using information such as number of students, student performance, ratings, and more.
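
A “Find Schools Around Me” button like this typically relies on the browser’s standard Geolocation API. Here is a minimal sketch in TypeScript, our illustration rather than the CPS implementation, with a hypothetical “/api/schools/nearby” endpoint standing in for whatever service the real site uses:

```typescript
// Ask the browser for the user's position, then fetch nearby schools.
function findSchoolsAroundMe(): void {
  if (!("geolocation" in navigator)) {
    console.warn("Geolocation is not available in this browser.");
    return;
  }
  navigator.geolocation.getCurrentPosition(
    async (position) => {
      const { latitude, longitude } = position.coords;
      // Hypothetical endpoint; the real CPS service is not documented here.
      const response = await fetch(
        `/api/schools/nearby?lat=${latitude}&lng=${longitude}`
      );
      const schools = await response.json();
      console.log("Nearby schools:", schools);
    },
    (error) => console.error("Could not get location:", error.message)
  );
}
```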

The biggest change we saw was a direct response to our CUTGroup testers’ difficulties with the School Locator. The mobile layout of the School Locator page is now in a full-screen format, and users do not need to turn their mobile devices horizontally or move around the page to see all of the information they need. It is exciting when changes like this happen, and feedback directly from CUTGroup testers continues to influence websites and software that are so wide-reaching.

Final Report

Here is a final report of the results with the analysis of the questions we asked, followed by each tester’s responses, and copies of other questions we asked:

The raw test data can be found below with the written answers from every tester.

CUTGroup #14 – Chicago Cityscape

For our fourteenth Civic User Testing Group session, we tested Chicago Cityscape, a website that helps residents understand how, where, and when building development takes place in neighborhoods. This in-person test took place at one of the Connect Chicago locations – Chicago Public Library Logan Square branch at 3030 W. Fullerton Ave.

The developer, Steven Vance, wanted to learn these things from the test:

  • Usability: Are there functions that are difficult to use or provide different information than expected? What do testers look at or do first when visiting the website? In what ways can Chicago Cityscape be improved?
  • Desired features: Steven wanted a better understanding of the type of information that residents and homebuyers look for when researching neighborhoods or potential properties to live in. What pieces of information are residents/homebuyers most interested in? What pieces of information are they least interested in?
  • Paid Services: Is there an interest in paying for these services at a low cost?

Segmenting

On January 30, 2015, we sent out an email to all of our 831 CUTGroup participants. We wanted to know if they would be available for an in-person test in February for about 30-45 minutes. We asked some screening questions to gather information. We were interested in getting residents who are involved in and care about what happens in their neighborhood. We were also looking for people who were planning to buy a home; these homebuyers did not need to be as active in their community.

We ended up having 13 testers participate in this test.

Screening Questions

We heard from 62 CUTGroup participants through our callout for testers. We received a lot of good information just from the screening questions. Here is a look at what we learned:

  • 60% of CUTGroup respondents are currently renting
  • 37% of respondents are planning to buy a home in the next year

When we asked how involved testers are in their neighborhood, this is what they said:

5 Very Involved                     23% (14)
4 Involved                          35% (22)
3 Neutral                                 31% (19)
2 Slightly Involved                3% (2)
1 Not at all Involved              8% (5)

Test Format

The format of this test was proctored sessions, with proctors working one-on-one with the testers. Testers looked at the website using either their own laptops or laptops we provided, while proctors asked questions and took down their responses. It was also the first time we used the software ZoomText, which we added to one of the laptops for a tester who has low vision.

Ever since the Roll with Me CUTGroup test, we have been trying to reach more people through the CUTGroup. Having a diverse set of individuals, such as low-vision or blind testers, helps us better understand the structure and usability of a website. For example, during this test, Architect Dropout (#11) tested using iPad voice-over commands. From that experience, Steven realized that the map failed because the voice-over commands selected each map tile individually and she could not select the specific pinpoints on the map. This led to a very detailed GitHub issue and conversation on accessibility improvements for Leaflet. This is a goal of CUTGroup: to not only uncover user experience issues on one website or app, but to help all developers build better websites that work for everyone.
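
For developers who run into the same problem, here is a minimal sketch in TypeScript of the Leaflet marker options that help assistive technology land on pinpoints rather than on individual map tiles. This is a hedged illustration of the general technique, not the specific fix from that GitHub issue, and the coordinates and label are made up:

```typescript
import * as L from "leaflet";

// Center a map on Chicago (coordinates here are illustrative).
const map = L.map("map").setView([41.8781, -87.6298], 12);

L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
  attribution: "© OpenStreetMap contributors",
}).addTo(map);

// "alt" sets alternative text on the marker's <img> element so a screen
// reader can announce it, and "keyboard" keeps the marker focusable, so
// voice-over users can reach the pinpoint instead of each map tile.
L.marker([41.8827, -87.6233], {
  alt: "Building permit at a sample address", // hypothetical label
  keyboard: true,
}).addTo(map);
```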

Results

Information

12 out of 13 testers are interested in learning more about what is being built or torn down in their neighborhoods, and most look to community news outlets (such as DNA Info or Everyblock) for this type of information. When learning about a property, testers said that these are the most important pieces of information: property information, tax information, neighborhood data, and violations at that specific address.

When testers were asked to search for an address in their neighborhood, most of them searched for their home addresses and called out these pieces of data on the “Address Snapshot” page: demographics, Aldermen/ward information, property taxes, crimes, and then permits. While there was not a one-sided response to the “least useful information” question, a lot of testers had questions about the data they were seeing on the pages. Here are a couple of ways that testers would improve the Address Snapshot page:

  • Change up the design elements: 5 out of 13 testers mentioned changing elements of the color and design as ways to improve the Address Snapshot page. HomeSweet (#12) says, “Visually the page is rather simplistic and not in a way that makes it easy to read. It’s not immediately clear that the ad is an ad.”
  • Highlight or change information structure: 3 testers thought that information on the pages should change and different pieces of information should be highlighted. Testers also mentioned that there was a lot of information on these pages, and it would be better if they were organized with better definitions and descriptions.

When looking at the permit information, testers had an easier time navigating the permit information on the map versus the information listed for the community or neighborhood area. Testers again had a lot of questions about the data they were looking at; here are some examples:

“Are these permits already approved?” –Big on Community & IT Field #geekchic (#3)

“TIF district! I want to know what that is” –Non-Profit Pro (#6)

“What’s to be gained by showing the total estimated costs of the projects? What does change time mean?  How do you get access to the records? Do you have a way to get it quickly?” – Construction Watcher (#2)

83% of testers thought that the permit information was useful or interesting. Two testers wanted to compare permit data in their neighborhood to other neighborhoods. In addition, one tester, Chicago Explorer (#10), thought that seeing the history of the neighborhood through permits was interesting; if she were starting a business or looking for a property, it would be especially useful.

Usability

For the most part, testers found it very easy to navigate the website. There was some confusion when ads had a light background that matched the Chicago Cityscape design. In addition, testers had questions about the map icons and about the data types or the timeline of what was being displayed.

When first exploring the website, most testers did not have an overwhelmingly clear first step or action. 4 out of 13 testers (31%) viewed or clicked on the map on the homepage, 3 testers (23%) typed in an address, while the rest of the testers took a variety of other first actions. In addition, when testers were first reviewing the homepage, 5 out of 13 testers (38%) were not clear who the audience was.

Audience

Throughout the test, testers mentioned useful and not useful information on every page and often distinguished which information would be useful to one group but not to others. As mentioned before, testers had questions about the audience from the start of the test.

Architect Dropout (#11) says, “It doesn’t seem super targeted to your average Jane, seems targeted to a business, not very consumer targeted… doesn’t seem to want to sell me anything, feels more like an industry site.”

Both HomeSweet (#12) and Big on Community & IT Field #geekchic (#3) said that they thought this was a real estate/homebuyers website.

Chicago Explorer (#10) and First time looker (#9) both mentioned the paid subscriptions on the “Find Permits” community or neighborhood pages. First time looker (#9) thought the paywall in the description stifled her curiosity. She hasn’t used the site enough to make a decision: there is no information about a trial readily available and no free content at the outset, which turns her off because she doesn’t even know if it’s valuable to her yet. Chicago Explorer (#10) added that “Subscribe– doesn’t appeal to me, but would if I were a mogul.”

Chicago Cityscape has a great opportunity to teach residents about building/property information and permits and then provide a platform for residents to learn more about what is happening in their own neighborhoods. The best way that can happen is to continue to include definitions and descriptions of terms.

It is also important to consider the audience for this website and provide the proper pathways for individuals to get the information that they need, whether they are an involved community member, a home buyer, or a contractor. Defining the purpose of this site up front (on the homepage), and how this data is useful to multiple groups of people, will give individuals the grounding to use the data provided.

Final Report

Here is a final report of results with the analysis of the questions we asked, followed by each tester’s responses, and copies of other questions we asked:

The raw test data can be found below with the answers from every tester: