Next Step Guided Reading Assessment

2018: UX • User research • Prototyping

The Next Step Guided Reading Assessment (NSGRA) is Scholastic’s four-step process for helping teachers pinpoint students' reading levels. This digital product was created as a companion to the print materials so that teachers could easily enter and analyze student data.

NSGRA print components.


Introduction

I was the lead UX designer for this project while working at SOS Brooklyn, a design agency contracted by Scholastic. The NSGRA had existed for a number of years as a print product that was broadly used across the country. It also had a small digital component that allowed teachers to create charts from the data they collected. However, Scholastic recognized that this tool was dated and didn’t actually offer teachers much insight into the data. Very few teachers were using it or even knew it existed. The goal of this project was to create a robust data collection and reporting tool that would offer teachers actionable insights that were too difficult and time-consuming to glean from the print materials. My work led to a fully clickable wireframe prototype.

• • •

Getting the lay of the land

We started our process by conducting a survey to understand the current perception and usage of the NSGRA. Since we were starting from the ground up, we wanted to tailor our tool to how teachers were actually using the product and focus on the specific features teachers wanted.

The following is an analysis I conducted of our results:

Online use is low

The results showed that most users are not currently using the NSGRA online portal. 52% of users surveyed never log into the portal, and only 17% use it consistently, logging in six times or more per year. In addition, many of the users surveyed are new to the NSGRA, with 46% having used the product for a year or less. Some teachers stated that they didn’t have access to the online portal or didn’t know it existed.

Recommendation

Since the majority of users don’t use the online portal and many are new to NSGRA, we believe most users won’t have an attachment to the current design of the product. This presents the opportunity to explore design options unrestricted by the previous version.

“I did not know there was an online component I could access.”


Offline use is high

Despite the fact that most users are not using the online portal, use of the print portion of the product is high. 88% of users surveyed assessed students’ reading levels 3 or 4 times per year.

Recommendation

79% of users rated their technology skills at 4 or above on a 5-point scale, which leads us to believe that the majority of users are open to adopting new technology. This finding, combined with the high usage of the NSGRA print materials, indicates that many users would be open to using the online portal if they were aware it existed or had access to it:

“Is there a cost to using the online system? If not, I wish it was something my school looked into using instead of making the work harder on ourselves.”


Users found reports useful

Generally speaking, most respondents who had used the reports found them either ‘useful’ or ‘very useful’. However, individual student reports were more likely to be rated ‘extremely useful’ than group or class reports. The reading interest survey reports were viewed as the least useful.

Recommendation

We see the current views of these reports as a baseline we can refer back to once the new version of the product has been created and tested. With the new design, we hope that more of these reports will be viewed as ‘very’ or ‘extremely useful’. These findings also direct us to place special emphasis on the reading interest survey reports, which have the most room for improvement, and the individual student reports, which are viewed as the most valuable.


Most users share NSGRA reports

64% of users shared reports with parents, 67% with other teachers, and 67% with school administrators.

Recommendation

An easy way of sharing reports should be implemented in the new product.

“We use Google for all of our school related documents. If there was a way to upload it straight into our Google Drive that would also be useful.”


Users have access to reliable wifi on multiple devices

80% of users reported that they had access to reliable wifi in their classrooms. 98% of respondents reported having access to at least one Chromebook, laptop, or desktop computer. 45% of users also had access to tablets, and 34% responded that they would access the NSGRA on their smartphones if that were possible.

Recommendation

The findings suggest that there aren’t any major technological or logistical barriers preventing users from accessing the online portal. The major barriers to engagement, as mentioned earlier, are whether districts grant teachers access and whether teachers know the portal exists. Making the product available on tablet and mobile is also important, since many users expressed interest in this.


Users want to adjust reading benchmarks

62% of users responded that they would find the ability to adjust pre-set reading benchmarks to fit their district’s grade-specific reading goals useful. 35% of users thought this might be useful.

Recommendation

Due to the large number of users who want this option, we believe it should be implemented in the product. It’s worth exploring what other custom settings users would find useful.

• • •

In translation

Once we had the survey information, I studied the print materials extensively so that I thoroughly understood the process of using the product and why teachers were being asked to complete it.

Sample page from the NSGRA teacher’s guide.

Since this digital component was a companion to the print product, we wanted to make sure it actually enhanced the use of the print product. In many ways, this meant translating what existed in print to a digital format. Teachers needed to be able to use the print product and instantly identify its relationship to the digital one. Since there are inherent differences in how print and digital media are laid out, this created some interesting design challenges.

For example, the reading record was laid out with information in multiple columns that drew your eye to the bottom of the page and back up again. This is difficult to recreate digitally, because readers consume a digital page by scrolling from top to bottom rather than jumping down and back up.

Image of Reading Record showing directionality of eye movement.

I translated the hierarchy and layout of the reading record to a scrollable page:

Wireframe of Reading Record showing digital translation of printed page.

We also needed to translate actions taken with a pen on paper into the closest ‘digital action.’ Below are some examples:

1. Comprehension questions that required the user to circle one of 3 scores became a slider (a sketch of this mapping follows these examples):

Wireframe showing sliders used for comprehension questions.

2. A survey completed by the student retained the same layout with enable/disable functionality:

Wireframe of Reading Interest Survey.

3. A chart that required the teacher to circle incorrect spellings became a series of checkboxes:

Wireframe of Developmental Word Inventory.
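
As a small illustration of the first mapping (the 0–2 scale below is an assumption; the print product defines the actual scores), the circled score becomes a slider that snaps to one of three allowed values:

```typescript
// Illustrative slider state for a comprehension question: circling one of
// three scores on paper becomes a three-stop slider. The 0–2 scale is an
// assumption; the print product defines the real scoring values.
const SCORES = [0, 1, 2] as const;
type Score = (typeof SCORES)[number];

// Snaps a raw slider position to the nearest allowed score.
function snapToScore(position: number): Score {
  const min = SCORES[0];
  const max = SCORES[SCORES.length - 1];
  return Math.min(Math.max(Math.round(position), min), max) as Score;
}

console.log(snapToScore(1.4)); // 1
```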

• • •

Making things easy

Since this product would require teachers to take an extra step beyond completing the process on paper, it needed to be as easy to use as possible. We recognized that if entering the data was too laborious, teachers wouldn’t be motivated to use it. We considered how teachers would actually be sitting down with the completed paper assessments and what features would make data entry quick and painless.

One simple but significant UX solution was a sticky bar at the top of the page that kept a student’s name persistent while the teacher was entering data, making it clear at all times which student she was working on. The ‘Next Student’ button would take the teacher to the next student in alphabetical order; if she had alphabetized her paper assessments, she could keep clicking that button to move through the class without searching for each student individually. A dropdown with the class roster also let teachers jump between students and included a ‘complete’ state to show which students already had data entered.

GIF showing sticky bar with student name dropdown and ‘Next Student’ button.
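
As a minimal sketch of that behavior (the Student shape and function names below are my own illustration, not the shipped code), the logic amounts to index arithmetic over an alphabetized roster:

```typescript
// Hypothetical model of the sticky-bar roster navigation.
interface Student {
  name: string;
  dataComplete: boolean; // drives the 'complete' state in the roster dropdown
}

// The roster stays alphabetized so 'Next Student' matches a sorted paper stack.
function sortRoster(roster: Student[]): Student[] {
  return [...roster].sort((a, b) => a.name.localeCompare(b.name));
}

// Returns the next student alphabetically, or null at the end of the roster.
function nextStudent(roster: Student[], current: Student): Student | null {
  const sorted = sortRoster(roster);
  const index = sorted.findIndex((s) => s.name === current.name);
  return index >= 0 && index < sorted.length - 1 ? sorted[index + 1] : null;
}
```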

We also added automatic calculation: values the teacher would otherwise have computed by hand on paper were instantly calculated and filled in once she entered the applicable data. This feature could significantly reduce the time spent on each student’s reading record and ensure greater accuracy of the scores.

GIF showing automatic calculation feature.
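
The product’s actual formulas come from the NSGRA authors; the sketch below assumes a generic running-record accuracy rate purely to illustrate the instant-calculation pattern:

```typescript
// Illustrative only: recompute derived scores as soon as raw counts are
// entered, replacing the arithmetic the teacher once did by hand.
// The formula below is a generic running-record accuracy rate, assumed
// for illustration; the product used the NSGRA's own calculations.
interface ReadingRecordEntry {
  totalWords: number;
  errors: number;
}

function accuracyRate({ totalWords, errors }: ReadingRecordEntry): number {
  if (totalWords === 0) return 0;
  return Math.round(((totalWords - errors) / totalWords) * 100);
}

console.log(accuracyRate({ totalWords: 150, errors: 6 })); // 96
```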

Another ease-of-use issue was the resources section of the site. The previous site made finding the print resources extremely difficult. The documents were listed under the long titles of the steps they fell under (e.g., Developmental Word Knowledge Inventory) instead of how they’re more commonly referred to, which is the step number. They were also displayed in a list format that was difficult to read and contained categorical headings that didn’t actually apply to all of the documents. Additionally, the filters only let you narrow the list by document type. So, if you wanted to find all of the documents necessary for one particular step, you’d have to know the exact type of each one and find them individually.

Resources section of old NSGRA site.

My redesign of the resources section showed the documents as tiles, with the title, type, step, and grade clearly labeled. It also included a keyword search.

Resources section redesign.

The main view was changed to show featured documents that applied directly to that teacher (such as all of the documents for 2nd grade). I also added more powerful filters that interacted with each other to make it easy to find the exact document you were looking for. For example, once a step was selected in the ‘Step’ filter, only the document types that were included in that step would be shown in the ‘Type’ filter.

Image of Resources redesign showing filter interaction. Step 3 only includes assessments, answer keys, and recording sheets.
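
A rough sketch of how those interacting filters could work, using hypothetical field names: selecting a step narrows the ‘Type’ filter to the types that actually occur in that step, and the keyword search runs over the same metadata:

```typescript
// Hypothetical document metadata; field names are illustrative.
interface ResourceDoc {
  title: string;
  type: string;   // e.g. 'Assessment', 'Answer Key', 'Recording Sheet'
  step: number;   // 1-4
  grade: string;
}

// When a step is selected, the 'Type' filter only offers types that
// actually occur within that step.
function typesForStep(docs: ResourceDoc[], step: number): string[] {
  const types = docs.filter((d) => d.step === step).map((d) => d.type);
  return [...new Set(types)];
}

// Applies the active filters and keyword search together.
function filterDocs(
  docs: ResourceDoc[],
  { step, type, keyword }: { step?: number; type?: string; keyword?: string }
): ResourceDoc[] {
  return docs.filter(
    (d) =>
      (step === undefined || d.step === step) &&
      (type === undefined || d.type === type) &&
      (keyword === undefined ||
        d.title.toLowerCase().includes(keyword.toLowerCase()))
  );
}
```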

• • •

Adding value

Because we’d be asking teachers to adopt a tool they had previously done without, the product needed to offer significant value. We wanted to provide features that made taking the time to enter the student data worth it.

One of these features was a system of completion alerts. This system helped the teacher keep track of any piece of data that hadn’t been entered for a particular student. The teacher could then use these alerts as reminders to enter the data later (e.g., when a student wasn’t in class that day) or simply dismiss them. No such tracking was available with the print product alone.

GIF showing a teacher’s ability to view, dismiss, or add missing data through alert system.
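
A minimal sketch of how such an alert system could be modeled (the types and field names are assumptions, not the production implementation):

```typescript
// Hypothetical completion-alert model; names are illustrative.
interface CompletionAlert {
  studentName: string;
  missingField: string; // e.g. 'Listening Comprehension score'
  dismissed: boolean;
}

// Scans entered data for gaps and raises one alert per missing item.
function findMissingData(
  studentName: string,
  entries: Record<string, number | null>
): CompletionAlert[] {
  return Object.entries(entries)
    .filter(([, value]) => value === null)
    .map(([field]) => ({ studentName, missingField: field, dismissed: false }));
}

// A teacher can dismiss an alert (e.g. the student was absent that day)
// without being forced to enter data.
function dismissAlert(alerts: CompletionAlert[], index: number): CompletionAlert[] {
  return alerts.map((a, i) => (i === index ? { ...a, dismissed: true } : a));
}
```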

Another major component was the reporting. We were dealing with many of the same questions about reporting that we had previously handled with Literacy Pro (e.g., how can one system show class progress, show student progress, and guide you to struggling students?). For this reason, we decided to apply the same reporting system I helped define. The basis of this system is a persistent class roster that interacts with the data visualizations. A more thorough explanation of this system can be found in my Literacy Pro case study. Below are some examples of the NSGRA reports:

We also created a highly detailed student report. The inspiration for this came from our survey results, in which most teachers reported sharing student data with parents, administrators, and other teachers.

Any data the teacher had entered for the student would appear in the individual report, along with links to any documents the teacher had uploaded. The layout hierarchy was designed to feature the most important pieces of information for sharing (e.g., the student’s current reading level). It was also designed to be print-friendly.

GIF showing an individual student report.

• • •

Finding a balance

One of the most significant things we learned in surveying current users was that they often skipped steps in the process or assessed students’ reading levels outside of the recommended timeline. In working on this product, we had access to the authors, who stressed that their research had culminated in a process designed to give teachers the most accurate depiction of a student’s reading level.

Flow chart of NSGRA process.

They argued that skipping steps deprived teachers of information needed for the most holistic view of their students. For example, a teacher could find that a student was above grade level in reading accuracy but below grade level in listening comprehension, skills that are assessed in different steps. She could then use this information to focus specifically on comprehension with that student.

The milestones for assessing students’ reading levels were important to adhere to because the assessment materials were written to test particular skills for that grade level at that exact time of the year. For example, a second grader is expected to be in a different developmental place in October than in November, and the assessments were designed to measure skill level for that exact point in the school year.

Chart showing the correct timing of assessments for the NSGRA.

The authors recognized that teachers often needed to assess particular students at other times of the year (e.g., when a student is making significant progress or a new student joins the class), but the class itself should be assessed three times a year during designated weeks. This also had implications for the reporting part of the product: districts wanted accurate data on student assessments, which would be difficult to achieve if teachers didn’t complete the full process three times a year.

This presented an interesting challenge, because we wanted to give teachers the flexibility to use the product the way they wanted to, but we also wanted to provide enough information and guidance to encourage them to use it as it was designed. We developed a number of UX solutions to achieve this goal.

The first was the inclusion of a completion bar at the top of the dashboard, which played off the Zeigarnik effect, our tendency to want to finish what we’ve started. Once a teacher logged in, she would see this bar at the top of the page, letting her know exactly where she was in the process. The completion bar set the expectation for how many times the process should be completed, and when.

Image of dashboard showing the completion bar at the top.

I also designed the main assessments page with three ‘funnels’ that directed the teacher to complete one of the three milestone assessments (beginning of year, middle of year, or end of year). This design further compelled the teacher to complete the assessments the correct number of times and at the correct time.

However, because so many teachers wanted the ability to enter data outside of the assessment milestones, we created the option to add a ‘non-milestone assessment.’ The button for adding a non-milestone assessment was intentionally designed to be much smaller to discourage its use. Non-milestone assessments would not send any data to the district and would be kept separate in the reports.

Assessments empty state design showing the 3 funnels for adding assessments. Non-milestone assessment button appears in the bottom right corner.

Once a teacher chose a funnel, she could choose the steps she wanted to complete. I created a series of checkboxes that built a progress bar, which persisted for the entire data entry process.

GIF showing the process of selecting a funnel and choosing steps to build the data entry progress bar.
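
A small sketch of how the selected checkboxes could drive the progress bar, assuming hypothetical step names: the steps chosen at the start of a funnel define the denominator, and completed steps move the bar toward 100%:

```typescript
// Hypothetical progress model for a milestone assessment.
const ALL_STEPS = ['Step 1', 'Step 2', 'Step 3', 'Step 4'] as const;
type Step = (typeof ALL_STEPS)[number];

// The checkboxes chosen at the start of a funnel define the denominator;
// completed steps move the bar toward 100%.
function progressPercent(selected: Step[], completed: Step[]): number {
  if (selected.length === 0) return 0;
  const done = completed.filter((s) => selected.includes(s)).length;
  return Math.round((done / selected.length) * 100);
}

console.log(progressPercent(['Step 1', 'Step 2', 'Step 3'], ['Step 1'])); // 33
```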

Though we offered this flexibility, we still used empty states where data from skipped steps would have appeared, with easy-access buttons encouraging the teacher to go back and complete them. The product was still designed around the presence of all four steps, and we deliberately decided not to hide the space dedicated to a step when the teacher chose not to complete it.

Image of dashboard showing empty states for incomplete steps with easily accessible buttons.

The funnel system allowed us to control the data that was being entered and ensure the product was being used correctly. In practice, once a teacher chose the beginning-of-year funnel, she could only enter data for beginning-of-year materials, not middle- or end-of-year materials. If a teacher administered the assessment with the wrong materials, she would need to repeat the process with the correct materials in order to enter data.
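
A minimal sketch of that guard, with assumed names: the chosen funnel acts as a gate on which materials accept data, and non-milestone assessments are excluded from district reporting:

```typescript
// Hypothetical guard for the funnel system: once a milestone funnel is
// chosen, only materials written for that milestone can be entered.
type Milestone = 'beginning' | 'middle' | 'end';

interface AssessmentMaterial {
  id: string;
  milestone: Milestone | null; // null = non-milestone material
}

function canEnterData(funnel: Milestone, material: AssessmentMaterial): boolean {
  return material.milestone === funnel;
}

// Non-milestone assessments are kept out of district reporting entirely.
function includeInDistrictReport(material: AssessmentMaterial): boolean {
  return material.milestone !== null;
}
```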

The funnel system also created important restrictions around the data sent to the district, so that the correct data for each milestone was segmented correctly. Had we left the data entry system open, with the teacher choosing whatever data to add at whatever time, this would have been impossible to control.

• • •

Conclusions

Once the prototype was completed, we conducted user testing with teachers who currently used the NSGRA print product. We created a user testing flow with the main goal of testing how teachers navigated the product. We made some small tweaks to language and minor navigational elements based on feedback, but for the most part, testing validated our UX decisions.

Overall, I felt confident in the architecture of the product. I think we created a product that offered real value when used in conjunction with the print product. I also felt that our solutions gave teachers flexibility while still guiding them to assess their students in the most ideal way.

Had I continued working on the project, I would have argued for further iterations to transition to an entirely digital process, where students could complete the independent assessments on a computer and teachers could complete reading record notes using a stylus and tablet. This would eliminate the data entry process entirely and save teachers an enormous amount of time.
