Tests for Higher Standards (TfHS) supplies high-value assessments carefully aligned to state standards.

Call Dr. S. Stuart Flanagan at 804-725-7997 or e-mail stuflanagan@aol.com for information and pricing.


ROS delivers online tests, scans, scores, and produces flexible, powerful reports.

If you need immediate technical support, please use the e-mail feature on the ROS site, e-mail support@rosworks.com, or call 866-724-9722 or 804-282-3111.


Passage Readability Levels

A reader of our site recently asked about the readability level of our passages:
“I gave my students the Virginia Grade 4 Grade Level Reading Test yesterday. I noticed on the answer key that 40 out of 50 questions were from passages on the 5.7, 6.3, and 6.5 reading levels. What is the rationale for including passages that were written well above the reading level of most fourth graders?”

As our answer has general interest, here it is:
The readability levels we supply are a guide, they are not concrete, and they are not set in stone. Much of what Hemingway wrote scores readability of 6 or below. Can a fifth grader fully understand what Hemingway is writing? Sure, they may be able to manage the sentence structure and get a grasp on the literal meaning of most of the words, but Hemingway’s writing’s significance only begins with the actual print on the page. The deep significance of what is said is so much more.

Yes, the readabilities come out a little high from time to time. On the Simulation Tests, we try mightily to give students passages with readabilities tightly aligned with the “correct” grade level. In the SABs we are looser, judging the passage overall and treating readability as just one factor in rating its difficulty. Our SABs range from above grade level to below grade level, so students at a variety of skill levels can access our materials. In the Grade Level Tests, we do range from the grade level up a grade or two in readability, but only when we judge that the overall passage is still accessible to students. We don’t let the readability score be the be-all and end-all of the decision to place a passage at a certain grade level.

Furthermore, we have had our Grade Level Tests reviewed by teachers specifically with difficulty in mind, and we have not had a complaint about our passage leveling in several years. That has not stopped us from being diligent in maintaining an accurate database of passages: a few years back, Scott [Reynolds] and I revisited all of our passages and moved some up or down a grade as we deemed necessary. Our materials are used daily by teachers and students, and the feedback we receive on difficulty level has been mostly positive. Our tests have certainly become more accessible to students in the seven years I have been here.

John Anderson [senior English test writer and editor]

I would add that every readability formula produces somewhat different grade levels for nearly any passage. Sometimes the levels indicated are very different, by as much as three grades; usually that happens when there is something peculiar about the passage. Specifically, we have calculated the Lexile® levels for many of the passages. (We can’t publish them, as that requires a license.) We found that the Lexile score is often quite different from the Flesch-Kincaid level we report. Which one is “correct”? That is a complex question that has been argued for years.
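For readers curious how a formula-based score like the one mentioned above is produced, here is a minimal sketch of the standard Flesch-Kincaid grade-level calculation (0.39 × words per sentence + 11.8 × syllables per word − 15.59). This is not the tool we use; the syllable counter below is a naive vowel-group heuristic, which is one reason different software reports different levels for the same passage.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, dropping a trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith("le") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

# Short, common words in short sentences score low; long words in a
# long sentence score much higher.
print(flesch_kincaid_grade("The cat sat on the mat. It was a warm day."))
print(flesch_kincaid_grade(
    "Extraordinary circumstances necessitate comprehensive "
    "reconsideration of established methodologies."))
```

Because every step here (sentence splitting, word tokenizing, syllable counting) involves judgment calls, two implementations of the “same” formula can disagree, which is part of why we treat readability scores as a guide rather than a verdict.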

One thing we have discovered is that material which interests students is “easier” to read than boring material. We try to make our passages both interesting and informative. In English tests, the difficulty of any question is a function of both the difficulty of the passage and of the question itself. Both aspects have to be considered.

What I suggest is that you try out the material with your students. Look at the test scores and also do a follow-up with the class on the test. Ask the students directly if the passages were hard to understand. If you do find that the material really is too challenging for them, please let us know. We do want to know that.
One final point — The difficulty of any test question which has a “display” — a written passage, a graph, a picture, a table, a diagram, a sound passage, a video, etc. — is always a function of both the display and the question asked in reference to it.
David Mott