Multi-Level Text Sets: Leveling the Playing Field or Sidelining Struggling Readers?

    Hiebert, E. H. (2018). Multi-Level Text Sets: Leveling the Playing Field or Sidelining Struggling Readers? Santa Cruz, CA: TextProject, Inc.

    It’s a compelling idea—a set of texts on the same topic, written at different levels of text complexity. With a range of easy, moderate, and hard texts on the same topic, all students should be able to engage in the same follow-up discussion after reading. No one is stigmatized by receiving a text that is clearly aimed at lower-level readers. If the texts are provided on the computer, even better! It’s a teacher’s dream come true!

    Indeed, several publishers offer just such sets of texts for schools and districts to purchase. In this essay, I analyze multi-level text sets from three publishers to establish whether these text sets—especially the versions aimed at students who need the most support—have the features that support reading development. In other words, do these texts fulfill struggling readers’ needs for—and dreams of—becoming proficient readers?

    It should be noted that the texts in the analysis are only one part of the publishers’ programs. In two of the programs (A and B), the multi-level texts are central to the programs, but comprehension questions are also provided. In Program C, the multi-level texts are part of an extended intervention. What students are given to read in a program, however, is of vital importance. The volume and features of texts influence the kinds of reading opportunities students have to increase their reading capacity. My interest lies in whether the texts for readers, especially struggling readers, have features that research indicates increase reading capacity.

    I will use Lexile levels for the analysis because that is how all three programs have determined their texts to be at different complexity levels. An overview of the Lexile Framework precedes the summary of the analysis, since a basic understanding of what makes up a Lexile is necessary to interpret the results.

    Some Background on Interpreting Lexiles

    Rather than assigning grade levels to texts, the Lexile Framework classifies the complexity of a text using Lexile units, where a unit is defined as “1/1000th of the difference between the comprehensibility of the primers and the comprehensibility of the encyclopedia.”1 The number of units assigned to a text is based on a formula with two components: a syntax measure and a vocabulary, or word frequency, measure.

    Syntax is measured by the average length of the sentences (MSL) in a text. The overall Lexile (L) is quite sensitive to changes in sentence length.2 For example, the Lexile level of The Wind in the Willows can move from 1370 Lexile (the college- and career-ready level, according to the staircase of text complexity in the Common Core State Standards3) to 360 Lexile (in the range of grade one, according to the staircase of text complexity) simply by breaking some of Kenneth Grahame’s very long sentences into shorter ones.4
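    To make the syntax component concrete, here is a minimal sketch in Python. The period-based sentence splitter and the example sentences are my own simplifications for illustration (MetaMetrics’ actual text processing is proprietary); the point is simply that splitting one long sentence into several short ones lowers MSL without changing the content:

```python
# A minimal sketch (not MetaMetrics' algorithm) of the syntactic component:
# mean sentence length (MSL). The period-based splitter is a simplification;
# it ignores abbreviations, quotations, and other complications.

def mean_sentence_length(text: str) -> float:
    """Average number of words per sentence."""
    sentences = [s for s in text.split(".") if s.strip()]
    return sum(len(s.split()) for s in sentences) / len(sentences)

# Invented example sentences (not Grahame's actual prose): the same content
# expressed as one long sentence versus several short ones.
long_version = ("The Mole, who had been working very hard all morning, "
                "spring-cleaning his little home with brooms and dusters, "
                "finally threw down his brush and declared that spring "
                "was in the air.")
split_version = ("The Mole had been working very hard all morning. "
                 "He was spring-cleaning his little home with brooms "
                 "and dusters. Finally he threw down his brush. He "
                 "declared that spring was in the air.")

print(mean_sentence_length(long_version))   # 1 sentence of 31 words -> 31.0
print(mean_sentence_length(split_version))  # 4 short sentences -> 8.5
```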

    The Lexile level of a text is more impervious to changes in vocabulary than to changes in sentence length. The semantic component is called the mean log word frequency (MLWF) in the Lexile Framework and is referred to as “vocabulary” in this essay. In the Lexile analysis, the vocabulary measure is established by giving each word in a text a ranking based on its frequency relative to all unique words in the MetaMetrics database. The average frequency of the vocabulary in a text is transformed into a logarithm to account for the vast discrepancies in word frequencies (e.g., the occurs about 68,006 times per million words of text, while 5,542 different words such as dine and zebra5 are predicted to occur once in the same amount of text).
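    To see why the logarithm matters, consider the rough sketch below. It is an approximation of the idea, not MetaMetrics’ actual computation, and all frequency values except the one for the are invented stand-ins:

```python
import math

# A rough sketch (not MetaMetrics' formula) of the semantic component:
# mean log word frequency (MLWF). Frequencies are occurrences per million
# words of text. Only the value for "the" comes from this essay; the
# others are hypothetical stand-ins.
FREQ_PER_MILLION = {
    "the": 68006.0,
    "dog": 150.0,     # hypothetical
    "chased": 12.0,   # hypothetical
    "a": 20000.0,     # hypothetical
    "zebra": 1.0,     # a rare word: predicted to occur once per million
}

def mean_log_word_frequency(tokens):
    """Average of log10(frequency per million) across a text's tokens.

    Unknown words are treated as occurring once per million words, an
    assumption made for this sketch only.
    """
    logs = [math.log10(FREQ_PER_MILLION.get(t.lower(), 1.0)) for t in tokens]
    return sum(logs) / len(logs)

print(round(mean_log_word_frequency("The dog chased a zebra".split()), 2))
# Prints 2.48. Without the log transform, the raw mean frequency (~17,634)
# would be dominated by "the" and "a"; the log keeps rare words visible.
```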

    A high MLWF number indicates that the words in a text are, on average, more frequent (fewer rare words), and a low MLWF number indicates that the words are, on average, less frequent (more rare words). Other than that guideline, the vocabulary measure of the Lexile is hard to interpret. For interested readers, additional information on interpreting the MLWF is available in a previous essay.6
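    In schematic form, the two components combine roughly as shown below. The actual regression coefficients are proprietary to MetaMetrics, so α, β, and γ are placeholders that record only the direction of each effect:

        Lexile ≈ α · MSL − β · MLWF + γ,   with α, β > 0

    That is, longer sentences (a higher MSL) push a text’s Lexile up, while more frequent vocabulary (a higher MLWF) pulls it down, which matches the interpretation guideline above.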

    The Study and Its Findings

    In this study, I analyzed four levels of texts in each of four topic sets from each of the three publishers. For Program A, which has 11 different levels, I took the middle text in each of the four bands designated as: (a) falls below, (b) approaches, (c) meets, and (d) exceeds. When Program B provided five texts per topic rather than its typical four, I took the first four levels. Program C consistently provides four levels of texts. The figures that follow give the average for the four texts of each publisher at each of the four levels.

    Lexiles: Figure 1 shows that average Lexiles increase with each succeeding level of complexity in each of the programs. These increases are not surprising—after all, the programs were written to comply with Lexile targets. The increases, however, are not of similar magnitude across the programs. In Program B, the increase between the Approaches and Meets levels is not substantial. Typically, though, the increases are at least 100 Lexiles—the amount of growth associated with about a grade level of reading.

    Figure 1. Lexiles

    Sentence Length: Sentence length is consistently higher from band to band, as shown in Figure 2. Indeed, the progression is almost identical to the progression of the Lexiles—a phenomenon that I have already discussed.

    Figure 2. Sentence Length

    Vocabulary: In looking at the vocabulary results in Figure 3, remember that higher numbers on the vocabulary measure indicate fewer rare words, while lower numbers indicate more rare words. Texts aimed at the readers with the most needs typically have fewer rare words. But in one program (Program B), the difference between Level 1 and Level 2 is minuscule. In another program (Program C), the vocabulary demands of Level 1 are more challenging than those of Levels 2 and 3. In all of the programs, the vocabulary tends to be at the challenging rather than the accessible end of the scale.

    Figure 3. Vocabulary

    Length of Text: Figure 4 shows that the programs differ substantially from one another in the amount of text that is provided. In Program A, the amount of text for the lowest readers differs dramatically from that for the three other groups. Whereas students in the Falls Below category read an average text of 200 words, the three other groups read texts that are approximately three times longer. The differences are not as dramatic in Program B, but the higher the level of students’ reading, the longer the texts they get.

    Figure 4. Length of Text

    In Table 1, I have taken an excerpt from a magazine article7 and illustrated how the amount of information decreases from level to level. The lengths of the illustrative texts follow the pattern in Program C. 

    In the Falls Below and Approaches levels, the content is cryptic. The details that might engage readers, such as the “cold fear” that Gordievsky felt, are gone. The content that explains why Gordievsky believes that he is in danger is gone. Students who receive the Falls Below text would be hard pressed to engage in critical thinking or inquiry about what they have read.

    Table 1

    Illustrative Texts for Different Levels

    For each level below, the features of the illustrative text are listed first, followed by the illustration itself.13

    Exceeds

    Lexile: 1140; Mean Sentence Length: 19.13; Mean Vocabulary: 3.43; Number of Words: 153

      On May 17, 1985, Oleg Gordievsky was at the pinnacle of his career. A skilled intelligence officer, he had been promoted a few months before to rezident, or chief, of the KGB station in the British capital. Moscow seemed to have no clue he’d been secretly working for MI6, the British secret intelligence service, for 11 years.

      That Friday, Gordievsky received a cable ordering him to report to Moscow “urgently” to confirm his promotion and meet with the KGB’s two highest officials. “Cold fear started to run down my back because I knew it was a death sentence,” he told me decades later. 

      He’d been back at headquarters in Moscow only four months earlier, and all seemed well. Now, he feared, the KGB’s counterspies had become suspicious and were recalling him to confront him. If he refused the summons, he would destroy his career but, if he returned home, he could be shot.

    Meets

    Lexile: 920; Mean Sentence Length: 14.00; Mean Vocabulary: 3.48; Number of Words: 112

      On May 17, 1985, Oleg Gordievsky was an intelligence officer who had just become the chief of the KGB station in London. Moscow seemed to have no clue he’d been secretly working for MI6, the British secret intelligence service, for 11 years.

      That Friday, Gordievsky received an order to report to Moscow “urgently.” “Cold fear ran down my back because I knew it was a death sentence,” he told me. All had seemed well when he had been at headquarters in Moscow four months earlier. Now he feared that the KGB had become suspicious. If he refused the summons, he would destroy his career. If he returned home, he could be shot. 

    Approaches

    Lexile: 810; Mean Sentence Length: 11.17; Mean Vocabulary: 3.37; Number of Words: 67

      On May 17, 1985, Oleg Gordievsky had just become chief of the KGB station in London. Moscow did not seem to know he had been working for the British secret intelligence service for 11 years.

      That Friday, Gordievsky was ordered to return to Moscow “urgently.” He had been in Moscow four months earlier. Everything had seemed fine then. Now he feared that the KGB had become suspicious. 

    Falls Below

    Lexile: 440; Mean Sentence Length: 8.25; Mean Vocabulary: 3.46; Number of Words: 33

      Oleg Gordievsky was the head of the KGB in London. He also worked as a British spy. 

      One day, Gordievsky was told to return to Moscow. Had the KGB found out his secret? 

    Examining the Findings Relative to Research

    What do we know about sentence length? Sentence length is the variable, along with Lexiles, that differs substantially from level to level. Is there any evidence that reading short sentences increases students’ capacity? No. In fact, the answer from research8 indicates the opposite. Short, choppy sentences can impede comprehension. Why might that be? When sentences have been chopped up, connectives such as because, after, and although have been eliminated, and readers are left to infer the causal and temporal relationships that those words once made explicit. Many struggling readers have a hard time making such inferences.9

    Longer sentences may be associated with harder texts, but correlation does not indicate causation. To date, no research has shown that giving students texts with short sentences aids comprehension.

    What do we know about vocabulary? Vocabulary is essential to reading comprehension. Vocabulary represents knowledge, and the words and information that students have about the world around them predict how well they comprehend.

    The Lexile Framework uses mean word frequency to represent vocabulary development: Texts with a higher number of frequent words are regarded as easier to comprehend than texts with a lower number of frequent words. Word frequency, however, is a difficult construct on which to build vocabulary instruction. Some frequent words have numerous meanings and take on different parts of speech, making them quite complex (e.g., page).

    What do we know about reading volume? In 1977, Richard Allington asked the question, “If they don’t read much, how they ever gonna get good?”10 The same question applies to the opportunities to read provided by these programs. The students who read the Falls Below texts read about half as much as their peers who receive the Meets or Exceeds texts.

    Is there evidence to indicate that reading less is the way to gain capacity? Quite the contrary.11 Without a steady and appropriate reading diet, students will simply not grow as readers. It’s as simple as that. 

    Conclusion

    For struggling readers, two features of text matter a great deal: the vocabulary and the volume of reading. The texts for struggling readers in these leveled-text programs go in the opposite direction of what struggling readers need. Rather than getting texts with more accessible vocabulary, they get texts in which the ideas have been sacrificed or simplified to hit a target text complexity.

    Is it possible to have texts with similar content but that support struggling readers with sufficient amounts of text and appropriate vocabulary? The answer is yes. The StepReads12 project of ReadWorks.org is doing just that. And the good news about StepReads is that all texts are entirely free! These texts will be the focus of a future essay.

    References

    1. Stenner, A. J., Burdick, H., Sanford, E. E., & Burdick, D. S. (2007). The Lexile framework for reading (Technical report). Durham, NC: MetaMetrics, p. 6.

    2. Hiebert, E. H. (2012, November 30). Readability formulas and text complexity. Paper presented at the annual meeting of the Literacy Research Association, San Diego, CA.

    3. NGA & CCSSO. (2010). Common Core State Standards for English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects with Appendices A–C. Washington, DC.

    4. Hiebert, E. H. (2012, June 26). Syntax and text complexity: A classic text goes from college-career level to first grade (Frankly Freddy blog). Retrieved from https://textproject.org/topics/beginning-reading-reading-automaticityfluency-and-core-vocabulary/

    5. Zeno, S. M., Ivens, S. H., Millard, R. T., & Duvvuri, R. (1995). The educator’s word frequency guide. Brewster, NY: Touchstone Applied Science Associates, Inc.

    6. Hiebert, E. H. (2012, June 12). Teaching text complexity: Why look at word frequency? (Frankly Freddy blog). Retrieved from https://textproject.org/library/frankly-freddy/teaching-complex-text-why-look-at-word-frequency/

    7. Wise, D. (2015, November). The phantom menace. Smithsonian, 46(7), p. 28.

    8. McNamara, D. S., Kintsch, E., Songer, N. B., & Kintsch, W. (1996). Are good texts always better? Interactions of text coherence, background knowledge, and levels of understanding in learning from text. Cognition and Instruction, 14(1), 1–43.

    9. Pearson, P. D. (1974). The effects of grammatical complexity on children’s comprehension, recall, and conception of certain semantic relations. Reading Research Quarterly, 10, 155–192.

    10. Allington, R. L. (1977). If they don’t read much, how they ever gonna get good? Journal of Reading, 21(1), 57–61.

    11. Rasinski, T., Samuels, S. J., Hiebert, E., Petscher, Y., & Feller, K. (2011). The relationship between a silent reading fluency instructional protocol on students’ reading comprehension and achievement in an urban school setting. Reading Psychology, 34(1), 76–93.

    12. StepReads can be found at readworks.org

    13. Wise, D. (2015, November). Thirty years later, we still don’t truly know who betrayed these spies. Smithsonian.com. Retrieved December 5, 2017, from http://www.smithsonianmag.com/history/still-unexplained-cold-war-fbi-cia-180956969/#TeqYKYrLuaF2qTgv.99