
High school exit exams are supposed to make sure students have truly "earned" their diplomas. But are they an accurate measure? And what happens to those students who fail? Associate Professor Rob Warren makes it his mission to find out.

Do they make the grade?

U professor researches the merit of high school exit exams

In addition to dealing with acne, first heartbreaks, and peer pressure at its worst, high school students now have something more substantial to fret about: exit exams. More than half of U.S. states now require students to pass exams before they can be granted high school diplomas. And like so many other issues involving public education (prayer in schools and No Child Left Behind, for example), mandatory exit exams are extremely controversial.

The exams are designed to ensure that students who receive high school diplomas have acquired at least basic reading, math, and science skills. Proponents of the exams see them as a way to make sure that students apply themselves throughout their high school careers--making an effort even when they're not being "graded." Supposedly, the exams also guard against diploma inflation, or the devaluation of a high school degree by a preponderance of unqualified recipients. Meanwhile, exam detractors argue that the tests measure only minimal levels of competence and--because teachers feel pressured to "teach to the test"--diminish the amount of "real learning" that happens in the classroom.

With a high school diploma riding on the results of exit exams, there is much at stake. Yet despite animated debate, until recently little research had been done on their actual effectiveness. To University of Minnesota sociologist Rob Warren, that seemed like a monumental oversight. So he set out to look for the facts behind the spin. "I've always done work on social inequality and education," he says. "And when I saw that the [political] discussion about exit exams was going on in the absence of real evidence, I decided that I needed to do that work."

Going back in time

Warren's first challenge was simply acquiring the relevant data--which was no easy task. Although some states have been conducting exit exams for several decades, in many cases they hadn't kept track of details that Warren needed for context.

"Researchers hadn't dug into minutiae of education policy in Utah in 1979, for example," he says. "So I spent a year and a half just finding out which states have had exit exam policies for which graduating classes from the late 1970s until today." Warren has made his findings publicly available so other researchers won't have to spend their time collecting this basic data.

Some of the results of Warren's research were unsurprising. For example, he concluded that states with mandatory exit exams generally have slightly lower graduation rates than states that do not require the exams. Such a discrepancy is to be expected; it wasn't until Warren put this information in context that he saw reason for concern.

"Current policies on exit exams don't have the positive consequences that some people claim, but they do have the negative consequences that some people fear."

"We might be willing to accept a higher dropout rate if the majority of kids are learning more and are better prepared for the labor market," he says. "But we've found almost no impact at all." According to Warren's research, students in states with exit exams perform almost identically on nationwide standardized tests as students from states without the exams.

That's unwelcome news for proponents of the exams. As Warren explains it, "Even if we can't show that these students are smarter or better citizens, we would hope that they would at least do better on standardized tests."

The disappearing diplomas

Each year, the federal government releases statistics about high school graduation rates, and each year, the national average hovers around 90 percent. That's an encouraging number--if it's accurate. But Rob Warren has some concerns.

"The data being used are poorly suited for measuring this information," he says, noting that the data derive from notoriously unreliable self-reports of high school graduation and also count GEDs as equivalent to high school diplomas.

Statistics culled from other sources suggest that the percentage of those who don't earn a diploma may be as high as 30 percent. "If you do something simple, like look at how many ninth graders there were four years ago and how many students graduate this year, you'll find that you've lost about 30 percent," Warren says.
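
To make that arithmetic concrete (with purely illustrative figures, not numbers from Warren's study): if a state enrolled 100,000 ninth graders four years ago and awards 70,000 diplomas this year, the simple cohort estimate of on-time completion is 70,000 divided by 100,000, or 70 percent--implying that roughly 30 percent of the cohort was "lost" along the way, whether by dropping out or for reasons the simple count can't distinguish.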

Also troubling is the fact that the students who don't graduate aren't randomly distributed across the population, but are concentrated in cities and among minorities.

Meanwhile, students who fail the exit exam--of whom there are tens of thousands each year--suffer serious consequences. For some, failing the test means losing the chance to graduate with their classmates. For others, it means losing the chance to graduate altogether. In the latter case, students are labeled "high school dropouts," joining a demographic that faces grimmer economic prospects, a higher incidence of criminal activity, and greater potential for health problems.

An imperfect measure

Taken as a whole, the results of Warren's study suggest that the tests aren't accomplishing the purpose for which they were created. "Current policies on exit exams don't have the positive consequences that some people claim," he says, "but they do have the negative consequences that some people fear."

Warren sees no easy solutions for policy makers--but he does believe that change is in order.

"Getting rid of exit exams is not the solution," he says. "And lowering the standards so that graduation rates are politically palatable isn't great, either. But if we want to set a higher bar for students, we might need to consider more complete assessments that aren't just achievement tests."

Portfolios of students' work, writing assessments, and other alternative skill evaluators might provide a more complete picture of how prepared students are to enter college or the workforce--although Warren acknowledges that the cost would be significant. The key to a more effective assessment, he asserts, is creating a better balance in how students are evaluated. Such a solution might not end lengthy standardized exams, but it would put less weight on a single outcome.