Placement Tests and the Unravelling of College Developmental Programs

By Jim Shimabukuro

With so many developmental programs in statewide community college systems (SWCCSs) reliant on high-stakes placement exams such as COMPASS and ACCUPLACER, the recent reports out of Columbia University’s Community College Research Center (CCRC) must have come as a shock. For decades, these tests have shaped students’ academic careers, and now a set of studies questions the validity of using them to predict performance.

At issue is much more than the efficacy of placement tests. Because the scores are the thread that runs through, binds, and defines the entire developmental structure, there’s really no telling what will happen as Judith Scott-Clayton and her colleagues tug, bit by bit, at the loose end they’ve discovered. Scott-Clayton, an assistant professor of economics and education and a senior research associate at CCRC, which is part of Teachers College, is the prominent figure in this effort to take a long, hard look at a system that seems to be broken because of a fundamental flaw: placement testing.

In one of the studies, “Development, Discouragement, or Diversion? New Evidence on the Effects of College Remediation” (National Bureau of Economic Research, Aug. 2012), Scott-Clayton and Olga Rodriguez (also Teachers College) cite a finding that “remediation does not develop students’ skills sufficiently to increase their rates of college success” (2). Furthermore, they say that many students “diverted” into developmental courses could have done well had they gone directly into the college-level courses:

First, we find that potentially one-quarter of students diverted from college-level courses in math, and up to 70 percent of those diverted in reading, would have earned a B or better in the relevant college course. Further, our analysis of impacts by prior predicted dropout risk suggests that diversionary effects are largest for the lowest-risk students, and we fail to find positive effects for any risk subgroup. (3)

In fact, these students who weren’t at risk in the first place may be at greater risk of dropping out as a result of this needless diversion. 

In another study, “Do High-Stakes Placement Exams Predict College Success?” (CCRC Working Paper No. 41, Feb. 2012), Scott-Clayton continues to spotlight the disconnects between the use of placement test results and desired outcomes. She cites “a recent study of over 250,000 students at 57 community colleges across the country,” which reported “that 59 percent were referred to developmental math and 33 percent were referred to developmental English” (1).

The study “found that students who ignored a remedial placement and instead enrolled directly in a college-level class had slightly lower success rates than those who placed directly into college-level, but substantially higher success rates than those who complied with their remedial placement, because relatively few students who entered remediation ever even attempted the college-level course” (1). Here are some other excerpts from her study:

In English, the sole benefit of placement exams appears to be to increase the success rates in college-level coursework, among those placing directly into college-level, by 11 percentage points (from 61 percent to 72 percent). This measure may be particularly important to instructors, who may find it disruptive if too many students in their classes have very low probabilities of success. But these tests generate virtually no reduction in the overall severe error rate (in other words, while the placement tests reduce severe overplacements, they increase severe underplacements by the same amount), while at the same time dramatically increasing the proportion of students assigned to remediation and reducing the overall proportion immediately succeeding at the college-level. (27)

Reading/writing scores explain less than 2 percent of the variation in first college-level English grades. (32)

If remediation is effective, then it may make sense to have higher rates of remediation in order to maintain high success rates in the college-level course. However, existing research suggests this is not the case, at least for students scoring near the remediation cutoff. (36)

Allowing more students directly into college-level coursework (but perhaps offering different sections of college-level courses, some of which might include supplementary instruction or extra tutoring), could substantially increase the numbers of students who complete college-level coursework in the first semester, even if pass rates in those courses decline. (37)

Allowing students to test into college-level work using the best of either their placement scores or an index of their high school background could markedly lower the remediation rate without compromising college-level success rates. (38)

In a final study, “Predicting Success in College: The Importance of Placement Tests and High School Transcripts” (CCRC Working Paper No. 42, Feb. 2012), Clive R. Belfield (Queens College, CUNY) and Peter M. Crosta (CCRC) conducted “a replication and extension of Scott-Clayton’s analysis in a different context.” Here are some excerpts:

We find that placement tests do not yield strong predictions of how students will perform in college. Placement test scores are positively—but weakly—associated with college grade point average (GPA). (abstract)

According to Horn and Nevill (2006; see Table 6.2), approximately two fifths of community college students take at least one remedial or developmental education course; other estimates are even higher (e.g., Office of Program Policy Analysis and Government Accountability, 2006). (1)

Bailey, Jeong, and Cho (2010) estimate that up to two thirds of these students fail to complete the required DE [developmental ed] course sequence that would enable them to take college-level classes. Over their college careers, only a fraction of DE students graduate with an award. (1)

Looking across the Virginia Community College System, Jenkins, Jaggars, and Roksa (2009) found weak correlations between placement test scores and student pass rates for both developmental courses and college-level gatekeeper courses. Jenkins et al. also found no correlation between reading and writing placement test scores and performance in gatekeeper college-level English courses after adjusting for student and institutional characteristics; however, math scores and success in college-level math were positively associated. (5)

First, overall correlations between placement test scores and DE grades are low. For Math 1, for example, the average correlation is 0.17. For English 1, the correlation is even lower, at 0.06. (13)

The predictive power of these placement tests on college GPA was very low; the best-predicting test (COMPASS Reading) explained only 5 percent of the variation in college GPA. (15)

Strikingly, placement scores explain almost none—and in some cases actually none—of the variation in college-level grades. (22)

More importantly, the tests do not have much explanatory power across a range of measures of performance including college GPA, credit accumulation, and success in gatekeeper English and math classes. (39)
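The “variation explained” figures quoted above follow directly from the reported correlations: squaring a Pearson correlation coefficient r gives the coefficient of determination r², the fraction of variance in the outcome that a simple linear predictor accounts for. As a minimal sketch (the labels are paraphrases of the study’s measures, not its exact variable names), the correlations Belfield and Crosta report translate into strikingly small shares of explained variance:

```python
def variance_explained(r: float) -> float:
    """Fraction of outcome variance explained by a predictor correlated at r."""
    return r ** 2

# Correlations reported by Belfield and Crosta (2012):
for label, r in [("Math 1 placement score vs. DE grade", 0.17),
                 ("English 1 placement score vs. DE grade", 0.06)]:
    print(f"{label}: r = {r:.2f}, variance explained = {variance_explained(r):.1%}")
# 0.17 squared is about 2.9%; 0.06 squared is about 0.4%.
```

Read the other way, even the best-predicting test in the study (COMPASS Reading, at 5 percent of GPA variance explained) corresponds to a correlation of only about 0.22.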

Unless colleges ignore these findings and continue business as usual, the coming months will probably see a flurry of activity to clarify, adjust, and change the policies that govern developmental programs, and, hopefully, the results will translate into significantly higher retention and program-completion rates.

2 Responses

  1. Very excellent analysis, Jim. (Am I allowed grammatically to say that?)

    The emperor (in this case college placement tests) has no clothes. It’s about time.

    You have used this information to explain to me why 1/3 of my freshman class dropped out despite having stratospheric SAT scores. One-half of my class scored 800 on the math SAT — before re-centering.

    Most dropped out at the end of the freshman year. These students were interviewed too. Their high-school GPAs were taken into account. Today, Caltech still has the highest freshman SAT scores of any U.S. college or university. They still have a high dropout rate and have used intensive counseling to lower it.

    The ineffectiveness of placement exams points to the entire world of high-stakes testing and harks back to earlier discussions that began with an article by John Adsit. If high-stakes tests are not predictive of success, how can we believe that they’re a valid measure of learning? If you learned, really learned well, then you’ll succeed. Yes, unexpected events may intervene, but you should still have a success rate better than 2/3 and correlation that exceeds 5%.

    The only thing that a high score on a high-stakes test proves is that you’re good (on that day) at taking the test. However, test-taking is not a very valuable skill in the real world.

    I have come to the belief that it’s not the test itself that’s at fault. That is, you cannot change the outcome by changing the test as many suggest. The problem, they say, is with the test. If we had a test that tested “21st-century skill,” then it would be fine. Instead, I say that basing assessment on a single, large test, no matter how constructed, is wrong. It cannot be right.

    Some will argue that you have to test in order to assign grades. Nonsense. During a course, students do work. That work can be analyzed. Small “tests” of some nature can take place frequently as a natural part of learning progress. A “test” could be an essay or a project or even a written quiz. It even could be multiple-choice. In my opinion, a mixture would be best, but that’s a future discussion that’s a bit off this particular topic.

