The headline on the national Communities in Schools press release last February was eye-catching: "Evaluation Ranks Communities In Schools as Most Effective Dropout Prevention Organization in America." The ranking, it said, was based on "the largest and most comprehensive evaluation of dropout prevention programs ever completed."
The attached 34-page report was much less dramatic. There was no independent entity doing long-term tracking of participants in various dropout-prevention programs. Instead, CIS had commissioned ICF International to look at how well the schools participating in the program are carrying out its work and to suggest ways to improve the system. ICF also pulled in some effectiveness data, including randomized controlled trials in three states (only two involved high schools) and dropout and graduation rates from four other programs listed on the federal What Works Clearinghouse.
For those of you who want to delve in and draw your own conclusions, a description of the data sources is on pages 4-5 of the report. The comparison with four other programs is on pages 12-13, and data on differences between CIS schools and comparison schools not using the program is on pages 22-23.
My conclusion: There was nothing clear enough to justify a "best in the nation" headline, or even to grab the attention of general readers. I didn't loop back to this until Charlotte-Mecklenburg Schools repeated the claim as part of its 2012-13 budget pitch. At that point, I was surprised to discover that the What Works Clearinghouse does do effectiveness ratings on dropout prevention programs, but Communities in Schools isn't among them.
Talking to folks in the U.S. Department of Education, which runs the clearinghouse, led me to Mark Dynarski, a former clearinghouse director who developed the dropout-prevention effectiveness tracking. Now a consultant, he was quick to tell me he has "a financial relationship" with ICF and with the philanthropy that paid for the CIS study. But he wasn't complimentary about what he saw: Small numbers, "cherry-picking" of findings and a marketing spin on the conclusion. He said you might expect a group that started in 1974 to have extensive long-term studies: "They sure don't study themselves a lot, do they?"
Not that it's unusual to find a dropout prevention group with sparse data. Dynarski agrees with a journal article by John Tyler and Magnus Lofstrom saying such data is "woefully inadequate," even though boosting graduation rates is a top priority for public schools across the country. "The Dropout Prevention Center/Network lists hundreds of dropout prevention programs in its online database of 'model programs.' Only relatively few of these programs, however, have been rigorously evaluated for effectiveness," they write.