Apparently it is not just a few community observers who question the work of the public schools. U.S. Secretary of Education Arne Duncan issued the following statement on Mapping State Proficiency Standards onto NAEP Scales: 2005-2007, which the National Center for Education Statistics released today:
Today's study confirms what we've known for a long time: States are setting the bar too low. In all but a few cases, states aren't expecting students to meet NAEP's standard of proficiency. Far too many states are telling students that they are proficient when they actually are performing below NAEP's basic level. At a time when we should be raising standards to compete in the global economy, more states are lowering the bar than raising it. We're lying to our children when we tell them they're proficient but they're not achieving at a level that will prepare them for success once they graduate.
Just to be clear, this issue is distinct from the problems at Isaac E. Young Middle School, where there are wide disparities between test data that are and are not reported as part of determining Adequate Yearly Progress. This report shows that the districtwide 7-8% increase in test scores is a function of lowering the bar. The 25-30% increase in test scores at Isaac Young is primarily the result of plain old fraud, with only a small percentage explained by the State creating easier tests.
The report's Executive Summary explains the method behind it.
Since 2003, the National Center for Education Statistics (NCES) has sponsored the development of a method for mapping each state’s standard for proficient performance onto a common scale—the achievement scale of the National Assessment of Educational Progress (NAEP). When states’ standards are placed onto the NAEP reading or mathematics scales, the level of achievement required for proficient performance in one state can then be compared with the level of achievement required in another state. This allows one to compare the standards for proficiency across states.
The mapping procedure offers an approximate way to assess the relative rigor of the states’ adequate yearly progress (AYP) standards established under the No Child Left Behind Act of 2001. Once mapped, the NAEP scale equivalent score representing the state’s proficiency standards can be compared to indicate the relative rigor of those standards. The term rigor as used here does not imply a judgment about state standards. Rather, it is intended to be descriptive of state-to-state variation in the location of the state standards on a common metric.
The irony of a school district that has plastered Obama's image in every nook and cranny of their buildings being slammed by this SecEd is rich.
The report notes that although the NAEP assessments in reading and mathematics did not change between 2005 and 2007, some states made changes to their state assessments in these subjects during the same period, changes substantial enough that these states indicated their 2005 scores were not comparable to their 2007 scores. If you look at Page 28 you will see that New York is one of those states for Reading, and if you look at Page 35 you will see the same for Mathematics.
There is additional data showing that the lowering of standards in New York from 2005 to 2007 is part of a trend that has continued ever since, so future studies will almost certainly show that 2008 and 2009 test data are not comparable to 2006 and 2007 test data.
Every presentation of state test data by Dr. Korostoff makes exactly the sorts of comparisons that the U.S. Department of Education says cannot be made, because the tests keep getting easier to the point that comparing results from one year to the next is meaningless.