February 27, 2008
Common Core off to a Dubious Start
February 26, 2008
NYT: History Survey Stumps US Teens
Fewer than half of American teenagers who were asked basic questions about history and literature during a recent telephone survey knew when the Civil War was fought, and one-quarter thought that Christopher Columbus sailed to the New World sometime after 1750, not in 1492.
The results of the survey, released Tuesday, demonstrate that a significant proportion of American teenagers live in “stunning ignorance” of history and literature, according to the group that commissioned it. Known as Common Core, the organization describes itself as a new, nonpartisan research and advocacy organization that will press for more teaching of the liberal arts in American public schools.
February 18, 2008
'Best and brightest' dim on history
College freshmen earned an average grade of 'F' — or just 53.7 percent — when asked a series of questions about U.S. presidents and key historical events from their times in office. After four years of college, their knowledge didn't improve.
College seniors got just 55.4 percent on the 60-question quiz given to 14,000 students at 50 colleges and universities around the country as part of a study designed to test their knowledge of America's history, government, international relations and market economy.
"In this election, we are focusing on the youth vote, and it's great that more kids are coming out to vote. But we worry that it's become a kind of cult of personality," says Richard Brake, director of the Lehrman American Studies Center at the Intercollegiate Studies Institute in Wilmington, Del., which commissioned the civic learning study, conducted by researchers at the University of Connecticut's Department of Public Policy...
February 17, 2008
WaPo: The Knowledge Connection (E.D. Hirsch)
February 12, 2008
EdWeek: U.S. ‘Dashboards’ Offer Data on State Achievement
More than 20 years ago, when federal officials sought to publicize data portraying the relative quality of the states’ school systems, the best statistics they could find were scores on college-admissions tests and state-reported graduation rates.
Now that states have results from their own tests and state-level results from the National Assessment of Educational Progress—as well as a wealth of other data—the Department of Education is publishing a two-page report on each state that gives a glimpse of the quality of its K-12 schools.
The reports also should answer the public’s questions about the success of the No Child Left Behind Act in promoting increased student achievement, said Secretary of Education Margaret Spellings.
The data reports show that gaps in achievement between minority and white students are narrowing, the secretary said. And they show the proportion of schools making their achievement targets under the NCLB law; nationally, the rate is about 70 percent.
“When they see in black and white what [NCLB] means in their own backyard, … I think it’s very useful for parents and policymakers,” Ms. Spellings said in an interview last week.
Data Quality Improves
The amount and quality of data available today represent a dramatic improvement over what was available in the so-called “wall chart,” a state-by-state compilation of resource inputs, performance outcomes, and population characteristics that the Education Department published for six years, starting in 1984 under Secretary Terrel H. Bell. ("E.D. Issues Study Ranking States On Education," Jan. 11, 1984.)
Educators challenged the validity of those comparisons, and Mr. Bell at the time acknowledged the limitations of the data. But he defended such a “scoreboard” as a way of raising awareness of the need for school improvement.
Since 1990, NAEP has published state-by-state results for 4th graders and 8th graders from its reading, mathematics, science, and writing tests based on a sampling of student achievement. States voluntarily participated in the NAEP tests until 2003, when the NCLB law required them to join the assessment for the reading and math exams.
In addition, the NCLB law requires states to create their own tests to measure the reading skills and mathematical abilities of students in grades 3-8 and once in high school.
Those data are the heart of the new Education Department reports, which the agency is calling “dashboards.” Like the dashboard of a car, the reports give people “pieces of information … in a way that are quickly consumable and usable,” said Secretary Spellings, who unveiled the reports last month in a speech at the National Press Club in Washington.
A chart in each state’s report compares how its students are doing on the NAEP tests and the state’s own exams. The chart also disaggregates the data by the performance of white, African-American, Hispanic, and low-income students.
Each state’s report lists the high school graduation rate as reported by the state. It compares that rate with the so-called average freshman graduation rate, which estimates the percentage of 9th graders who earn their diplomas within four years in that state.
While the data aren’t exhaustive, they are far more extensive than what the federal government was able to publish shortly after the 1983 report helped set off a wave of school reforms, said Chester E. Finn Jr. Mr. Finn was an assistant secretary of education under Mr. Bell’s successor, William J. Bennett. Lamar Alexander ceased publishing the “wall chart” shortly after he became secretary of education in 1991.
“We have had approximately two decades of movement toward state-level results data that can be compared,” said Mr. Finn, who is the president of the Thomas B. Fordham Foundation, a Washington-based think tank that supports accountability measures and school choice. “I expect we’ll see . . . the data will continue to get better, faster, more precise, more fine grained, better able to be analyzed in various useful ways by various constituencies.”
Long before the department first published its dashboards, Mr. Finn said, states, nonprofits, and companies published a variety of Web sites that allow users to find data and, often, compare schools, districts, and states on measures such as student achievement, spending, and other data.
Giving NCLB Credit
The increase in the amount and quality of data is mostly the outgrowth of the NCLB law, said Chrys Dougherty, the research director of the National Center on Educational Accountability, an Austin, Texas-based nonprofit group that supports data-based efforts to improve schools.
“It’s something people need to look at,” Mr. Dougherty said. “It gives you an idea how tough their state test is or how high the standards are.”
But Mr. Dougherty and other advocates of such use of data are laying the groundwork to help states measure whether their students are graduating prepared for college or the workforce.
ACT Inc. has developed a method for reporting such percentages based on students’ scores on the ACT college-admissions exam, which the Iowa City, Iowa-based nonprofit produces. States can calculate similar percentages if they benchmark their high school exams to the ACT’s standards.
“That’s an indicator that’s coming down the pike,” Mr. Dougherty said. “But somebody has to do the data analysis.”
'Troublemaker' Finn Recalls Setting 'Proficiency' Standard
In the NCLB world, Finn may be the reason why we're so concerned about "proficiency." Back in the 1980s, when he was an assistant secretary at the Department of Education, Finn complained that the National Assessment of Educational Progress didn't deliver meaningful results. The public, he said, couldn't make sense of an opaque scale score for the nation. He led the lobbying effort to persuade Congress to create a version of NAEP that delivered results for every state. Once it passed, Finn became the first chairman of the National Assessment Governing Board and served on it for eight years. He was the architect of NAEP's performance levels ("advanced," "proficient," and "basic"). When Congress was looking to set a goal for student achievement, it chose proficiency. To hold states up to a national standard, it required states to participate in NAEP's reading and math tests and increased the frequency of those tests to every other year. Today, NAEP is the most cited source for declaring states' definitions of proficiency to be too easy.
Finn covers all of this in "Troublemaker," and he acknowledges that the achievement levels have been controversial. But he leaves out that their validity has been questioned by the research community. In 1991, NAGB's own consultants said the process "must be viewed as insufficiently tested and validated, politically dominated, and of questionable credibility." NAGB fired the consultants, according to an Education Week story. In 1999, a National Academy of Sciences report called the process of setting the levels "fundamentally flawed." To this day, every NAEP report includes a footnote saying that the achievement levels are "developmental."
Economy brakes school spending
Board members in the District last week for an annual conference said shortfalls in state budgets, coupled with pessimistic predictions about local revenues, are forcing them to consider ways to trim next year's budgets, which they are working on now.
About half of the states are facing projected budget shortfalls, according to the Center on Budget and Policy Priorities, a District-based liberal research group.
The downturn in the housing market has precipitated a drop in state revenue from sales taxes associated with construction materials, furniture and other goods, said Liz McNichol, senior fellow at the center. She said recent job losses around the country also could catalyze a reduction in income taxes collected by states...
