In the new SAT, analyzing a scatterplot of Florida manatee counts and citing evidence to back up answers will trump knowledge of arcane vocabulary.
Draft questions released Wednesday by the College Board, owner of the entrance exam, illustrate the scope of the test’s first redesign since 2005. The new model, which will be implemented in 2016, aims to show students’ mastery of concepts taught in high school rather than measure skills and words they might rarely or never use in real life.
The SAT, which has been losing market share to competitor ACT Inc., is repositioning itself as an achievement test, using “real-world applications” of math, reading and science to identify students ready for college. In its initial unveiling of the overhaul last month, the College Board said the mandatory essay portion, added in 2005, will become optional, students will no longer be penalized for wrong answers and scoring will return to a scale of 1,600 from 2,400.
“This will be the first admission exam that requires students to cite evidence in support of their understanding of texts in both reading and writing,” College Board President David Coleman and Chief of Assessment Cynthia Schmeiser wrote in a letter accompanying Wednesday’s examples.
Every question will go through “extensive reviews and pretesting” to assure clarity and fairness, they said. Coleman acknowledged last month that students and their families are skeptical that the SAT and ACT reflect their best work.
Under the reading portion of the new test, students will be asked to analyze “relevant” words in context. One criticism of the current and previous tests has been the use of esoteric vocabulary that a typical 17-year-old test-taker wouldn’t use and would acquire only through rote memorization. Sample questions from practice tests on the New York-based College Board’s website list word choices including “sagacious,” “trenchant” and “raconteur.”
An example in the new test would have students reading the following passage: “The coming decades will likely see more intense clustering of jobs, innovation, and productivity in a smaller number of bigger cities and city-regions. Some regions could end up bloated beyond the capacity of their infrastructure, while others struggle, their promise stymied by inadequate human or other resources.”
They would then need to choose what the word “intense” most closely means: A) emotional, B) concentrated, C) brilliant, or D) determined. The correct answer is B.
That sample question doesn’t appear much different from one on the current test, said Chris Falcinelli, founder of Focus Tutors & Test Prep in Brooklyn, N.Y. “Less esoteric and more relevant” means lower-level and easier words, he said.
“Is it really bad that a kid might need to know the words ‘sanguine’ or ‘redoubtable’ in order to get an 800 on the Reading SAT?” Falcinelli said. “The vast majority of problems on the test don’t turn on the difficulty of the vocab. They turn on how well the tester has read them.”
The math section will measure problem solving and data analysis, including use of ratios, percentages and proportional reasoning.
One sample question is based on real-world methods used by the U.S. Fish and Wildlife Service to count the manatee, a sea mammal. Students would demonstrate understanding of a scatterplot chart of the manatee population off Florida by calculating the average yearly increase in the animal’s numbers.
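The arithmetic behind such a question can be sketched in a few lines of Python. The yearly counts below are invented for illustration and are not the actual Fish and Wildlife Service survey data; the sketch only shows the calculation the question asks for, total change divided by the number of years elapsed:

```python
# Hypothetical manatee counts read off a scatterplot (invented figures)
years = [2005, 2006, 2007, 2008, 2009]
counts = [3100, 3300, 3250, 3600, 3800]

# Average yearly increase: overall change divided by the span in years
total_change = counts[-1] - counts[0]          # 700 animals
span = years[-1] - years[0]                    # 4 years
average_yearly_increase = total_change / span  # 175.0 animals per year
print(average_yearly_increase)
```

With these made-up numbers the answer would be 175 animals per year, the slope of a line drawn from the first point to the last.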
The test will also promote what it calls founding documents, texts relevant to U.S. history or to “global conversations,” such as a 1974 speech by Congresswoman Barbara Jordan, delivered during impeachment hearings against President Richard Nixon, or President Abraham Lincoln’s Gettysburg Address.
One example shows how the word “dedicated” is used in different contexts in Lincoln’s 1863 speech.
“The Board is making a genuine effort to improve the test in the sense of making it more relevant to college work,” said Robert Sternberg, a psychologist and professor at Cornell University who has studied entrance exams. “To the extent that one wishes to predict freshman-year grades, it probably won’t make much difference because the SAT, ACT and all similar tests are really largely tests of general intelligence.”
Sternberg, who likened the redesign, set against past revisions, to a difference of “light and heavy makeup,” said he would like to see testing companies measure skills such as creative thinking, practical thinking or ethical reasoning.
Last year, for the first time, the SAT lost ground to ACT Inc. in the number of test takers. ACT, based in Iowa City, Iowa, reported that 1.8 million students in the class of 2013 took its test, an 8 percent increase from the previous year, topping the 1.66 million students who took the SAT.
The ACT already has an optional essay and doesn’t penalize for guessing.
“We see where they are headed, and those are features very much associated with the ACT,” said Paul Weeks, ACT’s vice president of customer engagement. “We will let people draw their own conclusions.”
Many of the changes to the SAT, originally called the Scholastic Aptitude Test, were made to win back market share from ACT, or American College Test, said Bob Schaeffer, a spokesman for FairTest, a Boston-based nonprofit group critical of standardized testing.
“The changes may make the SAT appear more consumer- friendly, but they do not make it a better test,” Schaeffer said. “Most of the revisions are marketing bells and whistles.”
He said the 2016 SAT won’t be better in terms of predicting undergraduate performance accurately, fairly and without susceptibility to manipulation via coaching.
The two exams have different roots. The SAT, developed as an aptitude test by a Princeton University psychology professor, was introduced in 1926. Most of its early takers applied to private schools. The ACT, long promoted as an achievement test measuring what students learn in school, started in 1959 and was developed by a University of Iowa education professor.
Thirteen states pay for the ACT as a statewide assessment, which may contribute to the gains in that test’s popularity. Four states – Hawaii, Louisiana, Montana and Utah – began requiring the test in 2013, and Alabama made it mandatory this year, said Ed Colby, an ACT spokesman. Three more states will begin next year, including Missouri and Wisconsin; the third hasn’t been announced.
ACT also said this week that 4,000 high school students took its test on a computer over the weekend, the first time a national undergraduate college admission exam had been administered that way. Testing via computer will initially be an option next year in schools that administer the exam to all students on a school day as part of a districtwide or statewide assessment, ACT said.