Summary of State Graduation Requirements

State graduation requirements around the country can be summarized as follows:

  • Credit/Course Completion Requirements. This is the most common graduation requirement. 47 states require students to pass a certain number of courses (and thus accumulate credits) aligned to state learning standards in prescribed disciplines in order to graduate with a state-issued high school diploma. In each state, the determination of whether a student has passed a course (or accumulated credit) is made at the school/local level. For the class of 2024, course passage is the sole substantive graduation requirement in 30 states. Ten of those states do require students to take a civics exam before graduating.
  • High-Stakes Exit Exams. 9 states require students to take and pass between 2 and 5 subject-matter exams in order to graduate with a state-issued diploma. For the graduating class of 2024, those states are Florida, Louisiana, Massachusetts, New Jersey, New York, Ohio, Texas, Virginia, and Wyoming.
  • End-of-Course Exams. 6 states (Georgia, Illinois, Maryland, Mississippi, Missouri, and Tennessee) require students to take certain statewide end-of-course exams in order to graduate. In some of these states, the end-of-course exam must count for a certain percentage of the final course grade. A final passing grade in the course, as determined at the school/local level (accumulation of the credit), is a diploma requirement.
  • Mastery/Proficiency/Career Readiness. In 10 states, students face an additional requirement beyond course passage or an exit exam to earn a diploma. These requirements take various forms of demonstrating competency, mastery, or college and career readiness. In most cases, the state allows multiple ways to meet the requirement, as designed and implemented by the local school district.
    • Locally designed and implemented. 6 states. In several states, the state has set a standard of mastery, competency, or graduation “readiness” and has asked local districts to devise their own way to meet it. In Connecticut, districts can choose their own assessments to measure students’ mastery-based learning. In Rhode Island, beginning with the class of 2028, localities must meet the state Readiness-Based Graduation Requirement. In Pennsylvania, districts rely on locally developed assessments to comply with state standards. In Vermont, students must demonstrate proficiency in a locally delineated set of content knowledge and skills connected to state standards that, when supplemented with any additional locally developed requirements, qualifies a student to earn a high school diploma; the mode of assessment is locally determined. In Colorado and New Mexico, local districts can select from a menu of options, including capstones, AP or IB tests, or locally developed assessments, to meet the readiness requirement.
    • State Competency Badges. 3 states. In Ohio, Indiana, and Nevada, students must demonstrate a certain number of post-secondary readiness competencies chosen from a menu of options and categories provided by the state and implemented locally.
    • ACT Passage with a State-Set Cutoff Score. 1 state. Alabama has required that, beginning with the graduating class of 2028, students “pass” the ACT with a cutoff score to be determined by the state as an indicator of “college and career readiness.”

OVERWHELMING MAJORITY OF U.S. COLLEGES AND UNIVERSITIES REMAIN ACT/SAT OPTIONAL OR TEST-BLIND/SCORE-FREE FOR FALL 2025


for further information:

Harry Feder, Esq. 917 273-8939              

Bob Schaeffer      239 699-0468


for immediate release, Wednesday, February 21, 2024
OVERWHELMING MAJORITY OF U.S. COLLEGES AND UNIVERSITIES

REMAIN ACT/SAT-OPTIONAL OR TEST-BLIND/SCORE-FREE FOR FALL 2025

More than 80% of U.S. four-year colleges and universities will not require applicants for fall 2025 admissions to submit ACT/SAT scores according to a new tally by FairTest, the National Center for Fair & Open Testing. That’s a total of at least 1,825 of the nation’s bachelor-degree granting institutions, with more schools extending test-optional policies every week.

“Despite a media frenzy around a single Ivy League school reinstating testing requirements, ACT/SAT-optional and test-blind/score-free policies remain the ‘new normal’ in undergraduate admissions,” explained FairTest Executive Director Harry Feder. “Test-optional policies continue to dominate at national universities, state flagships, and selective liberal arts colleges because they typically result in more applicants, academically stronger applicants and more diversity.”

A recent study of ACT/SAT score submission by the Common Application group of 1000+ colleges and universities found “more and more students choosing not to report than to report. Growth is meaningfully faster over the past year for students not reporting test scores . . .” (https://www.commonapp.org/files/Common-App-Deadline-Updates-2024.02.14.pdf)

FairTest Public Education Director Bob Schaeffer added, “High school students, parents, and counselors should understand that ACT/SAT scores will not be required by an overwhelming majority of undergraduate campuses for the foreseeable future. The introduction of the digital SAT will not change that reality because the revised test is not a better or fairer predictor of undergraduate success.” 

Among well-known institutions whose test-optional policies continue at least through fall 2025 are Columbia, Cornell, Emory, Harvard, Johns Hopkins, Notre Dame, Princeton, Stanford, the University of Chicago, and Vanderbilt. ACT/SAT scores will also not be required at such liberal arts colleges as Amherst, Middlebury, Swarthmore, Wesleyan, and Williams. Major public campuses in most states, including Colorado, Illinois, Missouri, Oregon, Utah, and Washington, remain test-optional for current high school seniors and juniors. The entire University of California and California State University systems are permanently test-blind/score-free.

FairTest’s frequently updated list of schools that do not require all or many applicants to submit ACT/SAT scores before admissions decisions are made is available free online at: https://fairtest.org/test-optional-list/

– 30 –

ACT/SAT-OPTIONAL & TEST-FREE UNDERGRADUATE ADMISSION BY THE NUMBERS

  1,075   ACT/SAT-optional schools pre-pandemic (as of March 15, 2020)
  1,700   schools did not require ACT/SAT scores for fall 2020
  1,775   schools did not require ACT/SAT scores for fall 2021
  1,825   schools did not require ACT/SAT scores for fall 2022
  1,904   schools did not require ACT/SAT scores for fall 2023
  2,025   schools did not require ACT/SAT scores for fall 2024
  1,825+  schools have already extended ACT/SAT-optional or test-blind/score-free admissions through at least fall 2025; about 200 schools have not yet announced their policies for the next admissions cycle, and more are extending every week
  1,700+  of these schools are “permanently” ACT/SAT-optional or test-blind/score-free
  2,278   total number of 4-year schools per USDOE National Center for Education Statistics (https://nces.ed.gov/programs/coe/indicator/csa)

The Misguided War on Test Optional

Akil Bello calls out the backlash against test-optional policies voiced in the New York Times for what it is: a misguided campaign to preserve the veneer of elitism at Ivy Plus institutions through a gatekeeping mechanism that overwhelmingly favors the wealthy and privileged who bask in that status.

https://www.insidehighered.com/opinion/views/2024/02/05/misguided-war-test-optional-opinion

Innovative Educational Assessments that Support Deeper Learning

FairTest explores efforts at the state and local level to assess student learning in ways that are more educationally beneficial and equitable than federally and state-mandated standardized tests. There are robust examples of systems of authentic assessment that can serve as models for spurring deeper learning.

https://fairtest.org/wp-content/uploads/2023/12/PBA-Advocacy-Document.docx.pdf

Interpreting PISA Results: It’s Poverty, Stupid (With a Bit of the iPhone)

The results of PISA 2022 should, like all standardized test results, be filtered through a dose of skepticism about the claims of the test producers and administrators. We must also carefully scrutinize “Chicken Little” claims in the media which is notorious for manufacturing and hyping education crises. Declines in standardized test scores have been the premise for all of the failed education reforms of the past forty years, from the publication of A Nation at Risk, through No Child Left Behind, the charter school movement, and now universal vouchers and privatization. We must guard against this trap yet again.

Scrutinizing the performance of the United States against other OECD countries leads to the unshocking conclusion that the PISA test is largely a measure of childhood poverty rates rather than academic achievement. The United States leads the OECD in child poverty: our rate of child poverty is approximately 26%, and higher by some measures. It is therefore not surprising that, as a nation, we do not perform as well as most other OECD countries on PISA. If you compare the tranche of American schools with poverty rates equal to those of other OECD countries, however, the United States does quite well.

In reading, the countries at the top of the PISA list – Slovenia (499), Denmark (489), and Finland (474) – all have childhood poverty rates below 10%. If one were to measure US schools with under 10% poverty rates, the average score would be 562, good enough for first globally. In mathematics, Germany, France, and the UK have child poverty rates between 15 and 18% and have scores of 475, 479, and 489, respectively. If you measured US schools with childhood poverty rates of 10-25%, we would score a 508, good enough for 4th in the OECD.

America’s problem on PISA is poverty and inequality, not curriculum and instruction. 

PISA is a scaled-score, norm-referenced, multiple-choice test. Two-thirds of all test takers globally score between 400 and 600 on a section (math, reading, and science). Only 2% score over 700. In general, these kinds of tests are set up so that results will go down over a longer time frame. According to Prof. Andy Hargreaves of Boston College, “once a metric is widely used and has a competitive ranking element, gaming the system leads to overall declines in performance after an early lift, and also has negative side-effects on well being.” Not surprisingly, during the last two decades student performance in mathematics, reading, and science has significantly declined in most OECD countries.

If one were looking for an actual reason for this decline besides test design and use, the proliferation of technology and handheld device usage by 15-year-olds may very well be the culprit. In response to a question on the 2022 PISA, 45% of students said they felt anxious if their phone was not near them, and 65% said they felt distracted by their phones during math lessons. Prof. Sam Abrams of Teachers College, Columbia University attributes Finland’s decline in PISA scores to the introduction of the iPhone and the proliferation of its use among Finnish teens.

As an aside, the United States did not do terribly relative to other OECD countries in terms of rankings. The US moved up in rankings for all three subjects (math, reading and science). And in aggregate scores the United States held its ground in reading and science pretty well from the previous administration while other countries’ scores went down.

The OECD is to be commended for attempting to analyze the extent to which creativity and innovation are promoted in national school systems. However, the Creativity and Innovation review did not include the United States. The PISA team found it hard to extract data about the state of creative thinking in US schools because education is delegated via the states to many, often small, school districts. They did cite the work of EL Education’s network of districts and public schools – a network committed to performance-based assessment – as an example of creative and innovative schooling designed to get students to think critically, communicate clearly, and create complex work. Overall, the report stated that within the limitations of a snapshot review “it has not been possible to do justice to the rich variety of experiences in schools in the USA.”

Successful attempts have been made to measure creativity, innovation and entrepreneurship in the global economy by nation.  Perhaps not surprisingly, the United States ranks near the top in those economic categories among G20 nations and has for decades.  There is a disconnect between our international testing rankings and our economic ranking based on human capital. This calls into question whether the United States should be worried about its PISA rankings at all.   Does PISA measure anything of importance?  Not really, but you wouldn’t know it from the weight policy makers and media ascribe to the results.

Tienken and Mullen (2014) found no statistically significant relationships between indicators associated with the innovation economy and PISA. Earlier studies of PISA results suggest no statistically significant relationships, or only weak relationships, between ranks on international tests and economic output indicators such as GDP, adjusted gross income, or purchasing power parity (e.g., Baker, 2007; Ramirez et al., 2006; Tienken, 2008). International tests do not provide meaningful information about the skills most important for full participation in society in terms of socio-civic and economic outcomes in the G20 countries (Sjoberg, 2007). The information tested on international tests is not the information children will need to compete in the innovation economy, and the results do not tell us important information about student proficiency with the vocational competencies necessary to create, innovate, and pursue entrepreneurial opportunities (Zhao, 2014). (See citation below.) Perhaps the correct answer is that we really shouldn’t be paying much attention to PISA results. They don’t give us particularly useful or telling information.

Finally, given the deep dislocation in schooling and the trauma caused by the COVID pandemic, drops in scores from the pre-COVID administration are not surprising. The PISA scores are some evidence of what we already knew: the pandemic was bad for kids everywhere and impacted learning.

Tienken, C.H. & Mullen, C.A. (2014). The curious case of international student assessment: Rankings and realities in the innovation economy. In S. Harris & J. Mixon (Eds.), Building cultural community through global educational leadership (pp.146-164). Ypsilanti, MI: NCPEA Press

FairTest on ABC News

Executive Director Harry Feder discusses the deep problems that reliance on standardized tests for admissions poses for both equity and talent development, the reasons for test-optional and test-free policies, and the examination of student work and performance as part of a holistic admissions process.

ABC News clip


Performance Assessment for Parents and Policy Makers

Performance-based assessments are a superior way of evaluating student knowledge and skills. When done well, they are teacher-created and externally validated, arise out of classroom practice, are aligned to learning standards, and are authentic to student learning. We have created this Fact Sheet for those interested in advocating for and supporting quality performance-based assessments instead of state and local evaluation systems based on standardized testing (as opposed to using tests as an occasional “education thermostat” in the manner of NAEP).

What are Performance Based Assessments?

Performance Based Assessments require student completion of a task by applying knowledge, skills and work habits. In undertaking the task students perform or produce something that demonstrates that they have mastered specific skills and competencies.

Assessment tasks may involve:

  • Producing a product

  • Performing an activity

  • Reporting on an investigation

Students demonstrate understanding and apply knowledge by:

  • Explaining processes and methodologies

  • Solving problems

  • Explaining phenomena

  • Forming hypotheses

  • Answering questions

  • Conducting inquiries

  • Creating original work

Some examples include

  • Solving a non-routine math problem and explaining and defending the methodology

  • Designing and conducting an experiment to address a real-world hypothesis

  • Investigating, researching and answering a debatable historical question

  • Creating a podcast about an issue affecting the school

  • Conducting a trial of a literary character

  • Talking about a novel with an adult

  • Engaging in a conversation in a foreign language

  • Producing an original work of art and having it critiqued

  • Interviewing members of a community about a public health issue

  • Designing and building a costume for a school musical

Done well, performance-based assessments

  • Grow out of the curriculum; they are not imposed on curriculum

  • Are given after students and teachers have thoroughly studied and debated material

  • Are part of a more meaningful and engaging learning process

  • Are open ended and relevant in the real world

  • Require application and transfer of learning

  • Are fair and culturally responsive

  • Outline clear criteria for success

  • Result in original products or performances

How do Performance Based Assessments work in a school?

Through professional learning, teachers learn to design, validate, and implement high-quality performance assessments and to reliably score the resulting student work. Rubrics and feedback modules are designed to give students information about the substance and quality of their work and about avenues for improvement.

As in the real world, students have the opportunity to practice and revise their work – subject to feedback and collaborative processes – before final performance or submission.

Multi-step assignments measure how well a student transfers and applies knowledge and complex skills. Students demonstrate proficiency in ways that will be expected of them later in college, career, and life.

Performance assessments in the classroom are integrated into students’ daily work, drive quality instruction, and assess student growth over time.

Why are Performance Based Assessments good for student learning?

Performance Based Assessments allow students to explain their thinking, evaluate ideas, investigate and research their own questions, and truly demonstrate learning. Performance Based Assessments allow all students to demonstrate what they know and can do through real-world application, preparing graduates for college, career, and beyond.

Why are Performance Based Assessments good public policy, better than traditional standardized tests?

Dependence on standardized testing in K-12 education to assess student knowledge and skills is outdated given the complex, higher-order tasks of the modern world. Standardized tests are not “authentic” measurements of progress; they are divorced from the daily inquiry-based processes and the student-teacher and peer interactions of the classroom. Because of their inherent limitations, biases, time constraints, and stakes, they are harmful to our most vulnerable students. Students’ futures should not rest on a one-to-three-day test; rather, assessment should take a more holistic approach that more accurately reflects students’ abilities.

Standardized tests negatively impact teaching and learning by limiting and channeling the learning process to a single commercially produced instrument. They narrow curriculum as “teaching to the test” becomes standard practice; they devalue teacher professionalism by calling into question the capacity of those who engage daily with students to evaluate student capacity and progress; and, in the aggregate, they warp public perception of school quality by reducing school evaluation to a few test-score data points. On the whole, standardized tests are a mediocre and misleading way to capture student performance or teacher effectiveness.

High-stakes Standardized Testing

X Narrows the curriculum

X Devalues teachers

X Misinforms the public about school quality

Performance Based Assessments

✔ Grow out of curriculum

✔ Professionalize teaching

✔ Accurately reflect school quality and culture

How do Performance Based Assessments provide transparency and accountability?

Teacher-generated, curriculum-embedded performance assessments create a richer means of assessing student achievement and serve as the primary student learning measure. When examples and demonstrations of student work are made public and presented to the community, they provide genuine accountability to major stakeholders. This accountability is more genuine than standardized testing because the work is an authentic reflection of the actual schooling and not a mere proxy for class and, all too often, race.

Are Performance Based Assessments valid and reliable evaluations of student learning?

Provided that exhibitions and demonstrations are available for community and public inspection, that assessments are judged according to clearly defined, task-appropriate standards and rubrics, and that tasks are aligned to learning goals and standards, Performance Based Assessments are valid and reliable evaluations of student learning. They assess the panoply of factors and skills that matter to student learning and development more validly and reliably than standardized tests do.

Digging into Testing Policies

Takeaways: Test Optional is Real

Getting admitted to college without a test score (and getting scholarships) is a realistic possibility. There are colleges at every selectivity level, in every state, that have enrolled a significant portion of their class without test scores having been submitted. This means that each family has a choice whether to participate in the testing and test-prep rat race . . . or not.

Whether you (or your child, student, or friend) should test is an individual decision based on several factors.

  • Institutional Policy
    You should consider the policies of the schools in which you are most interested (CalTech is test-free, so don’t test; Georgetown is test-required, so you have to test if you want to apply).
  • History
    You should also consider the percentage of students enrolling with and without scores in the past few years (we’ll be adding the percentages to our list in the next few weeks).
  • Access to and Energy for Prep
    Consider how much time and energy it might take you to get a competitive score. If you’re 300 SAT points or 12 ACT points from the 25th percentile of the college you’re interested in, that will likely require a lot of prep (maybe 60 total prep hours over 6 months) to get the score you’re looking for. What will it cost? Is it worth it?

Digging into the Data

A key question in examining the impact of test-optional policies is how many students are actually ending up in college having not submitted scores. We’ve endeavored to answer that question by looking at the data colleges have submitted to the federal government (for college data nerds: IPEDS).

There are 6,500+ institutions that the federal government considers colleges, but for our purposes we only downloaded information on U.S. institutions that have first-time full-time students, award at least a bachelor’s degree (so this excludes many great community colleges), have an enrollment of more than 100 students, accepted at least 1 student in the last reporting cycle, and are not entirely online. Because of these parameters we ended up with data for 2,156 colleges (FYI, this number varies a little year to year, but this is typically the pool of colleges that FairTest uses as the universe of “colleges”).
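As a rough illustration of the screening just described, here is a minimal pandas sketch. It assumes a generic IPEDS-style extract with placeholder column names (country, bachelors_offered, ftft_enrollment, total_enrollment, admitted_count, exclusively_online); these are not actual IPEDS variable names, and the thresholds simply mirror the criteria listed above.

```python
# Minimal sketch of the college-screening criteria described above.
# Column names are placeholders, not real IPEDS variable names.
import pandas as pd

def screen_colleges(df: pd.DataFrame) -> pd.DataFrame:
    """Keep U.S. bachelor's-granting institutions with first-time full-time
    students, more than 100 students, at least one admit, and some
    on-campus instruction."""
    mask = (
        (df["country"] == "US")
        & (df["bachelors_offered"])       # awards at least a bachelor's degree
        & (df["ftft_enrollment"] > 0)     # enrolls first-time full-time students
        & (df["total_enrollment"] > 100)  # more than 100 students
        & (df["admitted_count"] >= 1)     # accepted at least 1 student last cycle
        & (~df["exclusively_online"])     # not entirely online
    )
    return df.loc[mask]

# Usage: colleges = screen_colleges(pd.read_csv("ipeds_extract.csv"))
# A screen like this yields roughly 2,156 institutions for 2022.
```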

Here are some important points about the data: 

  • Until the 2022 reporting period, federal categories for reporting test policies created some confusion. Colleges had to select their test policies from “required,” “neither required nor recommended,” “considered if submitted,” and “recommended.” Some colleges interpreted “neither required nor recommended” as test-free (does not consider scores at all) while some didn’t. So the 2021 analysis might actually undercount the number of test-free and test-optional colleges. In 2022, those categories were changed to “required to be considered for admission,” “not required for admission, but considered if submitted” (aka test optional), and “not considered for admission, even if submitted” (test blind). This simplified and clarified the categories (and aligned them with how FairTest reports policies in our database).

  • Prior to 2021, schools indicating that scores were neither required nor recommended were not allowed to report the percent of enrolled students submitting scores. Of the 2,274 colleges we evaluated, only 1,017 reported the percent of enrolled first-year students submitting test scores. This is almost half the total, thus constituting a significant sample.

  • Not all colleges report testing policies (open-admission colleges do not). In 2021, of the 2,274 colleges we evaluated, 1,683 reported test-score policies to the federal government. In 2022, of the 2,156 colleges we evaluated, 1,628 reported their testing policies.

The data is for the enrolling class of 2022, which was the first year “post-pandemic.” We’ve compared this data to past years so we can see changes and trends.

Number of Institutions by Percent of Scores Submitted

Since 2018 the percentage of enrolled students submitting scores has been dropping everywhere. In 2018, all enrolled students submitted the ACT, the SAT, or both at the vast majority of colleges. In 2022, all enrolled students submitted scores at fewer than 10% of colleges, and at about half of colleges fewer than half of enrolled students submitted any test score at all. (A brief sketch of the binning behind the table follows it below.)


% of SAT + ACT scores submitted     Number of institutions
by enrolled students                2022    2021    2020    2019    2018
100% and above                        66      69     627     872     868
75 - 99%                             170     211     357     291     305
50 - 74%                             294     282      93      49      53
25 - 49%                             319     305      25      18      15
0 - 24%                              737     144       9       5       7
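A minimal sketch of the binning behind this table, assuming the screened data carries a hypothetical column pct_scores_submitted holding each college’s combined SAT + ACT submission percentage for enrolled students:

```python
# Sketch of the submission-rate binning used in the table above.
# `pct_scores_submitted` is a hypothetical column name for the combined
# SAT + ACT percentage of enrolled students at each college.
import pandas as pd

def submission_bands(colleges: pd.DataFrame) -> pd.Series:
    """Count institutions in each submission-rate band."""
    bins = [0, 25, 50, 75, 100, float("inf")]
    labels = ["0 - 24%", "25 - 49%", "50 - 74%", "75 - 99%", "100% and above"]
    bands = pd.cut(
        colleges["pct_scores_submitted"],
        bins=bins,
        labels=labels,
        right=False,  # e.g. exactly 25.0 falls in the 25 - 49% band
    )
    return bands.value_counts().reindex(labels)

# Usage: submission_bands(colleges_2022) reproduces one column of the table.
```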

Percent of Scores Submitted by Selectivity

On average, more selective schools have a higher rate of score submission. While we cannot be certain of causation, a likely hypothesis is that more selective schools attract a socio-economically advantaged pool of applicants who are more likely to be able to afford test prep and to be counseled to submit scores. This statistic does not mean that the most selective schools prefer scores (although vested interests like the testing agencies and tutoring companies push that narrative). It also doesn’t mean scores make it easier to get in or that scores must be submitted. For example, only 51% of enrolled students submitted scores to Barnard and only 48% to Babson.

Another caveat is that, depending on their policies, not all schools are required to report scores. Some colleges entered 0% test scores submitted when they weren’t required to submit data, while others left the field blank; we did not try to sort out the meanings behind these entries and simply calculated the average across all colleges.

Here are our findings by selectivity (a brief sketch of how these tiers might be computed follows the table):


                                                    Average % of SAT + ACT submitted
                                                    2022        2021
All colleges                                         34%         56%
Colleges admitting less than 50% of applicants       44%         64%
200 colleges with the lowest percent admitted        45%         65%
100 colleges with the lowest percent admitted        52%         65%
50 colleges with the lowest percent admitted         57%         70%
25 colleges with the lowest percent admitted         64%         75%
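The selectivity tiers above could be computed along the following lines; admit_rate and pct_scores_submitted are hypothetical column names used only for this sketch, not fields from our actual dataset.

```python
# Sketch of the selectivity breakdown above, using hypothetical columns
# `admit_rate` (fraction of applicants admitted) and `pct_scores_submitted`
# (the per-college sum of SAT% and ACT% among enrolled students).
import pandas as pd

def selectivity_averages(colleges: pd.DataFrame) -> dict:
    """Average combined submission rate overall and for the most selective tiers."""
    ranked = colleges.sort_values("admit_rate")  # lowest admission rates first
    averages = {
        "All colleges": colleges["pct_scores_submitted"].mean(),
        "Admit rate below 50%": colleges.loc[
            colleges["admit_rate"] < 0.50, "pct_scores_submitted"
        ].mean(),
    }
    for n in (200, 100, 50, 25):
        averages[f"{n} most selective"] = ranked.head(n)["pct_scores_submitted"].mean()
    return averages

# Usage: selectivity_averages(colleges_2022)
```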

Selected Schools

A few interesting things we noticed:

  • In 2018, if you added the submitted SAT scores and submitted ACT scores for each college and then averaged that sum across all colleges, the average was 104%, meaning that on average colleges received scores for 104% of enrolled students in 2018. In 2021, that average was 56%.
  • At the University of Florida (in a state that requires test scores), scores are reported for 125% of enrolling students. At the military academies scores are also reported for more than 100% of students (meaning that some students are submitting both).
  • Required doesn’t mean required (or there are data errors). At Georgia Tech in 2021, where scores were/are required, only 89% of enrolling students submitted scores. This might be a result of the change in policy happening in the middle of the admission cycle.
  • Admission rate only loosely correlates with submission rates. At NYU only 37% submitted scores, but at Haverford (which is less selective), 62% submitted. There is a lot of variance among colleges in each of the “selectivity” categories.
Policy categories (IPEDS wording): Required = “Required to be considered for admission”; Test Optional = “Not required for admission, but considered if submitted”; Test Blind = “Not considered for admission, even if submitted.”

School | 2022 admit rate | Sum of SAT/ACT submitted | 2022 policy | % SAT submitted (2022 enrolled class) | % ACT submitted (2022 enrolled class)
Dartmouth College | 6% | 66% | Test Optional | 42% | 24%
Princeton University | 6% | 84% | Test Optional | 59% | 25%
Northeastern University | 7% | 44% | Test Optional | 33% | 11%
Swarthmore College | 7% | 61% | Test Optional | 42% | 19%
Northwestern University | 7% | 78% | Test Optional | 47% | 31%
Williams College | 8% | 62% | Test Optional | 41% | 21%
Barnard College | 9% | 50% | Test Optional | 30% | 20%
Emory University | 11% | 65% | Test Optional | 41% | 24%
United States Naval Academy | 11% | 120% | Required | 74% | 46%
New York University | 12% | 37% | Test Optional | 26% | 11%
University of Southern California | 12% | 50% | Test Optional | 34% | 16%
Georgetown University | 12% | 108% | Required | 72% | 36%
Boston University | 14% | 35% | Test Optional | 23% | 12%
Haverford College | 14% | 62% | Test Optional | 43% | 19%
Washington and Lee University | 17% | 54% | Test Optional | 28% | 26%
Georgia Institute of Technology-Main Campus | 17% | 112% | Required | 74% | 38%
Vassar College | 19% | 51% | Test Optional | 33% | 18%
Franklin W Olin College of Engineering | 19% | 72% | Test Optional | 48% | 24%
University of California-Irvine | 21% | 0% | Test Blind | 0% | 0%
Babson College | 22% | 40% | Test Optional | 32% | 8%
University of Florida | 23% | 122% | Required | 81% | 41%
United States Coast Guard Academy | 24% | 106% | Test Optional | 68% | 38%
The University of Texas at Austin | 31% | 85% | Test Optional | 63% | 22%
Trinity College | 36% | 19% | Test Optional | 14% | 5%
University of Rochester | 39% | 39% | Test Optional | 28% | 11%
Binghamton University | 42% | 66% | Test Optional | 54% | 12%
University of Illinois Urbana-Champaign | 45% | 64% | Test Optional | 44% | 20%
Sarah Lawrence College | 50% | 21% | Test Optional | 15% | 6%
Howard University | 53% | 44% | Test Optional | 31% | 13%
Ohio State University-Main Campus | 53% | 55% | Test Optional | 15% | 40%
Rhodes College | 54% | 42% | Test Optional | 11% | 31%
Texas Christian University | 56% | 38% | Test Optional | 18% | 20%
Tennessee Wesleyan University | 61% | 86% | Test Optional | 10% | 76%
Marist College | 63% | 22% | Test Optional | 18% | 4%
Norfolk State University | 89% | 21% | Test Optional | 16% | 5%

What about merit aid?

This past spring FairTest issued a report on how frequently the SAT and ACT play a role in merit scholarship programs. The findings were pretty surprising. Tests don’t play as big a role in scholarships as most people believe.

You can read the full report, “Merit” Awards: Myths, Realities, & Barriers to Access, yourself for the details, but until you do, here are some highlights.

State-funded Merit Scholarships

First, we looked at state-funded scholarship programs.

Most state-funded programs were “specific purpose,” meaning they were only open to specific groups like children of veterans, children in foster care, or students pursuing particular majors. These didn’t require test scores and often didn’t have GPA requirements.

Of the scholarships that required scores or a GPA, the vast majority had a GPA requirement. The key finding was that only 16% of the 353 state-funded scholarships we looked at required a test score.

Merit Scholarships at Flagship Universities. 

We also evaluated merit aid at 51 “flagship” colleges. Again, our findings were pretty surprising to us. Tests didn’t play a big role in flagship merit scholarships.

As you can see from Table 2 in the “Merit” Awards: Myths, Realities, & Barriers to Access report, less than 1 in 3  scholarships required a test score in order to apply for that award. 

FairTest October 2 Gala

We honored Ann Cook, longtime educator and champion of performance-based assessment and founder of the New York Performance Standards Consortium, and Cong. Jamaal Bowman, former principal and advocate for ESSA testing reform in the House as 2023 Deborah Meier Heroes of Education. It was a great night! Thanks to all who came and contributed.

In her remarks, Ann looked back on the accomplishments of the New York Performance Standards Consortium which over the course of 25 years has established the premier system for graduating students using performance based assessments instead of standardized tests. She warned of “test creep” in the form of “interim” and “through-year” assessments and how maintaining a test-based culture is antithetical to an assessment system that is authentic to deeper student learning and lets kids demonstrate what they can really do. The full text of her remarks can be read here.

Why Standardized Tests Are a Bad Way to Do College Admissions

Executive Director Harry Feder appears on Spectrum1 News in Southern California, explaining the origins and staying power of test-optional policies and why the SAT/ACT is an invalid instrument for determining college admissions.

Watch beginning at 35:05

https://spectrumnews1.com/ca/la-west/news/2023/09/20/csu-tuition-hike?cid=id-app15_m-share_s-web_cmp-app_launch_august2020_c-producer_posts_po-organic#